
Sr GCP Engineer

Negotiable Salary

Tek Spikes

Springfield, MO, USA


Description

Infrastructure Automation & Management:
- Design, implement, and maintain scalable, reliable, and secure cloud infrastructure using GCP services.
- Automate cloud infrastructure provisioning, scaling, and monitoring using Infrastructure as Code (IaC) tools such as Terraform or Google Cloud Deployment Manager.
- Manage and optimize GCP resources such as Compute Engine, Kubernetes Engine, Cloud Functions, and BigQuery to support development teams.

CI/CD Pipeline Management:
- Build, maintain, and enhance continuous integration and continuous deployment (CI/CD) pipelines to ensure seamless, automated code deployment to GCP environments.
- Integrate CI/CD pipelines with GCP services like Cloud Build and Cloud Source Repositories, or third-party tools like Jenkins.
- Ensure pipelines are optimized for faster build, test, and deployment cycles.

Monitoring & Incident Management:
- Implement and manage cloud monitoring and logging solutions using Dynatrace and GCP-native tools like Stackdriver (Monitoring, Logging, and Trace).
- Monitor cloud infrastructure health and resolve performance issues, ensuring minimal downtime and maximum uptime.
- Set up incident management workflows, implement alerting mechanisms, and create runbooks for rapid issue resolution.

Security & Compliance:
- Implement security best practices for cloud infrastructure, including identity and access management (IAM), encryption, and network security.
- Ensure GCP environments comply with organizational security policies and industry standards such as GDPR, CCPA, or PCI-DSS.
- Conduct vulnerability assessments and perform regular patching and system updates to mitigate security risks.

Collaboration & Support:
- Collaborate with development teams to design cloud-native applications optimized for performance, security, and scalability on GCP.
- Work closely with cloud architects to provide input on cloud design and best practices for continuous integration, testing, and deployment.
- Provide day-to-day support for development, QA, and production environments, ensuring availability and stability.

Cost Optimization:
- Monitor and optimize cloud costs by analyzing resource utilization and recommending cost-saving measures such as right-sizing instances, using preemptible VMs, or implementing auto-scaling.

Tooling & Scripting:
- Develop and maintain scripts (in languages such as Python, Bash, or PowerShell) to automate routine tasks and system operations.
- Use configuration management tools like Ansible, Chef, or Puppet to manage cloud resources and maintain system configurations.

Required Qualifications & Skills:

Experience:
- 3+ years of experience as a DevOps Engineer or Cloud Engineer, with hands-on experience managing cloud infrastructure.
- Proven experience working with Google Cloud Platform (GCP) services such as Compute Engine, Cloud Storage, Kubernetes Engine, Pub/Sub, Cloud SQL, and others.
- Experience automating cloud infrastructure with Infrastructure as Code (IaC) tools like Terraform, Cloud Deployment Manager, or Ansible.

Technical Skills:
- Strong knowledge of CI/CD tools and processes (e.g., Jenkins, GitLab CI, CircleCI, or GCP Cloud Build).
- Proficiency in scripting and automation using Python, Bash, or similar languages.
- Strong understanding of containerization technologies (Docker) and container orchestration tools like Kubernetes.
- Familiarity with GCP networking, security (IAM, VPC, firewall rules), and monitoring tools (Stackdriver).

Cloud & DevOps Tools:
- Experience with Git for version control and collaboration.
- Familiarity with GCP-native DevOps tools like Cloud Build, Cloud Source Repositories, Artifact Registry, and Binary Authorization.
- Understanding of DevOps practices and principles, including continuous integration, continuous delivery, Infrastructure as Code, and monitoring/alerting.

Security & Compliance:
- Knowledge of security best practices for cloud environments, including IAM, network security, and data encryption.
- Understanding of compliance and regulatory requirements related to cloud computing (e.g., GDPR, CCPA, HIPAA, or PCI-DSS).

Soft Skills:
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication skills, with the ability to explain technical concepts to both technical and non-technical stakeholders.
- Team-oriented mindset and the ability to work collaboratively with cross-functional teams.

Certifications (Preferred):
- Google Professional Cloud DevOps Engineer certification.
- Other GCP certifications such as Google Professional Cloud Architect or Associate Cloud Engineer are a plus.
- DevOps certifications such as Certified Kubernetes Administrator (CKA) or AWS/GCP DevOps certifications are advantageous.
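The cost-optimization duty described above (analyzing resource utilization and recommending right-sizing) can be sketched in Python. This is a hypothetical illustration, not part of the posting: the instance names, the CPU thresholds, and the `recommend_rightsizing` helper are all assumptions; in practice the utilization samples would come from a monitoring API rather than hard-coded numbers.

```python
# Hypothetical sketch of a right-sizing recommendation pass.
# Utilization samples are supplied as plain numbers here; a real
# script would pull them from a monitoring backend.

def recommend_rightsizing(instances, low_cpu=0.20, high_cpu=0.80):
    """Return a list of (name, action) suggestions based on average CPU."""
    suggestions = []
    for name, cpu_samples in instances.items():
        avg = sum(cpu_samples) / len(cpu_samples)
        if avg < low_cpu:
            suggestions.append((name, "downsize"))   # mostly idle
        elif avg > high_cpu:
            suggestions.append((name, "upsize"))     # near saturation
        else:
            suggestions.append((name, "keep"))       # within target band
    return suggestions

if __name__ == "__main__":
    fleet = {
        "web-1": [0.05, 0.10, 0.08],   # underutilized
        "db-1":  [0.90, 0.85, 0.95],   # overloaded
        "app-1": [0.40, 0.55, 0.50],   # fine
    }
    for name, action in recommend_rightsizing(fleet):
        print(f"{name}: {action}")
```

The threshold band is the design choice here: anything persistently below 20% average CPU is a downsizing candidate, anything above 80% a scaling candidate.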

Source: Workable

Location
Springfield, MO, USA


You may also like

Craigslist
Software Development Career (Remote) 🧑‍💻
We are seeking career-focused individuals ready to enter the field of technology. This program is ideal if you want to gain real-world coding skills, finish professional projects, and prepare for software and web developer roles. The program is remote and flexible—choose full-time or part-time—offering nearly 900 hours of guided training and project-based work. You’ll practice industry-standard coding languages, developer tools, and workflows while building a strong portfolio and resume to help you get hired.

🖥️ Technology & Programming Fundamentals
• Understand how computers, networks, browsers, and the internet work
• Learn algorithms, data structures, number systems, and security topics
• Hands-on coding with Python, command line utilities, and logic building

💻 Web & Front-End Development
• Design websites with HTML5, CSS3, and Bootstrap
• Build interactive user experiences with JavaScript, jQuery, and React.js
• Apply responsive design and usability principles

🗄️ Back-End & Database Development
• Design and manage SQL and SQL Server databases
• Execute CRUD operations with relational systems
• Create back-end applications with Python (Django) and C# (.NET Framework/Core)

🧑‍💻 Programming Languages & Tools
• Work with C#, Python, JavaScript, HTML, CSS, SQL, and more
• Use Git, GitHub, Visual Studio, and Team Foundation Server
• Learn version control and collaborative programming

🧪 Capstone Projects
• Deliver two real-world projects (Python + C#)
• Apply Agile, Scrum, and DevOps methods
• Gain debugging, teamwork, and problem-solving skills

🧰 Career Preparation
• Resume writing, cover letter strategies, and job search tips
• Whiteboarding and technical interview practice
• Prepare for entry-level developer opportunities

🚀 No tech background needed. Open to remote learners.
👉 Apply now: https://softwaredevpros.online/
2817 Chickasaw St, New Orleans, LA 70126, USA
$30/hour
Workable
Senior Data Engineer - Active Secret Clearance
Location: Washington, DC (Hybrid)
Clearance Required: Active Secret
Position Type: Full-Time

We are seeking a highly skilled Senior Data Engineer to support the U.S. Coast Guard Office of Data & Analytics (ODA). The ideal candidate will bring expertise in Databricks, SQL, ETL pipelines, and cloud-based data frameworks to enable enterprise-scale analytics and decision-making. This role involves designing, building, and orchestrating data pipelines, collaborating with analysts and developers, and implementing CI/CD practices to ensure scalable, reliable, and secure data solutions.

Primary Responsibilities:
- Build, test, and orchestrate data pipelines in Databricks.
- Design and optimize data structures, schemas, and ETL processes.
- Translate business use cases into SQL queries, reports, and dashboards.
- Manage database objects in development environments without impacting production.
- Automate workflows using Databricks Workflows or the Jobs API.
- Integrate code into GitHub/GitLab and support CI/CD practices.
- Diagnose data gaps and quality issues, and design test cases for validation.
- Collaborate with developers and analysts to synchronize code and ensure reliability.

Requirements

Minimum Qualifications:
- Active Secret clearance (required at time of application).
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 7+ years of professional experience in data engineering.
- Expert-level proficiency in Databricks, Apache Spark, and SQL.
- Experience with ETL tools such as Kafka, Airflow, or AWS Glue.
- Proficiency with CI/CD pipelines and GitHub/GitLab version control.
- Knowledge of cloud platforms (AWS, Azure, GCP).
- Excellent problem-solving, debugging, and collaboration skills.

Preferred Qualifications:
- Experience supporting federal or mission-critical analytics programs.
- Knowledge of data governance and metadata management practices.
- Familiarity with containerization and orchestration tools (Docker, Kubernetes).
Eligibility: Must be legally authorized to work in the United States without employer sponsorship, now or in the future. Active Secret clearance is required for this role.

Benefits
Salary: Competitive, commensurate with experience.
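The data-quality responsibilities listed above (diagnosing gaps and designing validation test cases) reduce, in their simplest form, to mechanical checks over a batch of records. A minimal Python sketch under stated assumptions: the `check_quality` helper and the column names are hypothetical, and in a Databricks pipeline the input would be a Spark DataFrame rather than a list of dicts.

```python
# Hypothetical data-quality validation pass over a batch of records.
# A real pipeline would run equivalent checks against a Spark DataFrame.

def check_quality(rows, required=("id", "timestamp")):
    """Return a dict of issue counts: rows with missing fields, duplicate ids."""
    issues = {"missing": 0, "duplicates": 0}
    seen_ids = set()
    for row in rows:
        # A gap: any required field absent or null.
        if any(row.get(col) is None for col in required):
            issues["missing"] += 1
        # A duplicate: the same id appearing more than once.
        rid = row.get("id")
        if rid is not None:
            if rid in seen_ids:
                issues["duplicates"] += 1
            seen_ids.add(rid)
    return issues

if __name__ == "__main__":
    batch = [
        {"id": 1, "timestamp": "2024-01-01"},
        {"id": 1, "timestamp": "2024-01-02"},  # duplicate id
        {"id": 2, "timestamp": None},          # missing timestamp
    ]
    print(check_quality(batch))  # → {'missing': 1, 'duplicates': 1}
```

Checks of this shape are what "design test cases for validation" typically means in practice: each rule becomes an assertion the pipeline can fail loudly on.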
Washington, DC, USA
Negotiable Salary
Workable
Senior Data Scientist - Active Secret Clearance
Location: Washington, DC (Hybrid)
Clearance Required: Active Secret
Position Type: Full-Time

We are seeking a highly skilled Senior Data Scientist to support enterprise data and analytics initiatives for our federal client. The ideal candidate will bring expertise in AI/ML development, advanced analytics, and automation to deliver mission-focused solutions. This role involves providing technical leadership, developing predictive models, and collaborating with cross-functional teams to ensure analytics capabilities are scalable, secure, and aligned with enterprise objectives. Strong proficiency with Python, AI/ML frameworks, and RPA tools is required.

Primary Responsibilities:
- Provide technical leadership on enterprise analytics and AI/ML solutions.
- Design, develop, and deploy machine learning models and predictive analytics.
- Utilize Python, Anaconda, and Jupyter notebooks for modeling and prototyping.
- Collaborate with Data Engineers to integrate models into scalable data pipelines.
- Support data governance activities including tagging, metadata management, and secure sharing.
- Develop RPA solutions to automate business processes.
- Evaluate and recommend analytics products to ensure mission alignment.
- Communicate findings through dashboards, data visualizations, and technical reports.

Requirements

Minimum Qualifications:
- Active Secret clearance (required at time of application).
- Bachelor’s or Master’s degree in Data Science, Computer Science, Mathematics, or a related field.
- 7+ years of professional experience in data science, AI/ML, and advanced analytics.
- Strong proficiency with Python, Anaconda, and Jupyter notebooks.
- Expertise in AI/ML frameworks such as TensorFlow, PyTorch, and Scikit-learn.
- Experience with RPA tools (UiPath, Automation Anywhere, or Python-based automation).
- Familiarity with Databricks, Apache Spark, and SQL.
- Excellent communication, documentation, and stakeholder engagement skills.
Preferred Qualifications:
- Experience supporting federal or regulated environments.
- Knowledge of MLOps practices for deploying and managing ML models at scale.
- Background in data governance and enterprise metadata management.

Eligibility: Must be legally authorized to work in the United States without employer sponsorship, now or in the future. Active Secret clearance is required for this role.

Benefits
Salary: Competitive, commensurate with experience.
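The modeling and prototyping workflow described above comes down to fitting parameters to data and predicting from them. A toy sketch using only the standard library; the `fit_line` helper is hypothetical, and real work would use the frameworks the posting names (Scikit-learn, TensorFlow, PyTorch) rather than hand-rolled least squares.

```python
# Toy one-variable ordinary-least-squares fit, illustrating the shape
# of a predictive-modeling workflow without any ML framework.

def fit_line(xs, ys):
    """Fit y = a*x + b by ordinary least squares; return (a, b)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

if __name__ == "__main__":
    xs = [1.0, 2.0, 3.0, 4.0]
    ys = [2.1, 3.9, 6.2, 7.8]    # roughly y = 2x
    a, b = fit_line(xs, ys)
    prediction = a * 5.0 + b     # predict at x = 5
    print(f"a={a:.2f} b={b:.2f} y(5)={prediction:.2f}")
```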
Washington, DC, USA
Negotiable Salary
© 2025 Servanan International Pte. Ltd.