
GCP DevOps Engineer

Job in Nashville, Davidson County, Tennessee, 37247, USA
Listing for: KANINI
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range: USD 80,000 – 100,000 per year
Job Description & How to Apply Below

We are seeking an experienced GCP Data Engineer to join our data engineering team onsite in Denver, CO. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and analytics solutions on Google Cloud Platform (GCP). This role requires strong hands-on experience with GCP-native services, data modeling, and large-scale data processing.

Key Responsibilities
  • Design, develop, and maintain ETL/ELT data pipelines on GCP
  • Build and optimize data solutions using BigQuery, Cloud Dataflow, Dataproc, and Cloud Composer
  • Develop batch and streaming data pipelines using Apache Beam, Spark, and Pub/Sub
  • Implement data ingestion from multiple sources (APIs, databases, flat files, streaming systems)
  • Ensure data quality, reliability, performance, and security best practices
  • Collaborate with data scientists, analysts, and application teams to support analytics and reporting needs
  • Optimize query performance and cost management in BigQuery
  • Implement CI/CD pipelines and infrastructure automation using Terraform or Deployment Manager
  • Monitor and troubleshoot data pipelines in production environments
  • Follow data governance, compliance, and security standards
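For candidates unfamiliar with the pattern, the ETL/ELT work described above can be sketched as a minimal extract–transform–load flow. This is an illustrative, standard-library-only Python sketch; the record fields and function names are hypothetical, and a production pipeline at this scale would use Apache Beam on Dataflow or BigQuery load jobs instead of in-memory lists.

```python
from typing import Iterable, Iterator

# Hypothetical raw records, standing in for rows ingested from an API or flat file.
RAW_EVENTS = [
    {"user_id": "u1", "amount": "19.99", "ts": "2026-02-16"},
    {"user_id": "u2", "amount": "5.00", "ts": "2026-02-16"},
    {"user_id": "u1", "amount": "bad", "ts": "2026-02-17"},  # malformed row
]

def extract() -> Iterator[dict]:
    """Extract: yield raw records from the (mock) source."""
    yield from RAW_EVENTS

def transform(rows: Iterable[dict]) -> Iterator[dict]:
    """Transform: parse amounts and drop malformed rows (a basic data-quality gate)."""
    for row in rows:
        try:
            yield {**row, "amount": float(row["amount"])}
        except ValueError:
            continue  # in production, route bad rows to a dead-letter table instead

def load(rows: Iterable[dict]) -> list:
    """Load: collect into an in-memory 'sink' standing in for a warehouse table."""
    return list(rows)

warehouse = load(transform(extract()))
print(len(warehouse))  # 2 valid rows survive the quality gate
```

In a streaming variant of the same shape, the extract stage would read from Pub/Sub and the quality gate would run inside a Beam `ParDo` rather than a plain generator.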
Required Skills & Qualifications
  • 6+ years of experience in Data Engineering
  • Strong hands-on experience with Google Cloud Platform (GCP)
  • Expertise in BigQuery, Cloud Storage, Pub/Sub, Dataflow, Dataproc
  • Strong programming skills in Python and/or Java
  • Experience with Apache Spark, Apache Beam
  • Solid understanding of data warehousing, data modeling, and SQL
  • Experience with Airflow / Cloud Composer
  • Familiarity with CI/CD pipelines and Infrastructure as Code (Terraform)
  • Experience working in Agile/Scrum environments
Nice to Have
  • GCP Professional Data Engineer Certification
  • Experience with real-time/streaming data pipelines
  • Knowledge of machine learning data pipelines on GCP
  • Experience with Looker or other BI tools