
GCP Data Engineer

Job in Nashville, Davidson County, Tennessee, 37247, USA
Listing for: Programmers.io
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 USD per year
Job Description & How to Apply Below

One of our leading clients is looking for a GCP Data Engineer in Nashville, TN.

Key Responsibilities:
  • Architect Scalable Data Solutions:
    Design and implement data warehouses, marts, lakes, and batch and/or real-time streaming pipelines using GCP-native tools.
  • Data Modeling & Integration:
    Design and develop conformed data models (star/snowflake schemas) and ETL/ELT processes for analytics and BI tools (MicroStrategy, Looker, Power BI).
  • Pipeline Development:
    Build scalable pipelines and automate data ingestion and transformation workflows using BigQuery, Dataflow, Dataproc/PySpark, Cloud Functions, Pub/Sub, Kafka, and Cloud Composer for orchestration (a minimal pipeline sketch follows this list).
  • Security & Compliance:
    Implement IAM, encryption, and compliance standards (GDPR, HIPAA) with GCP security tools.
  • Performance Optimization:
    Apply best practices for partitioning, clustering, and BI Engine usage to ensure high performance and cost efficiency.
  • DevOps & Automation:
    Integrate CI/CD pipelines, IaC (Terraform), and containerization (Docker, Kubernetes) for deployment and scalability.
  • Collaboration & Leadership:
    Engage with stakeholders including leadership, project managers, BAs, engineers, QA, and platform teams; mentor team members and provide technical guidance on best practices.
  • Troubleshooting:
    Resolve complex technical issues and support incident response.
  • Healthcare Domain Expertise:
    Ensure compliance with healthcare regulations and stay updated on industry trends.
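
For illustration only, a minimal sketch of the kind of streaming ingestion pipeline named above, using Apache Beam in Python to read from Pub/Sub and append into BigQuery. The project, topic, table, and field names are hypothetical placeholders, not details from this posting.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Hypothetical resource names -- replace with real project/topic/table IDs.
    TOPIC = "projects/example-project/topics/events"
    TABLE = "example-project:analytics.events"

    def parse_event(message: bytes) -> dict:
        # Decode one JSON-encoded Pub/Sub message into a BigQuery row dict.
        return json.loads(message.decode("utf-8"))

    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=TOPIC)
            | "ParseJSON" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )

The same pipeline runs on Dataflow by passing --runner=DataflowRunner with a project and region; Cloud Composer would typically handle scheduling for the batch counterparts.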
Required Skills & Working Experience:
  • GCP Expertise:
    BigQuery, Cloud Storage, Dataflow (Apache Beam with Python), Dataproc/PySpark, Cloud Functions, Pub/Sub, Kafka, Cloud Composer.
  • Programming:
    Advanced SQL and Python for analytics and pipeline development.
  • Performance Optimization:
    Experience optimizing query performance, partitioning, clustering, and BI Engine in BigQuery (see the illustrative table definition after this list).
  • Automation:
    Experience with CI/CD for data pipelines, IaC for data services, automation of ETL/ELT processes.
  • Security:
    Strong knowledge of IAM, encryption, and compliance frameworks.
  • Architecture Design:
    Ability to create fault‑tolerant, highly available, and cost‑optimized solutions.
  • Communication:
    Excellent ability to convey technical concepts to both technical and non‑technical stakeholders.
  • Domain Knowledge:
    Familiarity with healthcare data management and regulatory compliance.
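
As an illustration of the partitioning and clustering practices called for above, a sketch using the google-cloud-bigquery Python client; the project, dataset, table, and field names are hypothetical placeholders.

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical healthcare-style table -- all names are placeholders.
    table = bigquery.Table(
        "example-project.analytics.claims",
        schema=[
            bigquery.SchemaField("patient_id", "STRING"),
            bigquery.SchemaField("claim_amount", "NUMERIC"),
            bigquery.SchemaField("event_date", "DATE"),
        ],
    )
    # Partition by date and cluster by patient_id so queries filtering on
    # event_date scan only matching partitions, cutting cost and latency.
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY,
        field="event_date",
    )
    table.clustering_fields = ["patient_id"]

    client.create_table(table)

BI Engine acceleration is configured separately, via a capacity reservation in the same region, and requires no schema changes to tables defined this way.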