Databricks / Google Cloud Platform (GCP) Data Engineer
Job in Nashville, Davidson County, Tennessee, 37201, USA
Listed on 2026-03-04
Listing for: Capgemini
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Big Data, Data Science Manager
Job Description
Job Summary:
Capgemini is a global business and technology transformation partner seeking a skilled Data Engineer with hands-on experience in Databricks and Google Cloud Platform (GCP) to design, build, and optimize data pipelines and analytics solutions. The role involves collaborating with data analysts, data scientists, and business stakeholders to deliver scalable, reliable, high-quality data products.
Responsibilities:
• Design, build, and maintain ETL/ELT pipelines using Databricks (PySpark, Delta Lake).
• Optimize pipelines for performance, cost efficiency, and scalability within GCP.
• Develop batch and streaming data processes using Spark Streaming and related technologies.
• Implement data solutions leveraging GCP services such as BigQuery, Cloud Storage, Dataflow, Cloud Composer, and Vertex AI integrations.
• Apply best practices for cloud security, IAM configuration, monitoring, and cost management.
• Build and maintain data models, including dimensional modeling and data vault structures.
• Implement data quality frameworks, validation rules, and automated testing.
• Manage data versioning, governance, and lineage using tools such as Unity Catalog or GCP Data Catalog.
• Partner with cross‑functional teams to gather requirements and translate them into technical designs.
• Provide technical guidance and influence engineering best practices across the team.
• Contribute to documentation, architectural diagrams, and knowledge sharing.
Qualifications:
Required:
• Proven experience as a Data Engineer or in a similar role.
• Strong hands-on experience with Databricks, including PySpark/Spark, Delta Lake, and Databricks workflows/jobs.
• Proficiency with GCP services such as BigQuery, Cloud Storage, and Dataflow or Dataproc.
• Strong coding skills in Python and SQL.
• Solid understanding of distributed systems, data warehousing, and data architecture principles.
• Experience with CI/CD tools (GitHub, GitLab, Azure DevOps, or similar).
Preferred:
• Databricks or GCP certifications (e.g., Data Engineer, Architect).
• Experience with Terraform or other Infrastructure-as-Code tools.
• Knowledge of ML workflows or MLOps frameworks.
• Familiarity with data governance tools (Unity Catalog, Great Expectations, dbt, etc.).
Company:
Capgemini is a company that provides consulting, technology, and digital transformation services. Founded in 1967, it is headquartered in Paris, France, and has a team of more than 10,000 employees.