GCP Data Engineer
Listed on 2026-02-19
IT/Tech
Data Engineer, Cloud Computing
Job Description
A Fortune 250 enterprise in Memphis, TN is looking for a GCP Data Engineer to join its Customer Satisfaction team. The organization focuses on leveraging data and cloud technologies to drive business insights, digital transformation, and advanced analytics. The team is investing heavily in modern, cloud‑native data platforms and is expanding its Data Engineering organization to support growing analytics and machine learning initiatives.
The GCP Data Engineer will design, build, and optimize scalable data pipelines and cloud‑native data platforms on Google Cloud Platform (GCP). This role partners closely with Data Architects, Analysts, Data Scientists, and business stakeholders to deliver high‑quality, reliable, and secure data solutions that support enterprise analytics, reporting, and machine learning use cases.
Data Engineering & Architecture
- Design, build, and maintain large‑scale, high‑performance data pipelines using GCP services such as BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, Composer, and Cloud Run
- Develop ETL/ELT workflows using Apache Beam, Spark, or Python
- Implement cloud‑native data architectures including data lakes, data marts, and enterprise data warehouses
- Optimize data models for performance, cost efficiency, and scalability
- Integrate data from a wide variety of internal and external sources, both batch and streaming
- Automate data quality checks, schema validation, and lineage tracking
- Build reusable frameworks and templates to accelerate pipeline development
- Partner with application teams, product owners, and analysts to understand business data needs
- Support Data Scientists by enabling feature pipelines, ML datasets, and efficient training data flows
- Troubleshoot production data pipeline issues and implement long‑term improvements
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances.
If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to . To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy:
- Bachelor’s degree in Computer Science or a related field
- 5+ years of hands‑on experience as a Data Engineer
- Strong expertise with Google Cloud Platform (GCP), especially BigQuery, Dataflow, Pub/Sub, and Cloud Storage
- Proficiency in Python, SQL, and one or more frameworks such as Apache Beam or Spark
- Experience orchestrating workflows with Airflow/Cloud Composer
- Experience with data modeling, distributed systems, and ETL/ELT
- Experience with CI/CD, Terraform, Git, or related DevOps tooling
- Experience with streaming platforms such as Kafka or Pub/Sub
- Knowledge of ML pipelines using Vertex AI
- Familiarity with data governance tools (e.g., Data Catalog, Collibra, Alation)
- GCP Professional Data Engineer certification
- Experience working in enterprise retail or large-scale operational data environments