Google Cloud Data Migration Lead
Remote / Online - Candidates ideally in Dallas, Dallas County, Texas, 75215, USA
Listing for: Robotics Process Automation, LLC
Seasonal/Temporary, Remote/Work from Home position
Listed on 2026-01-01
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing
Job Description
Position: Google Cloud Data Migration Lead
Location: Remote
Duration: Long-term contract, 18 months with possible extension
Hourly Rate: Depending on Experience (DOE)
Work Authorization:
Seeking a Google Cloud data engineer to design, build, and maintain scalable and efficient data processing systems on Google Cloud Platform. This engineer will be responsible for the entire data lifecycle, from ingestion and storage through processing, transformation, and analysis, enabling client organizations to make data-driven decisions by providing clean, high-quality data to business intelligence tools, AI systems, analysts, and data scientists.
Key Responsibilities:
- Serve as Data Migration team leader for a large data and application migration to the Google Cloud platform.
- Own the team's end-to-end data architecture and migration planning, supporting both the migration effort and future-state client efforts on the Google Cloud platform.
- Collaborate closely with the overall Google Cloud migration team leadership to deliver a successful application and data migration.
- Design and build data pipelines: develop and maintain reliable and scalable batch and real-time data pipelines using Google Cloud Platform tools such as Cloud Dataflow (based on Apache Beam), Cloud Pub/Sub, and Cloud Composer (for Apache Airflow); a minimal streaming pipeline sketch follows this list.
- Create and manage data storage solutions: implement data warehousing and data lake solutions using Google Cloud Platform products such as BigQuery, Cloud Storage, and other transactional or NoSQL databases such as Cloud SQL or Bigtable.
- Ensure data quality and integrity: develop and enforce procedures for data governance, quality control, and validation throughout the data pipeline to ensure data is accurate and reliable.
- Optimize performance and cost: monitor data infrastructure and pipelines to identify and resolve performance bottlenecks, ensuring that all data solutions are cost-effective and scalable.
- Collaborate with other teams: work closely with data scientists, analysts, and business stakeholders to gather requirements and understand data needs, translating them into technical specifications.
- Automate and orchestrate workflows: automate data processes and manage complex workflows using tools like Cloud Composer; see the Cloud Composer (Airflow) DAG sketch after this list.
- Implement security: design and enforce data security and access controls using Google Cloud Platform Identity and Access Management (IAM) and other best practices.
- Maintain documentation: create and maintain clear documentation for data pipelines, architecture, and operational procedures.
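To make the pipeline responsibility above concrete, the following is a minimal sketch of a streaming Cloud Dataflow (Apache Beam) pipeline in Python that reads events from Cloud Pub/Sub and appends them to BigQuery. The project, subscription, table, and field names are hypothetical placeholders and not details from this listing.

# Minimal streaming Beam pipeline sketch (hypothetical names throughout).
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message payload into a BigQuery row (assumed schema)."""
    event = json.loads(message.decode("utf-8"))
    return {"event_id": event["id"], "event_type": event["type"], "ts": event["ts"]}


def run():
    options = PipelineOptions(streaming=True)  # runner/project/region supplied at launch
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")  # hypothetical
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # hypothetical table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()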
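For the workflow-orchestration bullet, a Cloud Composer deployment could run something like the following Airflow 2.x DAG; the DAG id, bucket, dataset, and stored procedure are illustrative assumptions only, not requirements of this role.

# Sketch of a daily Cloud Composer (Airflow 2.x) DAG: land files into staging, then transform.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_orders_load",          # hypothetical DAG name
    start_date=datetime(2026, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load raw files landed in Cloud Storage into a BigQuery staging table.
    load_staging = GCSToBigQueryOperator(
        task_id="load_staging",
        bucket="example-landing-bucket",                               # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="my-project.staging.order_lines",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

    # Rebuild the curated table with a BigQuery SQL job (hypothetical stored procedure).
    build_curated = BigQueryInsertJobOperator(
        task_id="build_curated",
        configuration={
            "query": {
                "query": "CALL `my-project.curated.build_daily_orders`()",
                "useLegacySql": False,
            }
        },
    )

    load_staging >> build_curated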
Qualifications:
- 8+ years of data engineering experience developing large data pipelines in very complex environments.
- Very strong SQL skills and the ability to build complex transformation data pipelines using a custom ETL framework in a Google BigQuery environment; an illustrative BigQuery transformation sketch follows at the end of this section.
- Very strong understanding of data migration methods and tooling, with hands-on experience in at least three (3) data migrations to Google Cloud.
Hands-on experience with key Google Cloud Platform data services is essential, including:
- BigQuery: for data warehousing and analytics.
- Cloud Dataflow: for building and managing data pipelines.
- Cloud Pub/Sub: for real-time messaging and event ingestion.
- Dataproc: for running Apache Spark and other open-source frameworks.
- Programming languages: strong proficiency in programming languages, most commonly Python, is mandatory; experience with Java or Scala is also preferred.
- SQL expertise: advanced SQL skills for data analysis, transformation, and optimization within BigQuery and other databases.
- ETL/ELT: deep knowledge of Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) processes.
- Infrastructure as Code (IaC): experience with tools like Terraform for deploying and managing cloud infrastructure.
- CI/CD: familiarity with continuous integration and continuous deployment (CI/CD) pipelines using tools such as GitHub Actions or Jenkins.
- Data modeling: understanding of data modeling, data warehousing, and data lake concepts.
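As an illustration of the BigQuery transformation work called out in the qualifications, here is a small ELT-style sketch using the google-cloud-bigquery Python client: raw rows already loaded into a staging table are reshaped into a curated table entirely inside BigQuery. The project, dataset, and table names are hypothetical.

# ELT-style transformation run inside BigQuery from Python (hypothetical names).
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project id

sql = """
CREATE OR REPLACE TABLE `my-project.curated.daily_orders` AS
SELECT
  order_id,
  customer_id,
  DATE(order_ts) AS order_date,
  SUM(line_amount) AS order_total
FROM `my-project.staging.order_lines`
GROUP BY order_id, customer_id, order_date
"""

job = client.query(sql)  # submits the transformation as a BigQuery query job
job.result()             # blocks until the job finishes; raises on SQL errors
print(f"Job {job.job_id} finished; curated.daily_orders rebuilt")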