GCP Data Engineer (BigQuery, Dataflow, Composer)
Role Description:
- Design and build scalable, secure, and high-performance data pipelines on GCP.
- Develop and optimize ETL/ELT workflows using Cloud Composer, Dataflow, Dataproc, and BigQuery.
- Implement data ingestion frameworks for batch and streaming data (Pub/Sub, Kafka, Dataflow).
- Model, partition, and optimize datasets in BigQuery for analytics use cases.
- Collaborate with data scientists, architects, and business teams to deliver end-to-end data solutions.
- Ensure data quality, reliability, and robustness through monitoring, validation, and automation.
- Implement CI/CD pipelines for data workflows using Cloud Build, Git, and Terraform.
- Optimize cost, performance, and scalability across GCP data services.
- Ensure security best practices, IAM policies, and compliance with organizational standards.
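To illustrate the data-quality and validation work described above, here is a minimal, hedged sketch of a batch validation step such a pipeline might run before loading rows into BigQuery. The field names (`id`, `event_ts`) and the split into a load set and a dead-letter set are hypothetical, not taken from this posting.

```python
from datetime import datetime

# Hypothetical schema: each record needs a non-empty id and an ISO-8601 event_ts.
REQUIRED_FIELDS = ("id", "event_ts")

def validate_record(record: dict) -> bool:
    """Return True if the record passes basic quality checks."""
    if any(not record.get(field) for field in REQUIRED_FIELDS):
        return False
    try:
        # Reject timestamps that are not valid ISO-8601 strings.
        datetime.fromisoformat(record["event_ts"])
    except (TypeError, ValueError):
        return False
    return True

def partition_records(records):
    """Split records into valid rows (to load) and rejects (dead-letter sink)."""
    valid, rejected = [], []
    for record in records:
        (valid if validate_record(record) else rejected).append(record)
    return valid, rejected
```

In a real Dataflow job the same check would typically live inside a `DoFn` with a side output for rejects; this standalone version only shows the validation logic itself.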
Skills:
- Digital: Big Data and Hadoop Ecosystems
- Digital: Google Data Engineering
Note that applications are not being accepted from your jurisdiction for this job currently via this jobsite. Candidate preferences are the decision of the Employer or Recruiting Agent, and are controlled by them alone.