Role Overview
We are looking for a Senior Data Engineer with 4+ years of experience building scalable data pipelines and strong hands-on expertise in Google Cloud Platform (GCP). The ideal candidate will work closely with data science, analytics, and business teams to design, build, and optimize reliable data solutions.
Key Responsibilities
Design, develop, and maintain scalable ETL/ELT data pipelines on GCP.
Build and optimize BigQuery datasets, views, and partitioned and clustered tables.
Develop batch and near-real-time pipelines using Dataflow / Apache Beam.
Ingest data from multiple sources (APIs, databases, files, streaming systems).
Implement data quality checks, validation, reconciliation logic, and monitoring in Python.
Optimize query performance, reliability, and GCP cost usage.
Work with Cloud Storage, Pub/Sub, and Composer (Airflow) for orchestration.
Collaborate with data scientists and analysts to support ML and BI use cases.
Ensure security, IAM, and governance best practices.
Support production deployments, monitoring, and troubleshooting.
Participate in data architecture and design discussions, proposing best practices and improvements.
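The quality-check and reconciliation work described above might look something like this minimal Python sketch (the function names, schema fields, and tolerance threshold are illustrative assumptions, not details from the posting):

```python
# Hypothetical sketch of pipeline data-quality and reconciliation logic.
# Field names ("id", "amount") and the tolerance value are assumptions for
# illustration only.

def reconcile_counts(source_count: int, target_count: int,
                     tolerance: float = 0.0) -> bool:
    """Return True if the target row count is within `tolerance`
    (as a fraction of the source count) of the source row count."""
    if source_count == 0:
        return target_count == 0
    drift = abs(source_count - target_count) / source_count
    return drift <= tolerance


def run_quality_checks(rows: list[dict]) -> list[str]:
    """Validate ingested rows; return human-readable failure messages."""
    failures = []
    for i, row in enumerate(rows):
        if row.get("id") is None:
            failures.append(f"row {i}: missing id")
        if not isinstance(row.get("amount"), (int, float)):
            failures.append(f"row {i}: non-numeric amount")
    return failures


# Example: one clean row and one bad row in a loaded batch.
batch = [{"id": 1, "amount": 9.5}, {"id": None, "amount": "x"}]
print(run_quality_checks(batch))
print(reconcile_counts(source_count=1000, target_count=1000))
```

In practice checks like these would run as a pipeline step (e.g. an Airflow task) and raise or alert on failures rather than print them.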
Programming Languages
SQL
Python
PySpark