Senior Data Engineer – GCP
Experience: 10+ Years
Job Type: Full-time
Note: This position is open only to candidates currently residing in Hyderabad; outstation profiles will not be considered.
Position Overview:
We are seeking a highly skilled Senior Data Engineer with over 10 years of experience in designing, building, and optimizing large-scale data solutions within the Google Cloud Platform (GCP) ecosystem. The successful candidate will work closely with business stakeholders, data scientists, and application teams to design pipelines and data architectures that feed into BigQuery, and will manage enterprise-wide data governance using Dataplex.
This role requires strong SQL expertise, deep knowledge of GCP-native services, and the ability to ensure data solutions are scalable, reliable, and secure.
Key Responsibilities:
- Design and implement scalable data pipelines to ingest, process, and manage structured and unstructured data in GCP.
- Develop and optimize BigQuery data warehouses for analytics, reporting, and machine learning workloads.
- Leverage Dataplex to establish data governance, metadata management, lineage, and quality frameworks across the enterprise.
- Collaborate with data scientists, analysts, and application developers to enable advanced analytics and AI/ML workloads.
- Implement data security, compliance, and access controls aligned with organizational and regulatory requirements.
- Automate infrastructure and workflows using Terraform, Cloud Composer (Airflow), and CI/CD pipelines.
- Monitor, troubleshoot, and optimize data workflows for cost efficiency, reliability, and performance.
- Document solution designs, data models, and operational procedures to ensure maintainability and knowledge transfer.
- Stay current with emerging GCP services, data engineering best practices, and modern data platform patterns.
Required Skills:
- Minimum of 10 years in data engineering, with at least 5 years in GCP-based environments.
- Strong proficiency in SQL (including query optimization and advanced analytical functions).
- Hands-on experience with BigQuery (partitioning, clustering, optimization, cost control).
- Expertise in Dataplex for governance, metadata, and lifecycle management.
- Experience with ETL/ELT pipeline development using Dataflow, Dataproc, Pub/Sub, Cloud Functions, or Cloud Run.
- Knowledge of data modeling techniques (star schema, snowflake, data vault) and best practices for large-scale analytics platforms.
- Familiarity with Python or Java for building scalable data pipelines.
- Knowledge of Terraform or Deployment Manager for infrastructure as code.
- Experience with CI/CD tools (e.g., Cloud Build, GitLab CI/CD).
- Knowledge of relational and NoSQL databases (e.g., PostgreSQL, Cloud Spanner, Firestore).
Preferred Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field (or equivalent experience).
- Google Cloud Professional Data Engineer or Cloud Architect certification is strongly preferred.
- Exposure to machine learning pipelines and integration with Vertex AI is a plus.
- Experience with real-time data streaming using Pub/Sub or Kafka.
Position Requirements: 10+ years of work experience