Experience: 3.5-12 years
Location: Hyderabad
Skills: ETL, DWH, SQL, GCP (BigQuery, Dataflow, Cloud Storage, Cloud Composer), dbt, Collibra
Please share your resume to:
Job Description:
Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related technical field.
At least 4 years of hands-on experience in data engineering, with strong expertise in data warehousing, pipeline development, and analytics on cloud platforms.
Expert-level experience in the following (a short sketch after this list illustrates how these pieces typically fit together):
Google BigQuery for large-scale data warehousing and analytics.
Python for data processing, orchestration, and scripting.
SQL for data wrangling, transformation, and query optimization.
dbt for developing modular and maintainable data transformation layers.
Airflow / Cloud Composer for workflow orchestration and scheduling.
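As a brief illustration of how this stack typically fits together, here is a minimal sketch of an Airflow/Cloud Composer DAG that schedules a BigQuery transformation. The project, dataset, table, and task names (my-project, analytics, orders, daily_orders) are placeholders, not specifics of this role.

# Illustrative sketch only: a daily Airflow DAG that runs a BigQuery
# rollup query and overwrites a placeholder reporting table.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_orders_rollup",  # hypothetical pipeline name
    schedule="@daily",             # Airflow >= 2.4; older versions use schedule_interval
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_orders",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, SUM(amount) AS total_amount
                    FROM `my-project.analytics.orders`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "daily_orders",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )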
Proven experience building enterprise-grade ETL/ELT pipelines and scalable data architectures.
Strong understanding of data quality frameworks, validation techniques, and governance processes (a brief validation sketch follows).
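For illustration, a minimal validation pass of the kind such frameworks automate might look like the sketch below; the table, columns, and check names are hypothetical.

# Illustrative sketch only: simple row-count data quality checks run
# against a placeholder BigQuery table before publishing downstream.
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials

checks = {
    "null_order_ids":
        "SELECT COUNT(*) FROM `my-project.analytics.orders` WHERE order_id IS NULL",
    "negative_amounts":
        "SELECT COUNT(*) FROM `my-project.analytics.orders` WHERE amount < 0",
}

for name, sql in checks.items():
    # Each query returns a single row with a single count column.
    bad_rows = next(iter(client.query(sql).result()))[0]
    if bad_rows:
        raise ValueError(f"data quality check '{name}' failed: {bad_rows} offending rows")
    print(f"data quality check '{name}' passed")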
Proficiency in Agile methodologies (Scrum/Kanban) and managing IT backlogs in a collaborative, iterative environment.
Preferred experience with:
Tools like Ascend.io, Databricks, Fivetran, or Dataflow.
Data cataloging/governance tools (e.g., Collibra).
CI/CD tools, Git workflows, and infrastructure automation.
Real-time/event-driven data processing using Pub/Sub, Kafka, or similar platforms (see the sketch after this list).
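As a rough sketch under assumed names (my-project and orders-sub are placeholders), a minimal Pub/Sub pull subscriber in Python might look like:

# Illustrative sketch only: subscribe to a placeholder Pub/Sub
# subscription and acknowledge each event as it arrives.
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "orders-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Process the event payload, then ack so it is not redelivered.
    print(f"received event: {message.data.decode('utf-8')}")
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # listen for 30 seconds, then stop
except TimeoutError:
    streaming_pull.cancel()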