ETL Developer

Job in 600001, Chennai, Tamil Nadu, India
Listing for: Tag
Full Time position
Listed on 2026-02-04
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Warehousing, Data Science Manager
Job Description
Position Summary:

We are seeking a highly skilled ETL Developer with 5–8 years of experience in data integration, transformation, and pipeline optimization. This role is a key part of our Data Engineering function within the Business Intelligence team, responsible for enabling robust data flows that power enterprise dashboards, analytics, and machine learning models. The ideal candidate has strong SQL and scripting skills, hands-on experience with cloud ETL tools, and a passion for building scalable data infrastructure.

Education Qualification:

- B.Tech (CS, Elec), MCA, or higher.

Key Responsibilities:

- Design, develop, and maintain ETL pipelines that move and transform data across internal and external systems.
- Collaborate with data analysts, BI developers, and data scientists to support reporting, modeling, and insight generation.
- Build and optimize data models and data marts to support business KPIs and self-service BI.
- Ensure data quality, lineage, and consistency across multiple source systems.
- Monitor and tune the performance of ETL workflows; troubleshoot bottlenecks and failures.
- Support the migration of on-premises ETL workloads to cloud data platforms (e.g., Snowflake, Redshift, BigQuery).
- Implement and enforce data governance, documentation, and operational best practices.
- Work with DevOps/DataOps teams to implement CI/CD for data pipelines.
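
For illustration only (not a requirement of the role), a minimal sketch of the kind of daily ETL pipeline described above, written with the Apache Airflow TaskFlow API (Airflow 2.4+ assumed for the schedule argument); the file paths, table name, and helper logic are hypothetical, and pandas stands in for the real extract/load connectors:

    # Illustrative sketch only -- all paths and names are hypothetical.
    from datetime import datetime

    import pandas as pd
    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def orders_etl():
        @task
        def extract() -> str:
            # Pull raw order data from a source system (a CSV stand-in here).
            df = pd.read_csv("/data/raw/orders.csv")
            df.to_parquet("/data/staging/orders.parquet")
            return "/data/staging/orders.parquet"

        @task
        def transform(path: str) -> str:
            # Clean and de-duplicate into a shape ready for the warehouse.
            df = pd.read_parquet(path)
            df = df.dropna(subset=["order_id"]).drop_duplicates("order_id")
            out = "/data/curated/orders_clean.parquet"
            df.to_parquet(out)
            return out

        @task
        def load(path: str) -> None:
            # Placeholder load step; a real pipeline would use a warehouse
            # hook/operator (e.g., Snowflake, Redshift, BigQuery).
            print(f"would COPY {path} into analytics.fact_orders")

        load(transform(extract()))


    orders_etl()

In practice the pandas stand-ins would be replaced by the platform's own connectors, and the DAG would be deployed through the CI/CD process mentioned above.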

Required Qualifications:

- 5–8 years of hands-on experience in ETL development or data engineering roles.
- Advanced SQL skills and experience with data wrangling on large datasets.
- Proficient with at least one ETL tool (e.g., Informatica, Talend, AWS Glue, SSIS, Apache Airflow, or Domo Magic ETL).
- Familiarity with data modeling techniques (star/snowflake schemas, dimensional models).
- Experience working with cloud data platforms (e.g., AWS, Azure, GCP).
- Strong understanding of data warehouse concepts, performance optimization, and data partitioning.
- Experience with Python or scripting languages for data manipulation and automation.
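
Purely as a hedged sketch of the data wrangling and dimensional modeling skills listed above (column and table names are hypothetical), splitting a flat extract into a star-schema customer dimension and orders fact with pandas:

    # Illustrative sketch only -- columns and file names are hypothetical.
    import pandas as pd

    orders = pd.read_csv("orders_flat.csv")  # flat source extract

    # Customer dimension: one row per customer plus a surrogate key.
    dim_customer = (
        orders[["customer_id", "customer_name", "customer_region"]]
        .drop_duplicates("customer_id")
        .reset_index(drop=True)
    )
    dim_customer["customer_key"] = dim_customer.index + 1

    # Fact table: measures keyed by the surrogate key from the dimension.
    fact_orders = orders.merge(
        dim_customer[["customer_id", "customer_key"]], on="customer_id", how="left"
    )[["order_id", "order_date", "customer_key", "quantity", "amount"]]

    print(dim_customer.head())
    print(fact_orders.head())

The same split is typically expressed as SQL or dbt models inside the warehouse; the pandas version is only meant to show the shape of the output.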

Preferred Qualifications:

- Exposure to BI platforms like Domo, Power BI, or Tableau.
- Knowledge of CI/CD practices in a data engineering context (e.g., Git, Jenkins, dbt).
- Experience working in Agile/Scrum environments.
- Familiarity with data security and compliance standards (GDPR, HIPAA, etc.).
- Experience with API integrations and external data ingestion.
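
As a final illustrative sketch of the API-ingestion item above (the endpoint URL, auth header, and pagination scheme are all assumptions, not a real service):

    # Illustrative sketch only -- endpoint, token, and paging are hypothetical.
    import json

    import requests

    BASE_URL = "https://api.example.com/v1/events"
    HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credential

    def ingest(out_path: str = "events.jsonl") -> int:
        """Page through the endpoint and land records as JSON lines."""
        page, written = 1, 0
        with open(out_path, "w", encoding="utf-8") as fh:
            while True:
                resp = requests.get(BASE_URL, headers=HEADERS,
                                    params={"page": page}, timeout=30)
                resp.raise_for_status()
                records = resp.json().get("data", [])
                if not records:
                    break
                for rec in records:
                    fh.write(json.dumps(rec) + "\n")
                    written += 1
                page += 1
        return written

    if __name__ == "__main__":
        print(f"ingested {ingest()} records")

The landed JSON-lines file would then feed the same pipelines and data-quality checks described under the responsibilities above.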