
Senior ETL Developer

Job in Bengaluru (Bangalore), 560001, Karnataka, India
Listing for: Confidential
Full Time position
Listed on 2026-02-04
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Data Analyst, Data Warehousing
Job Description
Location: Bengaluru

Blitzen X is hiring a Senior ETL Developer to lead enterprise-grade data integration initiatives across complex, high-throughput environments. This is not a maintenance role: we're looking for a builder, a problem-solver, and a technical authority who can design, optimize, and deliver production-ready ETL solutions in real time.

Key Responsibilities:

Design, build, and deploy scalable ETL pipelines across cloud and on-prem environments (a minimal sketch follows this list).
Integrate structured and semi-structured data from diverse sources into centralized data platforms.
Tune performance of large-scale data workflows for speed, resilience, and cost-efficiency.
Collaborate with cross-functional teams to translate business rules into ETL logic.
Enforce best practices in data engineering, governance, and quality assurance.
Perform production support, defect triage, and root cause analysis with urgency and ownership.
Maintain clear, concise technical documentation and mapping specifications.
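For context on the pipeline work described above, here is a minimal sketch of an extract-transform-load flow in Python. The posting does not name a specific stack, so the orders.csv source, the column names, and the SQLite staging target are illustrative assumptions only.

```python
# Minimal ETL sketch. The source file, column names, and SQLite target
# are illustrative assumptions standing in for the real stack.

import csv
import sqlite3


def extract(path):
    """Read source rows from a CSV file with a header row."""
    with open(path, newline="", encoding="utf-8") as f:
        yield from csv.DictReader(f)


def transform(rows):
    """Keep only well-formed records and normalise the amount column."""
    for row in rows:
        try:
            amount = round(float(row["amount"]), 2)
        except (KeyError, ValueError):
            continue  # drop rows that fail this basic quality check
        yield {"order_id": row.get("order_id", ""), "amount": amount}


def load(rows, conn):
    """Insert transformed rows into a staging table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_orders (order_id TEXT, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO stg_orders (order_id, amount) VALUES (:order_id, :amount)",
        rows,
    )
    conn.commit()


if __name__ == "__main__":
    with sqlite3.connect("warehouse.db") as conn:
        load(transform(extract("orders.csv")), conn)
```

A production pipeline would swap the CSV reader and SQLite connection for the actual source connectors and warehouse driver, and add logging, retries, and incremental loading.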

Must-Have Skills:

6+ years of hands-on ETL development using tools such as Informatica, Talend, SSIS, or equivalent.
Expert-level SQL development and performance optimization.
Strong background in RDBMS (Oracle, SQL Server, PostgreSQL) and cloud data platforms (Snowflake, Redshift, BigQuery).
Solid understanding of dimensional modeling and data warehousing best practices.
Proficiency in scripting (Python, Shell) for automation and orchestration.
Familiarity with workflow schedulers like Airflow, Control-M, or Autosys (see the orchestration sketch after this list).
Experience working within Git-based source control and CI/CD pipelines.
Agile delivery mindset and strong communication/documentation skills.
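To illustrate the scheduler familiarity listed above, the sketch below wires the same three steps into an Airflow 2.x DAG. The DAG id, schedule, and placeholder callables are assumptions, not details from the posting.

```python
# Sketch of orchestrating extract/transform/load steps with Airflow 2.x.
# The DAG id, schedule, and placeholder callables are illustrative only.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_task():
    print("pull new rows from the source system")


def transform_task():
    print("apply business rules and quality checks")


def load_task():
    print("write the result to the warehouse")


with DAG(
    dag_id="orders_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_task)
    transform = PythonOperator(task_id="transform", python_callable=transform_task)
    load = PythonOperator(task_id="load", python_callable=load_task)

    # Run the steps strictly in order.
    extract >> transform >> load
```

Control-M or Autosys would express the same dependency chain through their own job definitions rather than Python code.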

Nice-to-Have:

Experience with modern cloud ecosystems (AWS, GCP, Azure).
Exposure to data quality, metadata, or lineage tools (a tool-agnostic sketch follows this list).
Familiarity with Docker/Kubernetes in a DataOps environment.
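Dedicated data-quality, metadata, or lineage tools are not named in the posting; as a minimal, tool-agnostic sketch, similar checks can be written as plain assertions against the staging table from the earlier example (table and column names remain assumptions).

```python
# Tool-agnostic data-quality checks over the staging table from the
# ETL sketch above; column names and rules are assumptions.

import sqlite3


def check_stg_orders(conn):
    """Fail fast if the staged data violates basic expectations."""
    (row_count,) = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()
    assert row_count > 0, "stg_orders is empty"

    (null_ids,) = conn.execute(
        "SELECT COUNT(*) FROM stg_orders WHERE order_id IS NULL OR order_id = ''"
    ).fetchone()
    assert null_ids == 0, f"{null_ids} rows are missing order_id"

    (negative,) = conn.execute(
        "SELECT COUNT(*) FROM stg_orders WHERE amount < 0"
    ).fetchone()
    assert negative == 0, f"{negative} rows have a negative amount"


if __name__ == "__main__":
    with sqlite3.connect("warehouse.db") as conn:
        check_stg_orders(conn)
```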
Position Requirements
10+ years of work experience