
Informatica ETL Developer

Job in Bengaluru (Bangalore), 560001, Karnataka, India
Listing for: Tata Consultancy Services
Full Time position
Listed on 2026-02-04
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing, Data Analyst, Big Data
Job Description & How to Apply Below
Location: Bengaluru

Dear Candidates,
Greetings from TCS!
TCS is looking for an Informatica ETL Developer.

Experience: 5-8 years

Location: PAN India

Required Technical Skill Set: Informatica ETL Developer

Must-have skills:
  • Experience in ETL development with Informatica PowerCenter (experience with BDM/CDI preferred).
  • Strong knowledge of PL/SQL, Greenplum, and data warehousing concepts.
  • Hands-on experience with the Hadoop ecosystem (HDFS, Hive) and Spark for big data processing.
  • Familiarity with Kafka and real-time streaming ETL.
  • Experience with Unix/Linux scripting, scheduling tools, and workflow automation.
  • Understanding of data governance, metadata management, and compliance (GDPR, PII masking).

Good to have:
  • Exposure to cloud-native ETL architectures and containerization (Kubernetes).

Roles and responsibilities:

The Senior Informatica ETL Developer will design, develop, and optimize ETL workflows to support the enterprise data ecosystem, which includes a Greenplum Data Warehouse, an HDFS-based Data Lake, and real-time streaming pipelines. This role ensures efficient data integration, high performance, and compliance with governance standards while enabling analytics and BI platforms such as MicroStrategy, Power BI, and Tableau.
  • Design, develop, and maintain ETL workflows using Informatica PowerCenter and Informatica BDM/CDI for batch and streaming data ingestion.
  • Integrate data from structured, semi-structured, and unstructured sources into the Greenplum Data Warehouse and HDFS Data Lake.
  • Collaborate with data architects, BI teams, and data governance teams to ensure alignment with architecture and compliance requirements.
  • Implement error handling, logging, recovery mechanisms, and data quality checks.
  • Perform performance tuning of ETL processes and SQL queries, and optimize for MPP engines.
  • Support metadata management and lineage using Informatica EDC.
  • Provide production support, troubleshoot ETL failures, and ensure pipeline observability.
  • Contribute to real-time data integration leveraging Kafka and streaming frameworks.