
Data Engineer

Job in 110006, Delhi, Delhi, India
Listing for: Insight Global
Full Time position
Listed on 2026-02-18
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description & How to Apply Below
Insight Global is seeking a Data Engineer to join a global leading life sciences client. The successful candidate will build and optimise end‑to‑end data pipelines in Databricks (PySpark) on the Enterprise Data Platform (AWS), ingesting data from multiple ERP and business systems (including Salesforce, SAP, Oracle, and SQL Server) into the data warehouse to support analytics and BI. Day‑to‑day responsibilities include source‑to‑target analysis and mapping, developing SQL and PySpark transformations, implementing tests, tuning performance, and managing production deployments.

You’ll collaborate within a multi‑vendor environment across time zones, coordinating with PMs and a Scrum Master via Jira for backlog and sprint delivery, and follow Agile and DevOps/DataOps/DevSecOps practices using GitHub for PRs and code reviews.

You’ll also support the evolution of the EDP architecture by working within established frameworks (no net new platform design). The role includes ensuring data quality and integrity across AWS services such as Amazon Redshift and Amazon Athena and requires providing US overlap for meetings and deployment windows.

Please note, this is a long-term contract role and is fully remote.

Must Haves

- Experience delivering data solutions and building data pipelines.
- Hands‑on experience with Databricks and PySpark, plus strong Python and SQL.
- AWS data engineering experience, including building cloud BI solutions.
- Hands‑on with Amazon Redshift and Amazon Athena.
- Experience ingesting from multiple databases/ERPs (e.g., Salesforce, SAP, Oracle, SQL Server); solid understanding of data ingestion patterns.
- Power BI experience, including Power Query for ETL/transformations.
- Agile delivery with DevOps/DataOps/DevSecOps practices; GitHub (branches, PRs, code reviews).
- Proven collaboration in multi‑region/time‑zone teams; excellent written and verbal communication; strong analysis and requirements documentation skills.

Plusses

- Hands‑on with Snowflake and/or Azure data engineering services.
- Knowledge of SQL and NoSQL stores (e.g., PostgreSQL, MySQL, MongoDB, Cassandra).
- Additional experience building pipelines in Databricks (advanced features, Delta Lake, Unity Catalog).
- Data visualisation with Power BI and/or Tableau beyond core reporting.
- Knowledge of data governance, data quality, security, and related best practices.
- Relevant cloud/data engineering certifications (AWS, Azure, Databricks).