
Data Engineer

Job in 110006, Delhi, Delhi, India
Listing for: IntraEdge
Full Time position
Listed on 2026-02-04
Job specializations:
  • IT/Tech
    Data Engineer, Big Data, Cloud Computing, Data Warehousing
Job Description
We are seeking a highly skilled Data Engineer with strong experience in Python, PySpark, Snowflake, and AWS Glue to join our growing data team. You will be responsible for building scalable and reliable data pipelines that drive business insights and operational efficiency. This role requires a deep understanding of data modeling, ETL frameworks, and cloud-based data platforms.

Job Title:

Data Engineer

Location:

Remote

Full-Time

Responsibilities:

Design, develop, and maintain scalable ETL/ELT pipelines using PySpark and AWS Glue (an illustrative sketch follows this list).

Build and optimize data models in Snowflake to support reporting, analytics, and machine learning workloads.

Automate data ingestion from various sources including APIs, databases, and third-party platforms.

Ensure high data quality and implement data validation and monitoring processes.

Collaborate with data analysts, data scientists, and other engineers to understand data requirements and deliver high-quality solutions.

Work with large datasets in structured and semi-structured formats (JSON, Parquet, Avro, etc.).

Participate in code reviews, performance tuning, and debugging of data workflows.

Implement security and compliance measures across data pipelines and storage.
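
The pipeline and ingestion work described above typically takes the form of a Glue-managed PySpark job. The sketch below is purely illustrative and not part of this posting; the job parameters, S3 paths, and column names are hypothetical:

    import sys

    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext
    from pyspark.sql import functions as F

    # Job parameters (names are hypothetical) supplied through the Glue job configuration.
    args = getResolvedOptions(sys.argv, ["JOB_NAME", "input_path", "output_path"])

    sc = SparkContext()
    glue_context = GlueContext(sc)
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Ingest semi-structured JSON events landed in S3.
    events = spark.read.json(args["input_path"])

    # Basic data-quality checks: drop records missing the key field, normalise timestamps.
    clean = (
        events
        .filter(F.col("event_id").isNotNull())
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .withColumn("event_date", F.to_date("event_ts"))
    )

    # Write curated, partitioned Parquet back to S3 for downstream reporting loads.
    clean.write.mode("overwrite").partitionBy("event_date").parquet(args["output_path"])

    job.commit()

In practice a job of this kind would also be registered in the Glue Data Catalog and scheduled through a workflow or CI/CD pipeline.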

Required Skills:

5+ years of experience as a Data Engineer or similar role.

Proficiency in Python for data manipulation and scripting.

Experience with PySpark for large-scale data processing.

Strong expertise in Snowflake – data modeling, performance tuning, and SnowSQL (see the illustrative load sketch after this list).

Experience with AWS Glue, including Glue Jobs, Crawlers, and Data Catalog integration.

Experience with other AWS services such as S3, Lambda, Athena, and CloudWatch.

Familiarity with CI/CD pipelines and version control tools (e.g., Git).

Understanding of data warehousing concepts, dimensional modeling, and ETL best practices.

Problem-solving skills and attention to detail.
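
To illustrate how the Snowflake and PySpark skills listed above fit together: a curated dataset is commonly appended to a Snowflake table via the Snowflake Spark connector. The connection options, paths, and table name below are placeholders only; real credentials would come from a secrets manager, not source code:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("snowflake-load").getOrCreate()

    # Placeholder connection options; in practice these would be resolved from
    # AWS Secrets Manager or Glue job parameters, never hard-coded.
    sf_options = {
        "sfURL": "<account>.snowflakecomputing.com",
        "sfUser": "<user>",
        "sfPassword": "<password>",
        "sfDatabase": "ANALYTICS",
        "sfSchema": "CURATED",
        "sfWarehouse": "LOAD_WH",
    }

    # Read the curated Parquet produced upstream (path is hypothetical).
    curated = spark.read.parquet("s3://example-bucket/curated/events/")

    # Append into a Snowflake table using the Snowflake Spark connector
    # (net.snowflake.spark.snowflake must be available on the Spark classpath).
    (curated.write
        .format("net.snowflake.spark.snowflake")
        .options(**sf_options)
        .option("dbtable", "FACT_EVENTS")
        .mode("append")
        .save())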

Must-Have Skills:

Python

PySpark

AWS

AWS Glue

Lambda

S3

Snowflake

Redshift

SQL

DBT