
Data Engineer

Job in 110006, Delhi, Delhi, India
Listing for: IntraEdge
Full Time position
Listed on 2026-02-20
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Big Data
Job Description:
Job Title: Senior Snowflake Data Engineer
Location: Remote
Experience: 7+ Years
Job Type: Full-Time

About the Role:

We are seeking a highly skilled Senior Snowflake Data Engineer with strong experience in Python, PySpark, Snowflake, and AWS Glue to join our growing data team. You will be responsible for building scalable, reliable data pipelines that drive business insights and operational efficiency. This role requires a deep understanding of data modeling, ETL frameworks, and cloud-based data platforms.

Key Responsibilities:

Design, develop, and maintain scalable ETL/ELT pipelines using PySpark and AWS Glue.
Build and optimize data models in Snowflake to support reporting, analytics, and machine learning workloads.
Automate data ingestion from various sources, including APIs, databases, and third-party platforms.
Ensure high data quality and implement data validation and monitoring processes.
Collaborate with data analysts, data scientists, and other engineers to understand data requirements and deliver high-quality solutions.
Work with large datasets in structured and semi-structured formats (JSON, Parquet, Avro, etc.).
Participate in code reviews, performance tuning, and debugging of data workflows.
Implement security and compliance measures across data pipelines and storage.
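As a flavor of the data-quality work described above, a pipeline typically validates each incoming semi-structured record before loading it, routing bad rows to a quarantine location rather than failing the whole job. The sketch below is purely illustrative (the field names and types are assumptions, not part of this role's actual schema):

```python
import json

# Hypothetical required schema for incoming JSON records;
# field names and types are illustrative only.
REQUIRED_FIELDS = {"id": int, "event_time": str, "amount": float}

def validate_record(raw: str):
    """Parse one JSON record and check required fields and types.

    Returns (record, None) on success, or (None, error_message) on
    failure so the caller can quarantine the bad row and continue.
    """
    try:
        record = json.loads(raw)
    except json.JSONDecodeError as exc:
        return None, f"invalid JSON: {exc}"
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            return None, f"missing field: {field}"
        if not isinstance(record[field], expected_type):
            return None, f"bad type for field: {field}"
    return record, None
```

In a PySpark or Glue job, a check like this would usually run per partition, with rejected rows written to a separate error path for monitoring.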

Required Skills & Qualifications:

7+ years of experience as a Data Engineer or in a similar role.
Proficiency in Python for data manipulation and scripting.
Hands-on experience with PySpark for large-scale data processing.
5+ years of expertise in Snowflake, including data modeling, performance tuning, and SnowSQL.
Solid experience with AWS Glue, including Glue Jobs, Crawlers, and Data Catalog integration.
Experience with other AWS services such as S3, Lambda, Athena, and CloudWatch.
Familiarity with CI/CD pipelines and version control tools (e.g., Git).
Understanding of data warehousing concepts, dimensional modeling, and ETL best practices.
Excellent problem-solving skills and attention to detail.

Preferred Qualifications:

AWS Certification (e.g., AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect).
Experience with orchestration tools such as Apache Airflow or AWS Step Functions.
Exposure to DevOps practices and infrastructure-as-code (e.g., Terraform or CloudFormation).
Experience working in Agile/Scrum environments.