
Senior Data Engineer

Job in 201301, Noida, Uttar Pradesh, India
Listing for: Varahe Analytics Pvt Ltd
Full Time position
Listed on 2026-02-17
Job specializations:
  • IT/Tech
    Data Engineer, Data Security
Job Description - Senior Data Engineer
Department - Analytics & Engineering
Team - Quantitative Analytics
Location - Noida
Experience - 4+ years

About Varahe Analytics
Varahe Analytics is India’s leading integrated political consulting firm, delivering data-driven, end-to-end election management solutions. We combine advanced analytics, strategic advisory, and deep on-ground intelligence to design and execute 360-degree electoral strategies. Our work spans research and insights, field outreach, media and communications, and campaign operations - built to influence narratives, improve decision-making, and drive measurable outcomes. Backed by a high-calibre, multidisciplinary team from premier institutions, we partner closely with political leadership to deliver high-impact programmes at scale.

Role Overview
As a Senior Data Engineer (L1) at Varahe Analytics, you will build and own reliable, scalable data workflows that power analysis, modelling, reporting, and leadership decision-making. You will integrate data from multiple sources (field, survey, electoral, digital, and internal systems), standardize it into trusted datasets, and ensure teams operate on consistent metric definitions and a clear single source of truth.
This is a hands-on role with strong ownership: you will ensure day-to-day pipeline health, implement structured reliability improvements, and set the bar on documentation, review practices, traceability, and governance - especially important given the sensitivity of political and citizen-level data. You will also mentor junior engineers and drive operational discipline as the data function scales.

What You'll Do
Build production-grade data pipelines that support polling, constituency analytics, modelling, dashboards, and reporting.
Integrate and harmonize data from multiple sources, ensuring clean joins and consistent identifiers.
Create trusted datasets and curated marts that teams can use repeatedly without rework - reducing manual effort and improving turnaround time.
Strengthen governance: access controls, secure sharing, auditability, and compliance-aligned handling of sensitive data.
Raise engineering standards: code reviews, testing discipline, documentation, runbooks, and reproducible workflows.
Mentor junior engineers through pairing, design reviews, and structured coaching.

Key Responsibilities
Data Platform & Architecture
Architect, build, and operate scalable, fault-tolerant data pipelines for structured and unstructured data.
Drive operational excellence: SLAs, monitoring, incident response, and pipeline performance tuning.
Design and maintain data lakes/warehouses and ELT/ETL frameworks across cloud and hybrid setups.
Own and evolve the AWS-based data architecture: design secure, scalable pipelines and storage/compute patterns, optimize cost and performance, and ensure reliable deployments through monitoring, alerting, and infrastructure best practices.
Governance, Security & Quality
Establish best practices for data lineage, access controls, secure handling, documentation, and observability.
Implement automated validation checks, anomaly detection, and runbooks; conduct root-cause analysis (RCA) for recurring issues.
Stakeholder Partnership
Partner closely with analysts, application engineers, and strategy/campaign teams to translate business needs into robust datasets and metrics.
Ensure consistent metric definitions and prevent multiple versions of the truth.

Must-Have Qualifications
5+ years of hands-on Data Engineering experience with a proven track record of building and operating production systems.
Hands-on experience designing and operating scalable, secure data architectures on AWS, including core services such as S3, IAM, Glue, EMR/EKS/ECS, Lambda, Redshift, and CloudWatch, with a strong understanding of VPC networking, security best practices, cost optimization, fault tolerance, and high availability.
Advanced proficiency in Python and SQL.
Strong experience with big-data/distributed tooling such as Airflow, Hive, and Presto/Trino (or equivalents).
Hands-on expertise with lake/warehouse architectures: Delta Lake, Snowflake, BigQuery, Redshift (or similar).
Strong understanding of ETL/ELT, data modelling (star/snowflake schemas), orchestration patterns, and engineering best practices.
Exposure to streaming/real-time systems: Kafka, Kinesis, Pub/Sub, Spark Structured Streaming, Flink, etc.

Experience with CI/CD for data systems, containerization, and infrastructure-as-code (e.g., Docker, Terraform).
Strong ownership mindset - ability to run systems end-to-end and improve reliability over time.

Good-to-Have Qualifications
Familiarity with PII/consent-sensitive data handling, access governance, and audit logging practices.
Working knowledge of GIS concepts (admin boundaries, ward/booth mapping, spatial joins) and tools (PostGIS/QGIS).

What Success Looks Like (First 180 Days)
Junior engineers are enabled through documentation, templates, and review discipline.
The existing codebase has been reviewed, with improvements proposed wherever necessary.
Critical…