Lead AWS Data Engineer (Lakehouse / Medallion Architecture)
Location: Hyderabad, India
Industry: Insurance / Financial Services
Experience Required: 8–15 Years
Mode of Work: 5 Days Work from Office
We are looking for experienced AWS Data Engineers with deep hands-on expertise in building scalable, resilient data platforms using Lakehouse / Medallion architecture (Bronze / Silver / Gold).
This role is for engineers who understand not just how to build pipelines — but how to build them correctly, cleanly, and sustainably at scale.
What You Will Do
Design and operate scalable data pipelines using PySpark/Spark, Python, and Advanced SQL
Implement Medallion architecture (Bronze/Silver/Gold) with clear separation of concerns
Design incremental processing, SCD Type 2, CDC/Merge logic
Handle late-arriving & out-of-sequence data
Build idempotent, resilient, and re-processable pipelines
Provision and manage AWS infrastructure using Terraform
Apply strong data quality, schema validation, and operational best practices
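To give a flavour of the SCD Type 2 / merge logic mentioned above, here is a minimal, stdlib-only Python sketch. All column names (`customer_id`, `address`, `valid_from`, `valid_to`) are illustrative; a production pipeline would express the same pattern as a PySpark `MERGE` against a Delta/Iceberg silver table rather than plain dicts:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended validity marker for the current row

def scd2_merge(dim_rows, incoming, key="customer_id", as_of=None):
    """Apply SCD Type 2 logic: when a tracked attribute changes,
    close the current row (set valid_to) and insert a new current row."""
    as_of = as_of or date.today()
    current = {r[key]: r for r in dim_rows if r["valid_to"] == HIGH_DATE}
    out = [r for r in dim_rows if r["valid_to"] != HIGH_DATE]  # keep history
    touched = set()
    for rec in incoming:
        k = rec[key]
        touched.add(k)
        old = current.get(k)
        if old and old["address"] == rec["address"]:
            out.append(old)                      # unchanged: keep current row
            continue
        if old:
            out.append(dict(old, valid_to=as_of))  # close the old version
        out.append({key: k, "address": rec["address"],
                    "valid_from": as_of, "valid_to": HIGH_DATE})
    # keys absent from this batch keep their current row untouched
    out.extend(r for k, r in current.items() if k not in touched)
    return out
```

Because the function derives the output purely from its inputs and the `as_of` date, replaying the same batch reproduces the same dimension state, which is the behaviour the "re-processable pipelines" bullet asks for.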
Required Technical Skills
✔ Python + Advanced SQL
✔ PySpark / Spark
✔ AWS data services (S3, Glue, EMR, Lambda, Step Functions, etc.)
✔ Infrastructure as Code (IaC) for AWS using Terraform
✔ Lakehouse / Medallion Architecture
✔ Type 2 Dimensions, CDC/Merge processing
✔ Incremental & historical processing patterns
✔ Deterministic deduplication techniques
✔ Data modeling fundamentals
✔ Strong operational & data quality mindset
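As a sketch of the incremental-processing pattern listed above, the following stdlib-only Python illustrates a watermark-driven upsert. Names (`updated_at`, `id`) are hypothetical; a real pipeline would persist the watermark in a control table and write through an atomic MERGE rather than an in-memory dict:

```python
def incremental_load(source, target, watermark):
    """Pull only rows newer than the stored watermark and upsert them
    by primary key, so re-running the same batch is a no-op (idempotent)."""
    batch = [r for r in source if r["updated_at"] > watermark]
    for r in batch:
        target[r["id"]] = r  # upsert: insert new key or overwrite existing
    # advance the watermark only if the batch actually contained rows
    return max((r["updated_at"] for r in batch), default=watermark)
```

Running the function twice over the same source leaves the target and watermark unchanged on the second pass, which is exactly the idempotence property the role description emphasises.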
You Must Be Able to Clearly Explain
Handling updates/deletes from relational sources
Deterministic deduplication logic
Late-arriving & out-of-sequence data strategies
Idempotent pipeline design
Merge keys & ordering columns
Event time vs. ingest time
Soft deletes & reprocessing windows
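To make a few of the interview topics above concrete (merge keys, ordering columns, event time vs. ingest time, late-arriving data), here is a hedged stdlib-only sketch of deterministic deduplication: one survivor per merge key, chosen by event time with ingest time and a stable record id as tie-breakers, so reprocessing the same input always yields the same output. All field names are illustrative:

```python
def deduplicate(records, key="order_id"):
    """Keep exactly one record per key, chosen deterministically:
    latest event_time wins; ties fall back to ingest_time, then a
    stable record_id, so the result is independent of arrival order."""
    survivors = {}
    for rec in records:
        rank = (rec["event_time"], rec["ingest_time"], rec["record_id"])
        k = rec[key]
        if k not in survivors or rank > survivors[k][0]:
            survivors[k] = (rank, rec)
    # sort output by key so downstream diffs are stable across runs
    return sorted((r for _, r in survivors.values()), key=lambda r: r[key])
```

Note how a late-arriving record (high ingest time, old event time) loses to the record with the newer event time, regardless of which one the pipeline saw first.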
We value engineers who understand why patterns matter, not just how to implement them.
What Makes This Role Exciting
Work on enterprise-grade data platforms
Build clean, scalable, cloud-native architectures
Influence engineering standards & best practices
Opportunity to work in insurance and financial data ecosystems
High visibility & impact within a growing global delivery model
If you are passionate about building robust, production-grade AWS data pipelines and want to work on real enterprise transformation programs — let’s connect.
DM me or apply directly.
#Hiring #AWS #DataEngineering #PySpark #Terraform #Lakehouse #MedallionArchitecture #BigData #InsuranceTech #CloudData
About Value Momentum
Value Momentum is a leading Insurance-focused IT Services and Solutions company, partnering with carriers across the US, UK, and Canada to drive modernization and digital transformation. With deep domain expertise in Property & Casualty insurance, we combine industry knowledge with strong engineering capabilities across core systems, data & analytics, cloud, and quality engineering.