AWS Cloud Data Engineer
Listed on 2026-02-28
IT/Tech
Data Engineer, Cloud Computing, Big Data, Database Administrator
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Analytics Solutions, is seeking the following. Apply via Dice today!
Job Title: AWS Cloud Data Engineer
Location: Hartford, CT (Hybrid)
We are seeking an experienced AWS Cloud Data Engineer with strong expertise in MongoDB, PostgreSQL, Kafka, and ETL development, along with mandatory experience in the healthcare domain. The ideal candidate will be responsible for building scalable cloud-based data pipelines, designing transformation layers, and ensuring efficient data movement from streaming platforms into target systems.
- Cloud Data Engineering
- Design, develop, and maintain scalable data pipelines on AWS.
- Load and process streaming data from Kafka into MongoDB and PostgreSQL tables.
- Build reliable and high-performance data ingestion frameworks.
- Ensure data quality, integrity, and validation throughout the pipeline.
- Develop ETL processes to transform raw data into curated datasets.
- Design and implement transformation layers for downstream reporting and analytics.
- Optimize data workflows for performance and scalability.
- Manage batch and near real-time data processing.
- Work extensively with MongoDB and PostgreSQL.
- Design schemas, optimize queries, and manage indexing strategies.
- Monitor and tune database performance.
- Healthcare Domain
- Work with healthcare data such as claims, EHR, FHIR, HL7, or clinical datasets.
- Ensure compliance with healthcare regulations and data security standards (HIPAA preferred).
- Maintain secure handling of PHI data.
- Work closely with data architects, analysts, and application teams.
- Support production deployments and troubleshoot data-related issues.
- Participate in code reviews and follow DevOps best practices.
Required Skills & Qualifications
- 8+ years of experience in Data Engineering.
- Strong hands-on experience in AWS Cloud (Mandatory).
- Experience with Kafka for streaming data ingestion.
- Strong knowledge of MongoDB and PostgreSQL.
- Solid experience in ETL development and transformation layer design.
- Strong SQL skills.
- Proficiency in Python or Scala.
- Healthcare domain experience (Mandatory).
- Understanding of data security and compliance (HIPAA knowledge preferred).
- Experience with AWS services like S3, Glue, Lambda, EMR, Redshift.
- Exposure to CI/CD pipelines and Infrastructure as Code.
- Experience with FHIR/HL7 data models.