Data Engineer
Ridgefield, Fairfield County, Connecticut, 06877, USA
Listed on 2026-01-01
IT/Tech
Data Engineer, Cloud Computing
Location: Ridgefield, Connecticut, United States
Work Arrangement: Flexible work-from-home days (onsite 2-3x per week)
Openings: 2
Step into the future with our Enterprise Data, AI & Platforms (EDP) team! At our company, we harness Data & AI to transform healthcare, positively impacting the lives of millions of patients and animals. As part of the EDP team, you will contribute to building a strong data-driven culture, drive key data transformation initiatives, and shape the future of decision-making across our global organization.
We are seeking a highly skilled and experienced Data Engineer to design, build, and maintain scalable data infrastructure on a cloud platform. You will be responsible for data pipelines, ETL processes, and overall data architecture strategy, ensuring data availability, quality, and integrity for business stakeholders and analytics teams.
- Design, develop, and maintain scalable data pipelines and ETL/ELT processes.
- Collaborate with data architects, modelers, IT, and business stakeholders to define and evolve cloud-based data architecture.
- Optimize data storage solutions (e.g., S3, Snowflake, Redshift), ensuring data integrity, security, and accessibility.
- Implement data quality, validation processes, and monitoring frameworks.
- Maintain documentation for data workflows, architecture, and pipeline processes.
- Troubleshoot and optimize data pipeline performance.
- Engage with clients and stakeholders to analyze requirements and recommend data solutions.
- Stay current with emerging technologies and industry trends in cloud and data engineering.
- Associate degree in Computer Science/MIS (4+ years experience) or Bachelor's (2+ years) or Master's (1+ year) in related field.
- Hands‑on experience with AWS services (Glue, Lambda, Athena, Step Functions, Lake Formation).
- Proficiency in Python and SQL.
- Familiarity with DevOps/CI/CD principles and project lifecycle methodologies.
- Moderate knowledge of cloud platforms (AWS, Azure, GCP) and data integration concepts.
- Associate degree (8+ years experience) or Bachelor's (4+ years) or Master's (2+ years) in relevant field.
- Expert‑level experience in cloud platforms, preferably AWS.
- Advanced SQL skills, data modeling, and data warehousing concepts (Kimball, star/snowflake schemas).
- Experience with big data frameworks (Spark, Hadoop, Flink) and relational/NoSQL databases.
- Hands‑on experience with ETL/ELT tools (Airflow, dbt, AWS Glue).
- Knowledge of DevOps/CI/CD for data solutions.
- 4+ years of progressive data engineering experience with cloud-based data platforms.
- Understanding of data governance, data quality, and metadata management.
- Familiarity with Snowflake and dbt (data build tool).
- Strong problem‑solving skills in pipeline troubleshooting and optimization.
- AWS Solutions Architect certification is a plus.
- AWS services: Glue, Lambda, Athena, Step Functions, Lake Formation
- Programming languages: Python, SQL