
Senior Data Engineer

Job in Snowflake, Navajo County, Arizona, 85937, USA
Listing for: Carrot Fertility, Inc.
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Security
Job Description

Overview

Carrot is a global, comprehensive fertility and family care platform, supporting members and their families through many of life’s most memorable moments. Trusted by leading multinational employers, health plans, and health systems, Carrot’s clinical program delivers outcomes and cost savings for employers. Its products cover preconception care through pregnancy, IVF, male factor infertility, adoption, gestational carrier care, and menopause, with localized support in over 170 countries and 25 languages.

Learn more at

The Role

The Senior Data Engineer owns the evolution, reliability, and scalability of the modern data infrastructure across analytics, reporting, and business intelligence. This role architects, builds, and maintains robust, automated data pipelines; orchestrates workflows; integrates diverse data sources; and partners with stakeholders to deliver secure, compliant, high-quality data solutions. Responsibilities span the full data lifecycle, from architecting resilient ETL/ELT pipelines and developing real-time and batch data integrations to automating reporting, paying down technical debt, and enabling advanced analytics.

Senior Data Engineers are operational stewards and technical mentors, driving improvements in system maturity, cost efficiency, automation, and collaborative practices.

The Team

Our Data Engineering team develops and optimizes scalable data pipelines and cloud infrastructure to enable secure, automated reporting and analytics. We collaborate with partner, product, finance, and legal teams to deliver reliable integrations and self-serve solutions that drive business insights and operational efficiency. The team leads technical incident management, data governance, and ongoing improvements while supporting cross-functional mentorship and stakeholder needs.

Responsibilities

The role includes, but is not limited to, architecting, building, and maintaining robust data pipelines; automating reporting; integrating real-time and batch data sources; and enabling advanced analytics. It also involves driving improvements in automation, cost efficiency, and data governance in collaboration with stakeholders.
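For illustration only, here is a minimal sketch of the kind of automated ELT flow this role would own, written with Prefect (one of the orchestrators named in the qualifications below). The API endpoint, field names, and load step are hypothetical placeholders, not Carrot's actual stack:

```python
# Hedged sketch: a small Prefect 2.x ELT flow. Endpoint and schema are hypothetical.
import requests
from prefect import flow, task

@task(retries=3, retry_delay_seconds=60)
def extract(url: str) -> list[dict]:
    # Pull a batch of records from a (hypothetical) source API.
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()

@task
def transform(records: list[dict]) -> list[dict]:
    # Light structural cleanup; heavier modeling would live in dbt on Snowflake.
    return [r for r in records if r.get("id") is not None]

@task
def load(records: list[dict]) -> None:
    # Placeholder load step; a real flow might stage files to S3 and
    # COPY them into Snowflake.
    print(f"loading {len(records)} records")

@flow(name="nightly-elt")
def nightly_elt(url: str = "https://api.example.com/v1/records") -> None:
    load(transform(extract(url)))

if __name__ == "__main__":
    nightly_elt()
```

Splitting extract, transform, and load into separately retryable tasks is what turns a flow like this into a reusable pipeline component rather than a one-off script.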

Minimum Qualifications
  • Expert-level proficiency in Snowflake, dbt, and Python for data modeling, transformation, pipeline development, and advanced analytics.
  • Advanced SQL skills for complex querying, large-scale data model design, and database optimization.
  • Experience architecting, building, testing, deploying, and maintaining scalable, automated ETL/ELT pipelines using orchestration tools such as Prefect and Airflow.
  • Hands-on administration of cloud-based data warehouse and lake platforms—Snowflake, AWS (S3, Redshift), Google Cloud—including integration of new sources and performance optimization.
  • Strong understanding of secure external data flows through SFTP, managed file transfer (MFT) platforms, file sharing, and APIs, with a focus on reliability and compliance.
  • Mastery of version control (Git, GitHub) and structured workflow management (Jira) for code review, audit trails, and operational transparency.
  • Ability to automate and simplify complex manual tasks, building reusable pipeline components that reduce operational overhead.
  • Depth in orchestrating workflows with custom scheduling, alerting, dependency management, monitoring, and error resolution (see the sketch after this list).
  • Dependability, adaptability, and a collaborative approach to engineering, thriving in dynamic and ambiguous environments.
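To make the orchestration bullet concrete, here is a hedged Airflow 2.x sketch showing a cron schedule, retries, an explicit task dependency, and a failure callback for alerting. The DAG id, schedule, task bodies, and alert hook are all hypothetical:

```python
# Hedged sketch: scheduling, dependency management, and failure alerting in Airflow 2.x.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def notify_on_failure(context):
    # Hypothetical alert hook; a real deployment might page on-call or post to Slack.
    print(f"task {context['task_instance'].task_id} failed")

def extract():
    print("extracting")  # placeholder task body

def load():
    print("loading")  # placeholder task body

with DAG(
    dag_id="reporting_refresh",
    schedule="0 6 * * *",            # custom scheduling: daily at 06:00
    start_date=datetime(2026, 1, 1),
    catchup=False,
    default_args={
        "retries": 2,                              # error resolution via retries
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": notify_on_failure,  # alerting
    },
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # dependency management: load runs only after extract
```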
Preferred Qualifications
  • Experience in dynamic, fast-paced settings (startup or rapid-growth) delivering technical solutions under shifting priorities.
  • Background supporting cross-functional teams—business intelligence, analytics, and data science—enabling large-scale integrations, multi-tenant reporting, and advanced analytics tooling.
  • Familiarity with both batch and event-driven ingestion paradigms, including near-real-time data pipelines with Snowflake, dbt, and Python (a sketch follows this list).
  • Experience automating complex operational, financial, or compliance-driven workflows within cloud data environments.
  • Exposure to audit, privacy, and compliance frameworks (SOC, HIPAA, GDPR, ISO, SOX), along with data governance and secure access controls.
  • Exper…
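As a rough illustration of the near-real-time pattern mentioned above, here is a micro-batch loader built on the Snowflake Python connector. The credentials, table, and event source are hypothetical placeholders, and a production design might use Snowpipe or Snowpipe Streaming instead:

```python
# Hedged sketch: micro-batch, near-real-time ingestion into Snowflake.
# All identifiers and credentials below are hypothetical placeholders.
import time

import snowflake.connector

def poll_source() -> list[tuple]:
    # Stand-in for an event source (queue, CDC stream, webhook buffer, ...).
    return [(1, "event_a"), (2, "event_b")]

conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="RAW",
    schema="EVENTS",
)

while True:
    rows = poll_source()
    if rows:
        # Parameterized batch insert; small batches keep end-to-end latency
        # low at the cost of some per-batch overhead.
        conn.cursor().executemany(
            "INSERT INTO raw_events (id, payload) VALUES (%s, %s)", rows
        )
    time.sleep(10)  # micro-batch cadence
```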
Position Requirements
10+ years of work experience