Sr. Data Engineer
Listed on 2025-12-02
IT/Tech
Data Engineer
Join PatientPoint to be part of a dynamic team committed to empowering better health. As a leading digital health company, we innovate to positively impact patient behaviors. Our purpose-driven approach offers an inspirational career opportunity where you can contribute to improving health outcomes for millions of patients nationwide.
It is an exciting time to be a member of PatientPoint’s Data team! Join our expanding Data team in a dynamic and rapidly evolving domain where collaboration isn’t just a buzzword; it’s our way of life! Working closely with stakeholders and customers, we've built a self-service data platform that delivers critical insights for informed decision-making across our digital health engagement solutions.
On the Data team, we thrive on the collective synergy of diverse minds coming together to tackle challenges. We foster an environment of openness that nurtures individual growth, where every voice is valued, ideas are shared freely, and teamwork reigns supreme. Together, we're not just building data solutions; we're forging bonds, fostering growth, and achieving remarkable outcomes.
Hybrid Schedule: 3 days in office
Travel Requirements: 3 to 5 times per year
Job Summary
Step into a culture where data is embraced as an asset and everyone, from top executives to frontline team members, understands its importance in achieving strategic objectives. As a Senior Data Analytics Engineer on our hybrid agile scrum team, your responsibilities will span the full lifecycle: design, analysis, build, orchestration and automation, monitoring, performance optimization, data quality, solution accuracy, and compliance.
What You’ll Do
- Responsible for architecting end-to-end data solutions that meet our business partners' expectations and integrate into our Data Platform.
- Engage with the Data team using scrum framework to comprehensively grasp requirements for each deliverable.
- Accountable for following the Data team’s best practices in solution design, build, orchestration and automation, data ingestion, monitoring, performance optimization, data quality, solution accuracy, and compliance throughout the lifecycle.
- Leverage a modern tool stack including Snowflake, Atlan, Fivetran, Docker, AWS, and Astronomer (Airflow) to cultivate an environment where analysts and data engineers can autonomously enact changes in an automated, thoroughly tested and high-quality manner.
- Lead design, development, prototyping, operations and implementation of data solutions and pipelines.
- Analyze the impact of changes to downstream systems/products and recommend alternatives to minimize the impact.
- Build deep knowledge of each PatientPoint data domain.
- Responsible for testing and release process for data pipelines using best practices for frequent releases.
- Participate in code reviews.
- Mentor and support Data team members new to integrating the modern stack.
- Partner to deliver a data model that aligns with DevOps principles, ensuring standards for continuous integration/continuous delivery (CI/CD) processes.
- Drive Results by motivating self and others to exceed goals and achieve breakthrough results while exhibiting persistence to remove barriers to achieving results.
- Develop and maintain data documentation, including data dictionaries, data lineage, data flow diagrams, best practices, and data recovery processes, to provide clear visibility into the data ecosystem.
We Need
- 5+ years of experience working on cloud data warehouses and data pipelines, with a focus on data engineering and on building scalable, sustainable, and secure data platforms that power intelligent applications.
- Bachelor's degree in Informatics, Business Technology, Analytics, Computer Science, or a related field.
- 2+ years of hands-on experience working with dbt.
- 3+ years of hands-on experience with Snowflake.
- Advanced experience in a Data Engineering and ELT Engineering role.
- Expert-level proficiency in SQL query and stored procedure development.
- Competent with Python, Airflow, GitHub, and DAG construction.
- Experience with unstructured datasets and ability to handle Parquet, JSON, AVRO, and XML file formats.
- Strong…