Principal Cloud Data Engineer
Listed on 2025-12-02
IT/Tech
Data Engineer, Data Science Manager - Engineering
Surescripts serves the nation through simpler, trusted health intelligence sharing, in order to increase patient safety, lower costs and ensure quality care. We deliver insights at critical points of care for better decisions - from streamlining prior authorizations to delivering comprehensive medication histories to facilitating messages between providers.
Job Summary
The Principal Cloud Data Engineer plays a key role in designing, building, and maintaining data pipelines and infrastructure using Google Cloud Platform (GCP) BigQuery. The incumbent will collaborate with data analysts, data scientists, and data engineers to ensure timely access to high-quality data for data-driven decision-making across the organization and its data products.
The Principal Cloud Data Engineer is a highly technical individual contributor with hands-on experience building data processing solutions and scalable data pipelines that support analytics and exploratory analysis. The incumbent is a key contributor to advancing data engineering practices within the organization. This role will define the team's data engineering practices, patterns, and standards; set the data engineering roadmap for the data lake and semantic layer; align data architecture and reporting initiatives with business goals; and foster a data-driven culture.
The incumbent will lead by example and mentor a team of Data Engineers, nurturing their skills and ensuring high-quality output, while using an iterative delivery approach grounded in agile routines and methodologies to keep the team focused.
This is a Principal Data Engineering role within Data & Analytics’ Data Core organization working closely with leaders of Data & Analytics. This role is responsible for continually improving the business’s data and analytic solutions, processes, and data engineering capabilities. The incumbent embraces industry best practices and trends and, through acquired knowledge, drives process and system improvement opportunities.
Responsibilities
- Design, develop, and maintain robust data architectures and data pipelines to handle diverse and high-volume datasets efficiently.
- Prioritize scalability, performance optimization, and resilience when designing data pipeline infrastructure to support our growing needs.
- Establish and enforce strong data governance and quality control measures for data engineering to ensure the accuracy, integrity, and usability of our data assets.
- Champion data engineering standards and best practices across the organization, ensuring consistency and maintainability in our data solutions.
- Guide, coach, and develop junior/senior data engineers, fostering a collaborative learning environment and elevating the team's overall capabilities.
- Collaborate closely with data scientists, analysts, and key stakeholders to translate business objectives into actionable data engineering solutions.
- Stay abreast of emerging trends in data engineering. Experiment, evaluate, and integrate new technologies to continuously enhance our data platforms.
- Lead and oversee complex data engineering projects, ensuring timely delivery, high-quality standards, and effective collaboration across cross-functional teams.
Qualifications
- Bachelor's degree or equivalent experience in Computer Science, Mathematics, Information Technology, or a related field.
- 8+ years of solid hands-on experience as a Data Engineer, demonstrating increasing levels of responsibility and technical leadership.
- Strong understanding of data warehousing concepts and data modeling principles.
- Proven experience designing and implementing data pipelines using GCP BigQuery or a comparable cloud platform.
- Strong SQL skills and proficiency in a scripting language such as Python.
- Expert knowledge of data quality tools and techniques.
- Excellent communication and collaboration skills.
- Ability to work independently and, as needed, as part of a team.
- Strong problem-solving and analytical skills.
- Passion for data and a desire to learn and adapt to new technologies.
- Experience with other GCP services such as Cloud Storage, Dataflow, Dataproc, and Pub/Sub, or with similar services on other cloud platforms.
- Experience with cloud deployment and automation…