Senior Cloud Data Engineer
Remote / Online - Candidates ideally in Minneapolis, Hennepin County, Minnesota, 55400, USA
Listing for: Surescripts
Full Time, Remote/Work from Home position
Listed on 2025-10-16
Job specializations:
- IT/Tech: Data Engineer, Data Analyst, Big Data, Data Science Manager
Job Description
Locations: Home Office (Minnesota); Raleigh, North Carolina; Beaverton, Oregon; Arlington, Virginia
Time type: Full time
Posted on: Posted Yesterday
Job requisition: REQ 2888
Surescripts serves the nation through simpler, trusted health intelligence sharing, in order to increase patient safety, lower costs and ensure quality care. We deliver insights at critical points of care for better decisions: from streamlining prior authorizations to delivering comprehensive medication histories to facilitating messages between providers.
Job Summary:
The Senior Cloud Data Engineer plays a key role in designing, building, and maintaining data pipelines and infrastructure using Google Cloud Platform (GCP) BigQuery. The incumbent will collaborate with data analysts, data scientists, and other engineers to ensure timely access to high-quality data for data-driven decision-making across the organization.
The Senior Cloud Data Engineer is a highly technical engineer who has mastered hands-on coding of data processing solutions and scalable data pipelines that support analytics and exploratory analysis. This role ensures new business requirements are decomposed and implemented in cohesive end-to-end designs that preserve data integrity and quality, and best support the BI and analytic capability needs that power decision-making at Surescripts.
This includes building data acquisition programs that handle the business’s growing data volume as part of the data lake in the GCP BigQuery ecosystem, and maintaining a robust data catalog.
This is a senior data engineering role within Data & Analytics’ Data Core organization, working closely with Data & Analytics leaders. The incumbent will continually improve the business’s data and analytic solutions, processes, and data engineering capabilities, embracing industry best practices and trends and, through acquired knowledge, driving process and system improvement opportunities.
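As a rough illustration of the acquisition-and-load work described above, the sketch below shows one common pattern: batch-loading newline-delimited JSON from Cloud Storage into a BigQuery data lake table with the google-cloud-bigquery Python client. The project, bucket, dataset, and table names are hypothetical placeholders, not actual Surescripts resources.

```python
# Minimal sketch: batch-load newline-delimited JSON from Cloud Storage
# into a BigQuery table. All resource names here are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # infer the schema from the incoming data
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/medication_history/*.json",  # hypothetical path
    "example-project.data_lake.medication_history",       # hypothetical table
    job_config=job_config,
)
load_job.result()  # block until the load job completes

table = client.get_table("example-project.data_lake.medication_history")
print(f"Loaded table now has {table.num_rows} rows.")
```

In practice a load like this would be scheduled and monitored rather than run ad hoc, which is where the orchestration responsibilities below come in.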
Responsibilities:
* Design, develop, and implement data pipelines using GCP BigQuery, Dataflow, and Airflow for data ingestion, transformation, and loading (see the Airflow sketch after this list).
* Optimize data pipelines for performance, scalability, and cost-efficiency.
* Ensure data quality through data cleansing, validation, and monitoring processes.
* Develop and maintain data models and schemas in BigQuery to support various data analysis needs.
* Automate data pipeline tasks using scripting languages like Python and tools like Dataflow.
* Collaborate with data analysts and data scientists to understand data requirements and translate them into technical data solutions.
* Leverage DevOps infrastructure-as-code tooling (Terraform) to ensure seamless integration of data pipelines with CI/CD workflows.
* Monitor and troubleshoot data pipelines and infrastructure to identify and resolve issues.
* Stay up to date with the latest advancements in GCP BigQuery and other related technologies.
* Document data pipelines and technical processes for future reference and knowledge sharing.
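To make the first bullet and the quality-monitoring duties concrete, here is a minimal Airflow sketch, assuming the Google provider package (apache-airflow-providers-google) and Airflow 2.4+. The DAG ID, bucket, dataset, and quality-check SQL are hypothetical; a production pipeline would add retries, alerting, and richer validation.

```python
# Hypothetical Airflow DAG: load raw files into BigQuery, then run a
# simple data-quality gate. All names and SQL are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryCheckOperator,
)

with DAG(
    dag_id="example_daily_ingest",  # hypothetical DAG ID
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="example-bucket",  # hypothetical bucket
        source_objects=["raw/events/{{ ds }}/*.json"],
        destination_project_dataset_table="data_lake.events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )

    # Fail the run if today's partition loaded no rows (basic quality gate);
    # assumes an ingestion-time-partitioned destination table.
    check_rows = BigQueryCheckOperator(
        task_id="check_rows",
        sql="SELECT COUNT(*) > 0 FROM data_lake.events "
            "WHERE DATE(_PARTITIONTIME) = '{{ ds }}'",
        use_legacy_sql=False,
    )

    load_raw >> check_rows
```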
Qualifications:

Basic Requirements:
* Bachelor’s degree or equivalent experience in Computer Science, Mathematics, Information Technology, or a related field.
* 5+ years of solid experience as a data engineer.
* Strong understanding of data warehousing / data lake concepts and data modeling principles.
* Proven experience designing and implementing data pipelines using GCP BigQuery, Dataflow, and Airflow.
* Strong SQL skills and proficiency in scripting languages such as Python.
* Experience with data quality tools and techniques.
* Ability to work independently and as part of a team.
* Strong problem-solving and analytical skills.
* Passion for data and a desire to learn and adapt to new technologies.
* Experience with other GCP services such as Cloud Storage, Dataflow, and Pub/Sub.
* Experience with cloud deployment and automation tools like Terraform.
* Experience with data visualization tools such as Tableau, Power BI, or Looker.
* Experience with healthcare data.
* Familiarity with machine learning, artificial intelligence, and data science concepts.
* Experience with data governance and healthcare PHI…
Position Requirements: 10+ years of work experience.