Senior Cloud DataOps Engineer
Listed on 2025-10-16
IT/Tech
Data Engineer, Cloud Computing, Data Security, Data Analyst
Surescripts serves the nation through simpler, trusted health intelligence sharing, in order to increase patient safety, lower costs and ensure quality care. We deliver insights at critical points of care for better decisions - from streamlining prior authorizations to delivering comprehensive medication histories to facilitating messages between providers.
Job Summary
The Senior Cloud DataOps Engineer plays a pivotal role in designing, building, and maintaining scalable, reliable cloud infrastructure on the Google Cloud Platform (GCP) to support data and analytics workflows, data products, and data pipelines. The Senior Cloud DataOps Engineer ensures seamless data flow, optimizes performance, configures cloud infrastructure, and drives data-driven decision-making within our organization. The incumbent is required to work within cloud environments with complex data interdependencies, visibility constraints, privacy protections, and security protocols.
This is a senior DataOps engineering role within the Data & Analytics department's Data Core organization, working closely with leaders of the Data & Analytics and Network Technology & Operations departments. The incumbent will build close partnerships with system operations and DevOps teams, data engineers, data scientists, and analysts to build and configure cloud infrastructure that supports scalable data processing, analytics, and machine learning workloads.
The incumbent will lead DataOps process-improvement efforts to enhance team effectiveness in leveraging advanced cloud computing and storage techniques in our Google Cloud environment.
Data Transformations and Data Fusion: Support cloud components for data processing from various sources (e.g., operational data pipelines for ETL/ELT using Python, SQL, and other relevant cloud technologies) by ensuring that monitoring, logging, alerting, and automation components are configured and available for data engineering teams.
Data Cleansing, De-Identification, and Data Masking: Use appropriate cloud infrastructure components and functions to support data accuracy and consistency by enabling and configuring tools and processes that protect sensitive information, removing or obfuscating personal identifiers and securing data by replacing sensitive values with cryptographic hashes.
Containerization: Leverage containers and containerization technologies such as Kubernetes, GKE, Dataproc, or similar cloud compute orchestration.
Cloud Processing Composition: Set up and support Cloud Composer, Apache Airflow, or similar cloud/cluster processing task orchestration.
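To illustrate the de-identification technique named above (replacing sensitive identifiers with cryptographic hashes), here is a minimal, stdlib-only Python sketch. The key handling, function name, and field names are illustrative assumptions, not Surescripts specifics; in practice the key would come from a secret manager rather than source code.

```python
import hashlib
import hmac

# Hypothetical secret key for illustration only; a real deployment would
# load this from a managed secret store, never hard-code it.
SECRET_KEY = b"replace-with-managed-secret"

def pseudonymize(value: str) -> str:
    """Replace a sensitive identifier with a keyed cryptographic hash.

    HMAC-SHA256 yields the same token for the same input (so joins across
    datasets still work) while making the original value unrecoverable
    without the key.
    """
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical record: mask the identifier, keep non-sensitive fields.
record = {"patient_id": "A123456", "state": "MN"}
masked = {**record, "patient_id": pseudonymize(record["patient_id"])}
```

A keyed hash (HMAC) is used rather than a bare hash so that identifiers with small value spaces cannot be reversed by brute-force guessing without the key.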
Infrastructure Management: Provision, configure, and manage GCP resources using infrastructure as code (Terraform), e.g., resource provisioning, virtual machines, Virtual Private Cloud, Cloud Storage, Cloud SQL, automations, monitoring, logging, and security components (IAM, firewall, encryption) that support data processing and analytics workloads and solutions.
DevOps Practices: Implement and promote DevOps principles, including CI/CD pipelines, infrastructure as code (IaC), and monitoring and alerting, in partnership with the Network Technology & Operations department's System Operations function.
Data Quality and Governance: Implement, configure, and integrate data governance tools (e.g., Informatica) and data quality observability tools (e.g., Bigeye, Google Dataplex data quality).
Performance Optimization: Analyze and optimize data pipeline infrastructure components and analytics application performance, identifying bottlenecks and implementing strategies to improve efficiency.
Troubleshooting and Support: Diagnose and resolve infrastructure issues promptly, providing technical support to data engineers, data scientists, and BI analysts.
Continuous Learning: Stay abreast of industry trends, emerging technologies, and best practices in cloud data operations.
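As a rough illustration of the kind of data quality check the responsibilities above describe (the sort of rule a tool like Bigeye or Dataplex would evaluate), here is a minimal Python sketch. The column name, threshold, and sample data are hypothetical, not drawn from this posting.

```python
def null_rate(rows: list[dict], column: str) -> float:
    """Fraction of rows in which `column` is absent or None."""
    if not rows:
        return 0.0
    missing = sum(1 for row in rows if row.get(column) is None)
    return missing / len(rows)

def passes_quality_gate(rows: list[dict], column: str,
                        max_null_rate: float = 0.01) -> bool:
    """True if the column's null rate is within the allowed threshold.

    A real pipeline would raise an alert on failure; this sketch is
    deliberately side-effect free.
    """
    return null_rate(rows, column) <= max_null_rate

# Hypothetical sample: one of four records is missing its identifier.
sample = [{"rx_id": "1"}, {"rx_id": "2"}, {"rx_id": None}, {"rx_id": "4"}]
```

In an orchestrated pipeline, a gate like this would typically run as a task between ingestion and publication, blocking downstream steps when it fails.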
Understand all applicable data privacy and security laws, rules, regulations, and contractual restrictions, and follow all Surescripts data governance and data usage rights policies and procedures.
Basic Requirements:
Bachelor's degree in Computer Science, Data Science, or other related field; or…