
Senior Data Engineer

Remote / Online - Candidates ideally in
South San Francisco, San Mateo County, California, 94083, USA
Listing for: Ccrps
Full Time, Remote/Work from Home position
Listed on 2025-12-06
Job specializations:
  • IT/Tech
    Data Engineer
Salary/Wage Range or Industry Benchmark: USD 155,000 - 170,000 per year
Job Description & How to Apply Below

At Veracyte, we offer exciting career opportunities for those interested in joining a pioneering team that is committed to transforming cancer care for patients across the globe. Working at Veracyte enables our employees not only to make a meaningful impact on the lives of patients, but also to learn and grow within a purpose-driven environment. This is what we call the Veracyte way.

It’s about how we work together, guided by our values, to give clinicians the insights they need to help patients make life‑changing decisions.

Our Values:

  • We Seek A Better Way: We innovate boldly, learn from our setbacks, and are resilient in our pursuit to transform cancer care.
  • We Make It Happen: We act with urgency, commit to quality, and bring fun to our hard work.
  • We Are Stronger Together: We collaborate openly, seek to understand, and celebrate our wins.
  • We Care Deeply: We embrace our differences, do the right thing, and encourage each other.
The Position:

The Senior Data Engineer will contribute to Veracyte’s success by designing, developing, and maintaining scalable cloud data infrastructure and pipelines to support the company’s data engineering needs. This role involves hands‑on work with data lakes, meshes, and catalogs, collaborating with cross‑functional teams in a Scrum environment to deliver high‑quality data solutions. The Senior Data Engineer will support the implementation of data management frameworks and align with Veracyte’s global data strategy, policies, and digital transformation initiatives, including the Veracyte Lakehouse built on AWS and Snowflake.

The position is based out of our San Diego office (hybrid); we are also open to US-remote candidates working PST hours.

Key Responsibilities:

  • Design and Develop Data Infrastructure:
    • Build and maintain scalable, efficient data pipelines and infrastructure for Lakehouse systems, including bronze, silver, and gold data layers.
    • Work with technologies such as Amazon S3, Snowflake, AWS Glue, Lake Formation, and SageMaker for data storage, processing, and analytics.
  • Collaborate Across Teams:
    • Partner with the Technical Program Manager (TPM), data scientists, and stakeholders to understand business requirements and translate them into technical data solutions.
    • Participate in Scrum processes, including backlog grooming, sprint planning, and handling dataset requests via Jira.
  • Optimize and Secure Data:
    • Optimize data retrieval, processing, and ELT workflows for improved performance and reliability.
    • Implement data security measures, governance policies, and compliance with PHI, consent, and regulatory requirements.
  • Support Data Management Initiatives:
    • Assist in identifying and assessing internal and external data sources for the data catalog.
    • Contribute to the evaluation, development, or integration of user-friendly data catalog applications aligned with best practices.
    • Help provide training and support to users of the data catalog.
  • Contribute to Data Strategy:
    • Provide technical input to support the development and implementation of Veracyte’s data strategy and policies.
    • Collaborate on defining user stories, data quality levels (e.g., Medallion architecture), and access controls for datasets.
    • Support data acquisition, curation, and delivery for use cases like AI model training, clinical decision support, and operational efficiency.
  • Mentorship and Knowledge Sharing:
    • Mentor junior data engineers and foster a culture of continuous learning.
    • Share expertise in data engineering best practices, emerging technologies, and tools like Apache Parquet, Iceberg, and Zero-ETL integrations.
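For candidates unfamiliar with the bronze/silver/gold (Medallion) layering referenced above, here is a minimal, engine-agnostic sketch of the idea. The record fields, cleaning rules, and test names are hypothetical; in practice these layers would live in Snowflake or AWS Glue rather than plain Python:

```python
# Toy Medallion flow: bronze (raw) -> silver (cleaned) -> gold (aggregated).
# All field names and rules here are illustrative assumptions, not Veracyte's schema.

def to_silver(bronze_rows):
    """Clean raw (bronze) records: drop rows missing a patient_id and
    normalize the test_name field."""
    silver = []
    for row in bronze_rows:
        if not row.get("patient_id"):
            continue  # reject incomplete records at the bronze -> silver boundary
        silver.append({
            "patient_id": row["patient_id"],
            "test_name": row["test_name"].strip().lower(),
            "result": row["result"],
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned (silver) records into an analytics-ready (gold)
    summary: result counts per test."""
    gold = {}
    for row in silver_rows:
        key = (row["test_name"], row["result"])
        gold[key] = gold.get(key, 0) + 1
    return gold

bronze = [
    {"patient_id": "P1", "test_name": " GeneTestA ", "result": "benign"},
    {"patient_id": None, "test_name": "GeneTestA", "result": "benign"},  # dropped
    {"patient_id": "P2", "test_name": "genetesta", "result": "suspicious"},
]
silver = to_silver(bronze)
gold = to_gold(silver)
print(len(silver), gold)
# -> 2 {('genetesta', 'benign'): 1, ('genetesta', 'suspicious'): 1}
```

Each layer raises the data-quality bar: bronze preserves raw ingests for auditability, silver enforces validation and normalization, and gold serves curated aggregates to analytics and AI workloads.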
Who You Are:

  • Education: Bachelor’s or Master’s degree in Engineering, Computer Science, or a related field.
  • Experience:
    • 6+ years of experience (BS) or 3+ years (MS) in data engineering or a similar role.
    • Hands-on experience designing and deploying data pipelines in cloud environments, preferably AWS and/or GCP.
  • Technical Skills:
    • Proficiency in programming languages such as Python, Java, or Scala.
    • Experience with AWS services (S3, Glue, Lake Formation, SageMaker) and Snowflake for data warehousing, ELT processes, and data modeling.
    • Familiarity with data cataloging tools, data lakes, and governance best practices.
    • Knowled…