
Senior Data Engineer

Job in Pleasanton, Alameda County, California, 94566, USA
Listing for: SnapCode Inc
Full Time position
Listed on 2025-12-01
Job specializations:
  • IT/Tech
    Data Engineer
Job Description


Title: Senior Data Engineer
Location: Pleasanton, California (hybrid work)

Role Overview

As a Senior/Lead Data Engineer, you will lead the design, development, and ownership of core data infrastructure from pipelines to storage to data products. You'll be a strategic partner across teams, ensuring that our data systems are robust, scalable, and optimized for performance. With executive visibility and deep cross-functional collaboration, the solutions you build will directly influence product strategy and operational excellence.

This is a unique opportunity to build from the ground up while working with cutting‑edge technologies such as Postgres, dbt, Snowflake, and modern orchestration frameworks.

Key Responsibilities
  • Architect, design, and implement scalable ELT pipelines using Snowflake, dbt, and Postgres.
  • Optimize data models in both Snowflake (cloud DW) and Postgres (transactional/operational data).
  • Implement advanced Snowflake features (Snowpipe, Streams, Tasks, Dynamic Tables, RBAC, Security).
  • Design and maintain hybrid pipelines (Postgres‑Snowflake) for seamless data integration.
  • Establish data quality and testing frameworks using dbt tests and metadata‑driven validation.
  • Implement CI/CD workflows (Git, GitHub Actions, or similar) for dbt/Snowflake/Postgres projects.
  • Drive observability, monitoring, and performance tuning of pipelines (logs, lineage, metrics).
  • Provide technical leadership and mentorship to engineers and analysts.
  • Collaborate with Finance, Product, Marketing, and GTM teams to deliver trusted, business‑critical data models.
  • Support financial data processes (consolidation, reconciliation, close automation).
  • Evaluate and experiment with emerging AI and data technologies, providing feedback to influence product direction.
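To make the "data quality and testing frameworks" and "metadata-driven validation" responsibilities above concrete, here is a minimal Python sketch of one way such a framework can look: declarative per-column rules (in the spirit of dbt's `not_null`, `unique`, and `accepted_values` tests) checked by a small runner. All names and rules are illustrative assumptions, not part of SnapCode's actual stack.

```python
from typing import Any

# Hypothetical declarative rules per column, mirroring dbt-style schema tests.
SCHEMA_RULES: dict[str, dict[str, Any]] = {
    "order_id": {"not_null": True, "unique": True},
    "status": {"not_null": True, "accepted_values": {"open", "shipped", "closed"}},
}

def validate(rows: list[dict[str, Any]], rules: dict[str, dict[str, Any]]) -> list[str]:
    """Return human-readable violations; an empty list means the batch passed."""
    errors: list[str] = []
    # Track previously seen values for columns declared unique.
    seen: dict[str, set] = {col: set() for col, r in rules.items() if r.get("unique")}
    for i, row in enumerate(rows):
        for col, r in rules.items():
            val = row.get(col)
            if r.get("not_null") and val is None:
                errors.append(f"row {i}: {col} is null")
                continue  # skip the remaining checks for this column
            if "accepted_values" in r and val not in r["accepted_values"]:
                errors.append(f"row {i}: {col}={val!r} not in accepted values")
            if r.get("unique"):
                if val in seen[col]:
                    errors.append(f"row {i}: duplicate {col}={val!r}")
                seen[col].add(val)
    return errors
```

In a real pipeline these rules would typically live in dbt YAML or a metadata store rather than Python constants; the point is that the validation logic is driven by metadata, not hand-written per table.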
Requirements
  • Experience:
    8+ years in Data Engineering, including 3+ years with Snowflake and dbt.
  • Database Expertise:
    • Deep hands‑on experience with dbt (Core/Cloud) macros, testing, documentation, and packages.
    • Strong expertise in Postgres (schema design, optimization, stored procedures, large‑scale workloads).
    • Advanced knowledge of Snowflake (data modeling, performance tuning, governance).
  • Programming:
    Proficient in SQL and Python, including API integrations and automation.
  • Orchestration & ETL:
    Hands‑on with Airflow, Dagster, Prefect (or similar), and ETL/ELT tools such as Fivetran and NiFi.
  • Data Architecture:
    Strong understanding of data warehousing, dimensional modeling, medallion architecture, and system design principles.
  • Cloud:
    Experience with AWS (mandatory); GCP or Azure is a plus.
  • DevOps:
    Experience with Git/GitOps CI/CD pipelines for data workflows.
  • Leadership:
    Proven ability to mentor teams, collaborate cross‑functionally, and deliver impact in fast‑paced environments.
  • Communication:
    Excellent written and verbal communication skills.