Senior Data Engineer
Listed on 2025-12-01
Company: Snap Code Inc
Title: Senior Data Engineer
Location: Pleasanton, California (hybrid work)
As a Senior/Lead Data Engineer, you will lead the design, development, and ongoing ownership of our core data infrastructure, from pipelines to storage to data products. You'll be a strategic partner across teams, ensuring that our data systems are robust, scalable, and optimized for performance. Your work will have executive visibility and deep cross-functional reach, and the solutions you build will directly influence product strategy and operational excellence.
This is a unique opportunity to build from the ground up while working with cutting‑edge technologies such as Postgres, dbt, Snowflake, and modern orchestration frameworks.
Responsibilities:
- Architect, design, and implement scalable ELT pipelines using Snowflake, dbt, and Postgres.
- Optimize data models in both Snowflake (cloud DW) and Postgres (transactional/operational data).
- Implement advanced Snowflake features (Snowpipe, Streams, Tasks, Dynamic Tables, RBAC, security); a brief sketch follows this list.
- Design and maintain hybrid pipelines (Postgres‑Snowflake) for seamless data integration.
- Establish data quality and testing frameworks using dbt tests and metadata‑driven validation.
- Implement CI/CD workflows (Git, GitHub Actions, or similar) for dbt/Snowflake/Postgres projects.
- Drive observability, monitoring, and performance tuning of pipelines (logs, lineage, metrics).
- Provide technical leadership and mentorship to engineers and analysts.
- Collaborate with Finance, Product, Marketing, and GTM teams to deliver trusted, business‑critical data models.
- Support financial data processes (consolidation, reconciliation, close automation).
- Evaluate and experiment with emerging AI and data technologies, providing feedback to influence product direction.
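To give a flavor of the Snowflake work described above, here is a minimal sketch of the Streams/Tasks/Dynamic Tables pattern. Every object name (raw.orders, staging.orders, transform_wh, analytics.daily_revenue) is hypothetical, and the snippet is an outline of the technique, not this team's actual pipeline.

```sql
-- Hypothetical objects throughout (raw.orders, staging.orders, transform_wh).
-- A stream captures row-level changes on the source table.
CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders;

-- A task drains the stream on a schedule, but only when changes exist.
CREATE OR REPLACE TASK raw.load_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
AS
INSERT INTO staging.orders (order_id, customer_id, amount, updated_at)
SELECT order_id, customer_id, amount, updated_at
FROM raw.orders_stream
WHERE METADATA$ACTION = 'INSERT';

-- Tasks are created suspended; resume to activate.
ALTER TASK raw.load_orders_task RESUME;

-- A dynamic table declares a freshness target instead of a hand-rolled schedule.
CREATE OR REPLACE DYNAMIC TABLE analytics.daily_revenue
  TARGET_LAG = '30 minutes'
  WAREHOUSE = transform_wh
AS
SELECT CAST(updated_at AS DATE) AS order_date,
       SUM(amount)              AS revenue
FROM staging.orders
GROUP BY 1;
```

A stream plus a scheduled task gives explicit control over incremental loads, while a dynamic table trades that control for a declarative freshness target (TARGET_LAG).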
Requirements:
- Experience: 8+ years in Data Engineering, including 3+ years with Snowflake and dbt.
- Database Expertise:
  - Deep hands‑on experience with dbt (Core/Cloud): macros, testing, documentation, and packages (a minimal test sketch follows this list).
  - Strong expertise in Postgres (schema design, optimization, stored procedures, large‑scale workloads).
  - Advanced knowledge of Snowflake (data modeling, performance tuning, governance).
- Programming: Proficient in SQL and Python, including API integrations and automation.
- Orchestration & ETL: Hands‑on with Airflow, Dagster, Prefect (or similar), and ETL/ELT tools such as Fivetran and NiFi.
- Data Architecture: Strong understanding of data warehousing, dimensional modeling, medallion architecture, and system design principles.
- Cloud: Experience with AWS (mandatory); GCP or Azure is a plus.
- DevOps: Experience with Git/GitOps CI/CD pipelines for data workflows.
- Leadership: Proven ability to mentor teams, collaborate cross‑functionally, and deliver impact in fast‑paced environments.
- Communication: Excellent written and verbal communication skills.
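As a concrete example of the dbt testing mentioned above, here is a minimal sketch of a singular test. The model and column names (stg_orders, order_id, amount) are hypothetical; dbt executes the query and fails the test if it returns any rows.

```sql
-- tests/assert_no_negative_order_amounts.sql
-- Hypothetical model and columns; any returned rows fail the test.
SELECT
    order_id,
    amount
FROM {{ ref('stg_orders') }}
WHERE amount < 0
```

Generic tests such as unique and not_null are instead declared in a model's YAML properties file; singular tests like the one above cover bespoke business rules.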