Data Platform Engineer
Listed on 2026-02-08
IT/Tech
Cloud Computing, Data Engineer
Overview
Schwab Asset Management (SAM) is a leading asset manager supporting mutual funds, ETFs, and managed account products governed by stringent regulatory and compliance requirements. SAM operates in a multi‑cloud, multi‑custodian, multi‑vendor ecosystem, relying on a diverse set of external platforms such as Vestmark, Aladdin, Eagle, and others to serve its investment, operational, and regulatory functions.
This role sits directly within SAM Data Platform Engineering, the team responsible for designing, building, operating, and enhancing the shared platform capabilities that underpin the SAM Data platform. As a junior platform engineer, you will help expand and operate the cloud infrastructure, CI/CD tooling, orchestration layers, and Snowflake resources that support all SAMDA tenants.
Role Summary
The SAM Data Platform Engineering team builds and operates the Schwab Asset Management Data platform (SAMDA) — a unified, multi‑tenant, cloud‑native data platform that supports regulatory, operational, and analytical workloads across SAM. The platform leverages GCP services and Snowflake to deliver scalable data ingestion, transformation, governance, and consumption capabilities.
As a Data Platform Engineer (Level 55), you will contribute to the build‑out, automation, and operation of the SAMDA platform. This junior role is ideal for early‑career engineers who want to work across cloud infrastructure, DevOps tooling, Snowflake platform configuration, and data‑pipeline enablement while learning modern cloud‑native data engineering patterns.
What You Will Do (Responsibilities)
Platform Infrastructure & Environment Engineering
- Support creation and configuration of GCP infrastructure including Cloud Storage, Composer, Cloud Run, and IAM roles.
- Assist in provisioning Snowflake resources (databases, schemas, compute warehouses, RBAC roles, service accounts) aligned with tenant isolation and platform governance models.
- Follow platform patterns for networking and secure connectivity, including Private Service Connect and controlled access paths between Snowflake and GCP.
DevOps, CI/CD & Automation
- Contribute to Git repository setup, branching strategies, and automated CI/CD pipelines covering data pipelines, Snowflake DDL, configuration, and platform components.
- Help build automation templates for tenant‑specific resources, notifications, dashboards, and deployment patterns.
Data Platform Capabilities
- Support ingestion workflows for file‑based sources, vendor Snowflake secure data shares, and internal sources feeding the SAMDA raw/curated data zones.
- Assist in implementing transformations using Snowflake SQL, Python, and Composer as part of the PLT (Push‑Load‑Transform) orchestration model.
- Support standardized data quality checks baked into platform pipelines and tenant workflows.
Observability, Monitoring & Platform Operations
- Configure monitoring and alerting using GCP Operations, Snowflake usage monitoring, and platform dashboards for pipeline health and SLAs.
- Collaborate with production support on incident triage, pipeline monitoring, and environment troubleshooting.
Platform Enablement
- Help onboard new tenant applications, including infra creation, repository setup, Snowflake footprint provisioning, RBAC setup, and pipeline patterns.
- Learn and apply platform governance covering data segmentation, tenancy isolation, information barriers, and security guardrails.
Required Qualifications
- Bachelor’s degree in Computer Science, IT, Engineering, or related field, or equivalent practical experience.
- 1–2 years of experience or relevant project work with cloud platforms (GCP, AWS, Azure).
- Foundational experience with Python, SQL, Git, and CI/CD concepts.
- Exposure to cloud‑native services (GCS, Cloud Run, Composer, Pub/Sub) or Snowflake.
- Strong analytical, troubleshooting, and communication skills; ability to learn quickly and collaborate in a team environment.
Preferred Qualifications
- Exposure to Snowflake or other cloud data warehouses (BigQuery, Redshift).
- Experience with IaC tools such as Terraform or Deployment Manager.
- Understanding of data governance, secure data handling, or enterprise RBAC models.
- Familiarity with observability…