
Senior Software Engineer, Data Platform

Job in Seattle, King County, Washington, 98127, USA
Listing for: salesforce.com, inc.
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, AI Engineer
Job Description
About Salesforce

Salesforce is the #1 AI CRM, where humans with agents drive customer success together. Here, ambition meets action. Tech meets trust. And innovation isn't a buzzword - it's a way of life. The world of work as we know it is changing, and we're looking for Trailblazers who are passionate about bettering business and the world through AI, driving innovation, and keeping Salesforce's core values at the heart of it all.

Ready to level up your career at the company leading workforce transformation in the agentic era? You're in the right place! Agentforce is the future of AI, and you are the future of Salesforce.

We are looking for exceptional Senior Engineers to build the engine that powers Salesforce's enterprise intelligence. In this role, you will be a hands-on technical contributor responsible for modernizing our core data ecosystem. You will move beyond simple ETL scripts to build a robust, software-defined Data Mesh using Snowflake, dbt, Airflow, and Informatica. You will bridge the gap between "Data Engineering" and "Software Engineering" - treating data pipelines as production code, automating infrastructure with Terraform, and optimizing high-scale distributed systems to enable AI and Analytics across the enterprise.

Key Responsibilities

Core Platform Engineering & Architecture

• Build & Ship:
Design and implement scalable data pipelines and transformation logic using Snowflake (SQL) and dbt. Replace legacy hardcoded scripts with modular, testable, and reusable data components.

• Orchestration:
Engineer robust workflows in Airflow. Write custom Python operators and ensure DAGs are dynamic, factory-generated, and resilient to failure (a sketch of this pattern follows this list).

• Performance Tuning:
Own the performance of your datasets. Deep dive into query profiles, optimize pruning/clustering in Snowflake (see the second sketch below), and reduce credit consumption while improving data freshness.
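
As a concrete illustration of the factory-generated DAG pattern named above, here is a minimal Python sketch, assuming a recent Airflow 2.x release; the operator, the DOMAINS config, and the table names are hypothetical stand-ins, not anything from Salesforce's actual platform.

# Minimal sketch of dynamic, factory-generated Airflow DAGs with a custom
# operator. All names are hypothetical illustrations.
from datetime import datetime

from airflow import DAG
from airflow.models.baseoperator import BaseOperator


class SnowflakeRefreshOperator(BaseOperator):
    """Hypothetical operator that refreshes one domain's fact table."""

    def __init__(self, table: str, **kwargs):
        super().__init__(**kwargs)
        self.table = table

    def execute(self, context):
        # A real operator would call a Snowflake hook here; this just logs.
        self.log.info("Refreshing %s", self.table)


# Factory config: one DAG per domain instead of N copy-pasted scripts.
DOMAINS = {"sales": "0 2 * * *", "marketing": "0 3 * * *"}

for domain, cron in DOMAINS.items():
    with DAG(
        dag_id=f"refresh_{domain}",
        start_date=datetime(2026, 1, 1),
        schedule=cron,
        catchup=False,
    ) as dag:
        SnowflakeRefreshOperator(
            task_id=f"refresh_{domain}_facts", table=f"{domain}.facts"
        )
    # Expose each generated DAG at module level so the scheduler discovers it.
    globals()[f"refresh_{domain}"] = dag

And for the pruning/clustering work, a second sketch of the kind of statements involved, assuming the snowflake-connector-python client; the account, table, and clustering columns are placeholders for illustration.

# Sketch of improving and inspecting Snowflake clustering for pruning.
# Assumes snowflake-connector-python; all identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***"
)
cur = conn.cursor()

# Cluster the fact table on the columns most queries filter by, so
# Snowflake can prune micro-partitions instead of scanning them.
cur.execute("ALTER TABLE sales.orders CLUSTER BY (order_date, region)")

# Report how well-clustered the table currently is for those columns.
cur.execute(
    "SELECT SYSTEM$CLUSTERING_INFORMATION('sales.orders', '(order_date, region)')"
)
print(cur.fetchone()[0])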

DevOps, Reliability & Standards

• Infrastructure as Code:
Manage the underlying platform infrastructure (warehouses, roles, storage integration) using Terraform or Helm. Click-ops is not an option.

• CI/CD & Quality:
Enforce a strict "DataOps" culture. Ensure every PR has unit tests, schema validation, and automated deployment pipelines (illustrated in the sketch after this list).

• Reliability (SRE):
Build monitoring and alerting (Monte Carlo, Grafana, New Relic, Splunk) to detect data anomalies before stakeholders do.
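
To make the testing and monitoring bullets above concrete, here is a minimal pytest-style sketch of per-PR data checks plus the kind of freshness probe an alerting job might run; transform_orders, EXPECTED_SCHEMA, run_query, and the table names are hypothetical stand-ins.

# Minimal sketch of per-PR data tests and a freshness probe (run with pytest).
# All function and table names are hypothetical illustrations.
from datetime import datetime, timedelta, timezone

EXPECTED_SCHEMA = {"order_id": int, "amount_usd": float, "region": str}


def transform_orders(rows):
    """Toy transformation under test: drop rows with non-positive amounts."""
    return [r for r in rows if r["amount_usd"] > 0]


def test_transform_drops_invalid_rows():
    rows = [
        {"order_id": 1, "amount_usd": 10.0, "region": "AMER"},
        {"order_id": 2, "amount_usd": -5.0, "region": "EMEA"},
    ]
    assert transform_orders(rows) == rows[:1]


def test_output_matches_schema():
    out = transform_orders([{"order_id": 1, "amount_usd": 10.0, "region": "AMER"}])
    for row in out:
        for column, expected_type in EXPECTED_SCHEMA.items():
            assert isinstance(row[column], expected_type)


def is_fresh(run_query, table: str, max_lag: timedelta) -> bool:
    """Freshness probe: the newest row must be within max_lag of now.

    run_query is a hypothetical helper returning a tz-aware datetime.
    """
    latest = run_query(f"SELECT MAX(loaded_at) FROM {table}")
    return datetime.now(timezone.utc) - latest <= max_lag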

Collaboration & Modernization

• Data Mesh Implementation:
Work with domain teams (Sales, Marketing, Finance) to onboard them to the platform, helping them decentralize their data ownership while adhering to platform standards.

• AI Readiness:
Prepare structured data for AI consumption, ensuring high-quality, governed datasets are available for LLM agents and advanced analytics models.

• Focus:
Execution & Component Ownership. You are given a problem (e.g., "Migrate this domain to dbt," "Optimize this slow pipeline") and you solve it with clean, high-quality code under minimal supervision.

• Scope:
You own features and specific pipelines. You mentor junior engineers on code reviews and best practices.

What We're Looking For

Core Qualifications

• Engineering Roots:
Strong background in software engineering (Python/Java/Go) applied to data. You are comfortable writing custom API integrations and complex Python scripts.

• The Modern Stack:
Deep production experience with Snowflake (architecture/tuning) and dbt (Jinja/Macros/Modeling).

• Workflow Orchestration:
Advanced proficiency with Apache Airflow, including managed deployments such as Amazon Managed Workflows for Apache Airflow (MWAA).

• Cloud Native:
Hands-on experience with AWS services (S3, Lambda, IAM, ECS) and containerization (Docker/Kubernetes).

• DevOps Mindset:
Experience with Git, CI/CD (GitHub Actions/Jenkins), and Terraform.

Experience Requirements

• 5+ years of relevant data or software engineering experience.

Nice to Have

• Knowledge Graph Experience:
Familiarity with Graph Databases (Neo4j) or Semantic Standards (RDF/SPARQL, TopQuadrant) is a strong plus as we integrate these technologies into the platform.

• Open Table Formats:
Experience with Apache Iceberg or Delta Lake.

• Streaming:
Experience with Kafka or Snowpipe Streaming.

• AI Integration:
Experience using AI coding assistants (Copilot,…