Lead Software Engineer, Data Platform
Listed on 2025-12-27
IT/Tech
Data Engineer, Cloud Computing, AI Engineer
To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts.
Job Category:
Software Engineering
Salesforce is the #1 AI CRM, where humans and agents drive customer success together. Here, ambition meets action. Tech meets trust. And innovation isn’t a buzzword - it’s a way of life. The world of work as we know it is changing and we’re looking for Trailblazers who are passionate about bettering business and the world through AI, driving innovation, and keeping Salesforce’s core values at the heart of it all.
Ready to level-up your career at the company leading workforce transformation in the agentic era? You’re in the right place! Agentforce is the future of AI, and you are the future of Salesforce.
We are looking for exceptional Lead Engineers to build the engine that powers Salesforce’s enterprise intelligence. In this role, you will be a hands‑on technical contributor responsible for modernizing our core data ecosystem. You will move beyond simple ETL scripts to build a robust, software‑defined Data Mesh using Snowflake, dbt, Airflow, and Informatica.
You will bridge the gap between “Data Engineering” and “Software Engineering” – treating data pipelines as production code, automating infrastructure with Terraform, and optimizing high‑scale distributed systems to enable AI and analytics across the enterprise.
Key Responsibilities

Core Platform Engineering & Architecture
- Build & Ship: Design and implement scalable data pipelines and transformation logic using Snowflake (SQL) and dbt. Replace legacy hard-coded scripts with modular, testable, and reusable data components.
- Orchestration: Engineer robust workflows in Airflow. Write custom Python operators and ensure DAGs are dynamic, factory-generated, and resilient to failure.
- Performance Tuning: Own the performance of your datasets. Deep-dive into query profiles, optimize pruning/clustering in Snowflake, and reduce credit consumption while improving data freshness.
- Infrastructure as Code: Manage the underlying platform infrastructure (warehouses, roles, storage integration) using Terraform or Helm.
- CI/CD & Quality: Enforce a strict "DataOps" culture. Ensure every pull request has unit tests, schema validation, and automated deployment pipelines.
- Reliability (SRE): Build monitoring and alerting (Monte Carlo, Grafana, New Relic, Splunk) to detect data anomalies before stakeholders do.
- Data Mesh Implementation: Work with domain teams (Sales, Marketing, Finance) to onboard them to the platform, helping them decentralize their data ownership while adhering to platform standards.
- AI Readiness: Prepare structured data for AI consumption, ensuring high-quality, governed datasets are available for LLM agents and advanced analytics models.

System Design & Technical Leadership
- Focus: Proactively identify problems (e.g., "Our ingestion pattern won’t scale 10×") and design the architectural solution. Lead the technical direction for a squad.
- Scope: Own entire subsystems or domain architectures. Serve as the "Tech Lead" for a group of engineers, driving technical consensus, RFCs, and coordinating cross-team dependencies.
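For candidates unfamiliar with the "factory-generated DAGs" pattern named above, here is an illustrative sketch only (not Salesforce code): plain dataclasses stand in for Airflow's DAG and operator objects so the pattern is visible without a running Airflow installation, and all domain names and step names are hypothetical.

```python
# Sketch of factory-generated pipelines: one config entry per domain drives
# one generated DAG, instead of a hand-written file per pipeline.
# Dataclasses below are stand-ins for Airflow's DAG/operator objects.
from dataclasses import dataclass, field

@dataclass
class Task:
    task_id: str
    command: str

@dataclass
class Dag:
    dag_id: str
    schedule: str
    tasks: list = field(default_factory=list)

# Hypothetical per-domain config; in practice this would usually live in
# YAML/JSON and be read at DAG-parse time.
DOMAINS = {
    "sales":   {"schedule": "@hourly"},
    "finance": {"schedule": "@daily"},
}

def build_dag(domain: str, schedule: str) -> Dag:
    """Generate one ingestion DAG per domain: extract -> transform -> validate."""
    dag = Dag(dag_id=f"{domain}_ingest", schedule=schedule)
    for step in ("extract", "transform", "validate"):
        dag.tasks.append(Task(task_id=f"{domain}_{step}", command=f"run {step}"))
    return dag

# Adding a new domain to DOMAINS yields a new pipeline with no new code.
dags = {name: build_dag(name, cfg["schedule"]) for name, cfg in DOMAINS.items()}
```

The point of the pattern is that pipeline structure is defined once in the factory, and per-pipeline variation lives entirely in configuration.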
Qualifications
- Engineering Roots: Strong background in software engineering (Python/Java/Go) applied to data. Comfortable writing custom API integrations and complex Python scripts.
- The Modern Stack: Deep production experience with Snowflake (architecture/tuning) and dbt (Jinja/Macros/Modeling).
- Workflow Orchestration: Advanced proficiency with Airflow (Managed Workflows for Apache Airflow).
- Cloud Native: Hands-on experience with AWS services (S3, Lambda, IAM, ECS) and containerization (Docker/Kubernetes).
- DevOps Mindset: Experience with Git, CI/CD (GitHub Actions/Jenkins), and Terraform.
- Experience Requirements: 8+ years of experience, with a proven track record of leading technical projects or small teams.
- Knowledge Graph Experience: Familiarity with Graph Databases (Neo4j) or Semantic Standards (RDF/SPARQL, TopQuadrant) is a strong plus.
- Open Table…