Principal Data Modeler
Location: San Francisco, San Francisco County, California, 94199, USA
Listed on: 2026-01-03
Listing for: salesforce.com, inc.
Full Time
Job specializations:
- IT/Tech: Data Engineer, Data Analyst
Job Description
To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts.
Job Category:
Data
Job Details
Salesforce is the #1 AI CRM, where humans with agents drive customer success together. Here, ambition meets action. Tech meets trust. And innovation isn't a buzzword; it's a way of life. The world of work as we know it is changing, and we're looking for Trailblazers who are passionate about bettering business and the world through AI, driving innovation, and keeping Salesforce's core values at the heart of it all.
Agentforce is the future of AI, and you are the future of Salesforce.
- Design and implement a robust data model that integrates data from core B2B systems, including Snowflake, Salesforce Data 360, multiple Salesforce orgs, Informatica MDM, and Amazon data lakes.
- Design and evolve scalable end-to-end data architecture; define standards for data modeling, ingestion framework, pipelines, data quality, etc.
- Architect tables and views to clearly define and calculate critical metrics (e.g., lead conversion, MQL, marketing driven pipe, ROI).
- Translate business needs for marketing performance measurement, customer segmentation, targeting, and personalization into precise data requirements and model designs. Translate functional and non-functional requirements (e.g., analytical performance, query latency, automation throughput) into optimal logical, conceptual, and physical data model designs.
- Partner with Data Engineering to design data models that leverage advanced Snowflake features (e.g., clustering keys, materialized views, micro-partitions, time travel) to optimize query performance and cost efficiency.
- Master the benefits and trade-offs of modeling on each platform, such as leveraging Snowflake's zero‑copy data sharing vs. federating queries to S3.
- Enforce rigorous data cataloging and metadata standards to ensure all marketing metrics have a single, unambiguous definition across the organization.
- Collaborate with other Data and Application Architects to ensure the data warehouse model aligns with the overall enterprise data strategy and upstream/downstream system architectures.
- Ensure the data model is intuitive and accessible for all Data Scientists, Analysts, Data and BI Engineers who build curated datasets, predictive models, and dashboards to measure and optimize marketing performance.
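The responsibilities above include architecting tables and views that define and calculate metrics such as lead conversion. As a minimal sketch of the underlying pattern, the toy example below joins a fact table of leads to a campaign dimension and computes conversion rate per channel; all table and field names here are hypothetical, not from any Salesforce schema.

```python
# Minimal sketch: a lead-conversion metric over a toy fact/dimension layout.
# All table and field names are hypothetical.

# Dimension: campaign attributes keyed by campaign_id.
dim_campaign = {
    "c1": {"channel": "email"},
    "c2": {"channel": "webinar"},
}

# Fact: one row per lead, carrying its campaign key and a conversion flag.
fact_leads = [
    {"lead_id": 1, "campaign_id": "c1", "converted": True},
    {"lead_id": 2, "campaign_id": "c1", "converted": False},
    {"lead_id": 3, "campaign_id": "c2", "converted": True},
]

def lead_conversion_by_channel(facts, dim):
    """Join facts to the campaign dimension and compute conversion rate per channel."""
    totals, wins = {}, {}
    for row in facts:
        channel = dim[row["campaign_id"]]["channel"]
        totals[channel] = totals.get(channel, 0) + 1
        if row["converted"]:
            wins[channel] = wins.get(channel, 0) + 1
    return {ch: wins.get(ch, 0) / totals[ch] for ch in totals}

print(lead_conversion_by_channel(fact_leads, dim_campaign))
```

In a warehouse this logic would typically live in a curated view over the fact and dimension tables, so every consumer computes the metric from the same single definition.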
- Master's or Ph.D. in Computer Science, Information Systems, or a related quantitative field.
- 10+ years of hands‑on data modeling, data architecture, or database design experience.
- 5+ years of experience designing and implementing large‑scale Enterprise Data Warehouses.
- Expert‑level knowledge of dimensional modeling (Star/Snowflake schemas) and its application to business intelligence, reporting, and machine learning workloads including feature engineering for attribution models, lead scoring, and propensity models.
- Extensive experience with marketing data domains (e.g., campaign management, CRM, web analytics, attribution/marketing mix modeling, propensity modeling, forecasting, and optimization). Demonstrated ability to model complex business processes, including slowly changing dimensions and historical data tracking.
- Proven, hands‑on experience building and optimizing data models on a modern, cloud‑native data warehouse platform, with deep expertise in Snowflake.
- Advanced proficiency with SQL and DDL/DML, especially optimized for the Snowflake ecosystem. Familiarity with ETL tools (e.g., dbt, Fivetran), cloud services (AWS, GCP or Azure), and how to design data models that optimize their performance.
- Expert‑level mastery of all major data modeling methodologies and implementation trade‑offs between them such as 3NF (for applications), Data Vault (for integration layers), and Star/Snowflake schemas (for data science).
- Deep experience modeling Master Data Management golden records and hierarchies, and integrating them with operational and analytical systems (e.g., Informatica MDM).
- Experience implementing Data Mesh principles: domain ownership of data products, "data as a product" mindset with…
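The qualifications above call out slowly changing dimensions and historical data tracking. The sketch below illustrates the core of SCD Type 2 versioning in plain Python, assuming illustrative field names (`start_date`/`end_date` effective-dating, with an open row marked by a null end date); it is a conceptual model, not any specific warehouse implementation.

```python
from datetime import date

# Minimal sketch of Slowly Changing Dimension Type 2 versioning:
# each attribute change closes the current row and opens a new one,
# preserving full history. Field names are illustrative only.

def scd2_upsert(dim_rows, key, attrs, as_of):
    """Close the open row for `key` if its attributes changed, then append a new version."""
    current = next(
        (r for r in dim_rows if r["key"] == key and r["end_date"] is None), None
    )
    if current is not None:
        if current["attrs"] == attrs:
            return dim_rows  # no change: keep the open row as-is
        current["end_date"] = as_of  # close out the old version
    dim_rows.append({"key": key, "attrs": attrs, "start_date": as_of, "end_date": None})
    return dim_rows

rows = []
scd2_upsert(rows, "acct-1", {"segment": "SMB"}, date(2026, 1, 1))
scd2_upsert(rows, "acct-1", {"segment": "Enterprise"}, date(2026, 2, 1))
# rows now holds two versions: the SMB row closed on 2026-02-01,
# and an open Enterprise row starting 2026-02-01.
```

A point-in-time query then selects the row whose `start_date`/`end_date` window covers the date of interest, which is what makes historical reporting against the dimension possible.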