General Information:
Job Title:
Data Modeler
Location:
Toronto (Remote/Hybrid)
Job Type: Contract for 12+ months
Reporting Line: SVP, Architecture
Pay Range: $95–$115 CAD per hour (negotiable)
About Fulfillment IQ (FIQ): Fulfillment IQ is a supply chain engineering and transformation company that helps brands, retailers, and 3PLs design, build, and scale high-performance logistics operations.
We work at the intersection of strategy, operations, and technology, solving complex, real‑world problems across warehouse design, automation, order management, transportation, and end‑to‑end supply chain execution.
Our teams combine deep domain expertise with strong technical capability, delivering outcomes through consulting, systems implementation, and proprietary platforms that accelerate time‑to‑value and reduce delivery risk.
If you enjoy working in complex environments, partnering closely with clients, and seeing your work make a tangible impact on how global commerce moves, this is the place where your skills and judgment truly come to life.
Role Overview: We are seeking an experienced Data Modeler to design and implement the data models powering a multi‑site warehouse intelligence platform on Google Cloud Platform (GCP). The ideal candidate will have a strong background in data modeling, dimensional modeling, and a deep understanding of the supply chain and logistics domains. The role is hands‑on, focused on designing data models that bridge multiple warehouse management systems, data lakehouse architectures, and real‑time operational data stores.
Must Have:
- 6+ years of experience in data modeling roles, including logical, physical, dimensional, and domain modeling
- 3+ years of experience with Snowflake, including data engineering, data modeling, and data warehousing
- Supply chain/logistics/warehousing domain knowledge, including warehouse data: inventory lifecycle, order management, WMS transactions, shipping/receiving, labor tracking
- SQL expertise, including advanced query design, performance tuning, and complex joins across large datasets
- Dimensional modeling (Kimball methodology) for analytics/reporting warehouses
- Experience with modern data lakehouse architecture
- Cloud data platforms: GCP (BigQuery, Cloud Storage, Cloud Spanner) or equivalent AWS/Azure experience with willingness to work in GCP
- CDC/event‑driven data modeling expertise, including designing schemas for change data capture pipelines and streaming data
- Strong understanding of data governance, data quality, and data lineage
- Experience with Blue Yonder (JDA/RedPrairie) WMS data structures and Oracle transactional schemas
- Hands‑on experience with Apache Iceberg table design (partitioning, sort orders, schema evolution, Polaris catalog)
- MDM (Master Data Management) modeling experience, including hierarchical entity governance (org/site/system/region)
- Experience designing JSON/semi‑structured data models and configurable transformation schemas
- Prior work in multi‑tenant or multi‑site data architectures (normalizing data across large‑scale operational deployments with different configurations)
- Familiarity with ML feature engineering and feature store design patterns
- Knowledge of data catalog and metadata management tools
Qualifications:
- Experience with Google Cloud Spanner data modeling (wide‑column/relational hybrid)
- Understanding of GenAI/RAG data requirements (how LLMs consume structured operational data)
- Exposure to Oracle GoldenGate or Fivetran CDC source schemas
- Experience with Erwin, dbt, or similar data modeling/transformation tools
Responsibilities:
- Design the canonical data model that normalizes data from multiple WMS systems (e.g., Blue Yonder, Manhattan) into a unified domain layer
- Model core warehouse entities, including inventory transactions, orders, shipments, ASNs, locations, SKUs, wave management, dock operations, labor events, and returns/RMA
- Define Master Data Management (MDM) data models, including client, site, system, division, region hierarchies with governance rules
- Design table schemas, including partitioning strategies, sort orders, schema evolution plans, and…
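To give a flavor of the canonical-modeling work described above, here is a minimal Python sketch of normalizing records from two WMS feeds into one unified inventory-transaction entity. All field names and source record layouts below are invented for illustration; they are not actual Blue Yonder, Manhattan, or FIQ schemas.

```python
from dataclasses import dataclass

# Canonical inventory-transaction entity in the unified domain layer.
# Fields are illustrative placeholders, not a real client schema.
@dataclass(frozen=True)
class InventoryTransaction:
    site_id: str
    sku: str
    event_type: str    # e.g. "RECEIPT", "PICK", "ADJUSTMENT"
    quantity: int
    source_system: str

def from_blue_yonder(rec: dict) -> InventoryTransaction:
    """Map a hypothetical Blue Yonder transaction row to the canonical model."""
    return InventoryTransaction(
        site_id=rec["wh_id"],
        sku=rec["prtnum"],
        event_type=rec["trntyp"].upper(),
        quantity=int(rec["trnqty"]),
        source_system="blue_yonder",
    )

def from_manhattan(rec: dict) -> InventoryTransaction:
    """Map a hypothetical Manhattan transaction row to the canonical model."""
    return InventoryTransaction(
        site_id=rec["facility"],
        sku=rec["item_id"],
        event_type=rec["txn_type"].upper(),
        quantity=int(rec["qty"]),
        source_system="manhattan",
    )
```

Once every source system lands in the same canonical shape, downstream dimensional models and MDM hierarchies can key off the unified fields regardless of which WMS produced the event.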