
Senior Data Solution Architect

Job in Toronto, Ontario, Canada
Listing for: Fulfillment IQ
Full Time position
Listed on 2026-03-10
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing
Salary/Wage Range: $120–$145 CAD per hour
Job Description & How to Apply Below

General Information

Job Title: Senior Data Solution Architect

Location: Toronto (Remote/Hybrid)

Job Type: Full‑Time or Contract for 12+ months

Reporting Line: SVP, Architecture

Salary Range: $120‑$145 CAD per hour (negotiable)

About Fulfillment IQ (FIQ): Fulfillment IQ is a supply chain engineering and transformation company that helps brands, retailers, and 3PLs design, build, and scale high‑performance logistics operations. We work at the intersection of strategy, operations, and technology, solving complex, real‑world problems across warehouse design, automation, order management, transportation, and end‑to‑end supply chain execution. Our teams combine deep domain expertise with strong technical capability, delivering outcomes through consulting, systems implementation, and proprietary platforms that accelerate time‑to‑value and reduce delivery risk.

If you enjoy working in complex environments, partnering closely with clients, and seeing your work make a tangible impact on how global commerce moves, this is the place where your skills and judgment truly come to life.

Role Overview

We are seeking an experienced Senior Data Solution Architect to design and implement a data architecture for a large‑scale warehouse intelligence platform on Google Cloud Platform (GCP). The ideal candidate will have a strong background in data architecture, data engineering, and a deep understanding of the supply chain and logistics domains. The role requires a hands‑on approach, with a focus on designing data pipelines, integrating multiple warehouse management systems, and implementing a real‑time streaming layer.
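To give a flavor of what the real‑time streaming layer involves, the core computation a Flink or Dataflow job performs, grouping events into fixed time windows and aggregating per key, can be sketched in plain Python. The event shape and site identifiers below are hypothetical, chosen only for illustration:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, site_id) events into fixed tumbling windows
    and count events per site per window. A pure-Python illustration
    of the aggregation a streaming engine would run continuously."""
    counts = defaultdict(int)
    for ts, site in events:
        # Align each event to the start of its window.
        window_start = ts - (ts % window_seconds)
        counts[(window_start, site)] += 1
    return dict(counts)

if __name__ == "__main__":
    events = [(0, "A"), (30, "A"), (61, "A"), (65, "B")]
    print(tumbling_window_counts(events))
```

In a real Flink or Dataflow pipeline the windowing, watermarking, and state management are handled by the engine; this sketch only shows the shape of the per-window aggregation.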

Must Have
  • 8+ years of experience in data architecture or data engineering, with at least 3 years in a solution architect capacity
  • 3+ years of experience with Snowflake, including data engineering, data modeling, and data warehousing
  • Deep GCP experience, including BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud SQL, and Cloud Spanner
  • Hands‑on experience with Apache Iceberg and expertise in change data capture (CDC)
  • Experience with streaming architecture, including Apache Flink, GCP Dataflow, or Apache Kafka Streams
  • SQL mastery and experience with Oracle databases
  • Strong understanding of the supply chain and logistics domains
  • Strong communication and collaboration skills
Preferred Qualifications
  • Experience with Apache Kafka, Blue Yonder, MuleSoft integration patterns, and multi‑tenant/multi‑site data architectures
  • Familiarity with GenAI/LLM architectures and their data requirements
  • Experience with MDM tools or patterns
  • GCP Professional Data Engineer or equivalent certification
Nice‑to‑Have Qualifications
  • Familiarity with Google Cloud Spanner (DaaS)
  • Experience with Polaris catalog for Iceberg table management
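As a rough illustration of the Iceberg partitioning concern mentioned above, a (site, day) partition key can be derived from a site id and an event timestamp so each warehouse site's data lands in its own daily partition. The key layout and names here are illustrative, not an Iceberg API:

```python
from datetime import datetime, timezone

def partition_key(site_id: str, epoch_seconds: int) -> str:
    """Derive an Iceberg-style (site, day) partition path from an
    event timestamp. In real Iceberg tables this is expressed as a
    partition spec (e.g. identity(site_id), day(event_ts)) and the
    engine derives it; this function only illustrates the idea."""
    day = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc).strftime("%Y-%m-%d")
    return f"site_id={site_id}/event_date={day}"

if __name__ == "__main__":
    print(partition_key("YYZ1", 1_700_000_000))
```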
Key Responsibilities
  • Design and implement the end‑to‑end data architecture for a multi‑site warehouse intelligence platform on GCP
  • Develop a dual‑layer data strategy, including analytics and real‑time operational data layers
  • Design and implement CDC pipelines using Fivetran, Debezium, or Oracle GoldenGate
  • Develop the real‑time operational data layer using Apache Flink or GCP Dataflow
  • Design integration patterns between the platform, Blue Yonder WMS, MuleSoft middleware, and downstream analytics consumers
  • Develop data pipelines built to scale to production volumes across 50+ sites
  • Collaborate with the BI team to configure Polaris catalog and Iceberg table partitioning strategy
  • Establish data quality, lineage, and observability standards across all pipelines
  • Participate in architecture reviews and provide technical leadership on data‑related decisions
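The CDC responsibility above can be sketched in plain Python: a Fivetran/Debezium-style pipeline ultimately folds a stream of insert/update/delete events into a current-state table. The event shape (op, key, ts, payload) is hypothetical, chosen only to show the ordering logic:

```python
def apply_cdc_events(state, events):
    """Fold CDC (change data capture) events into a dict keyed by
    record id. Events are applied in timestamp order so late-arriving
    batches still converge to the same final state."""
    for event in sorted(events, key=lambda e: e["ts"]):
        if event["op"] in ("insert", "update"):
            state[event["key"]] = event["payload"]
        elif event["op"] == "delete":
            state.pop(event["key"], None)
    return state

if __name__ == "__main__":
    events = [
        {"op": "insert", "key": "order-1", "ts": 1, "payload": {"status": "picked"}},
        {"op": "update", "key": "order-1", "ts": 3, "payload": {"status": "shipped"}},
        {"op": "delete", "key": "order-2", "ts": 2},
    ]
    print(apply_cdc_events({"order-2": {"status": "open"}}, events))
```

Production CDC tools add schema evolution, exactly-once delivery, and snapshot/backfill handling on top of this core merge semantics.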
What Success Looks Like in the First 90 Days

By Day 30

  • Gain a deep understanding of the warehouse intelligence platform vision, existing client environments, and data ecosystem across GCP, Snowflake, Iceberg, and streaming components
  • Establish strong working relationships with Architecture, BI, Platform Engineering, and client stakeholders
  • Review and validate current data flows, WMS integrations, CDC requirements, and multi‑site data ingestion patterns

By Day 60

  • Deliver a high‑level data architecture blueprint covering analytics, operational real‑time layers, and integration patterns
  • Prototype key…
Position Requirements
10+ years of work experience