General Information:
Job Title: Data Modeler
Location: Toronto (Remote/Hybrid)
Job Type: Contract for 12+ months
Reporting Line: SVP, Architecture
Salary Range: $95–$115 CAD per hour (negotiable)
About Fulfillment IQ (FIQ):
Fulfillment IQ is a supply chain engineering and transformation company that helps brands, retailers, and 3PLs design, build, and scale high-performance logistics operations.
We work at the intersection of strategy, operations, and technology, solving complex, real-world problems across warehouse design, automation, order management, transportation, and end-to-end supply chain execution.
Our teams combine deep domain expertise with strong technical capability, delivering outcomes through consulting, systems implementation, and proprietary platforms that accelerate time-to-value and reduce delivery risk.
If you enjoy working in complex environments, partnering closely with clients, and seeing your work make a tangible impact on how global commerce moves, this is the place where your skills and judgment truly come to life.
Role Overview:
We are seeking an experienced Data Modeler to design and implement the data models powering a multi-site warehouse intelligence platform on Google Cloud Platform (GCP). The ideal candidate will have a strong background in data modeling, dimensional modeling, and a deep understanding of the supply chain and logistics domains. The role requires a hands-on approach, with a focus on designing data models that bridge multiple warehouse management systems, data lakehouse architectures, and real-time operational data stores.
Must Have:
- 6+ years of experience in data modeling roles, including logical, physical, dimensional, and domain modeling
- 3+ years of experience with Snowflake, including data engineering, data modeling, and data warehousing
- Supply chain/logistics/warehousing domain knowledge, including warehouse data: inventory lifecycle, order management, WMS transactions, shipping/receiving, labor tracking
- SQL expertise, including advanced query design, performance tuning, and complex joins across large datasets
- Dimensional modeling (Kimball methodology) for analytics/reporting warehouses
- Experience with modern data lakehouse architecture
- Cloud data platforms: GCP (BigQuery, Cloud Storage, Cloud Spanner) or equivalent AWS/Azure experience with willingness to work in GCP
- CDC/event-driven data modeling expertise, including designing schemas for change data capture pipelines and streaming data
- Strong understanding of data governance, data quality, and data lineage
Preferred Qualifications:
- Experience with Blue Yonder (JDA/RedPrairie) WMS data structures and Oracle transactional schemas
- Hands-on experience with Apache Iceberg table design (partitioning, sort orders, schema evolution, Polaris catalog)
- MDM (Master Data Management) modeling experience, including hierarchical entity governance (org/site/system/region)
- Experience designing JSON/semi-structured data models and configurable transformation schemas
- Prior work in multi-tenant or multi-site data architectures (normalizing data across large-scale operational deployments with different configurations)
- Familiarity with ML feature engineering and feature store design patterns
- Knowledge of data catalog and metadata management tools
Nice-to-Have Qualifications:
- Experience with Google Cloud Spanner data modeling (wide-column/relational hybrid)
- Understanding of GenAI/RAG data requirements (how LLMs consume structured operational data)
- Exposure to Oracle GoldenGate or Fivetran CDC source schemas
- Experience with Erwin, dbt, or similar data modeling/transformation tools
Key Responsibilities:
- Design the canonical data model that normalizes data from multiple WMS systems (e.g., Blue Yonder, Manhattan) into a unified domain layer
- Model core warehouse entities, including inventory transactions, orders, shipments, ASNs, locations, SKUs, wave management, dock operations, labor events, and returns/RMA
- Define Master Data Management (MDM) data models, including client, site, system, division, region hierarchies with governance rules
- Design table schemas, including partitioning strategies, sort orders, schema evolution plans, and compaction policies
- Create the bronze/silver/gold data layer architecture, including raw CDC events from WMS, cleansed and deduplicated domain tables, and aggregated business-ready datasets for reporting and ML models
- Optimize table designs for analytics queries (BigQuery) and streaming updates
- Build dimensional models (star/snowflake schemas) for embedded analytics and reporting, including warehouse operations, client-facing dashboards, and financial reporting
- Design reporting models to replace a legacy Oracle-based analytics platform across a large multi-site deployment
- Work with the BI team on Polaris catalog metadata standards and data dictionary
- Design data schemas for the near real-time operational data layer, including inventory position, scan and validate events, exception/error state tracking, and wave progress and dock utilization
- Model the data structures that feed GenAI operational assistants
- Design event schemas for Kafka topics (CDC events, operational events, alert events)
- Design the data models that feed Warehouse AI modules, including forecasting models, slotting/replenishment models, and pick optimization models
- Define the feature store schema for ML model training and inference pipelines
- Ensure ML data models align with the 5-minute Iceberg refresh cadence and real-time streaming layer
What Success Looks Like in the First 90 Days:
30 Days
· Understand existing WMS data sources, GCP/Snowflake architecture, and current data flows.
· Align with architecture and engineering teams; document core warehouse domains.
· Identify data quality gaps and normalization challenges.
60 Days
· Deliver first draft of the canonical data model and bronze/silver schemas.
· Begin implementing models in Snowflake/BigQuery/Iceberg.
· Define initial MDM structures and dimensional models for key analytics use cases.
· Draft Kafka event schemas and establish metadata documentation.
90 Days
· Deliver production-ready canonical model for at least one WMS and site.
· Finalize end-to-end bronze → silver → gold architecture with optimized performance.
· Publish full data dictionary and lineage.
· Deploy dimensional models for reporting and ML/AI data structures.
· Improve data quality and consistency across multi-site ingestion.
Key Performance Indicators (KPIs):
· Canonical Model Completion & Adoption across sites/WMS sources.
· Data Quality Improvements (dedupe, consistency, completeness).
· Dimensional Model Delivery for analytics and reporting.
· Pipeline Readiness across bronze/silver/gold layers.
· MDM Model Implementation across clients/sites.
· Documentation & Metadata Accuracy (data dictionary, lineage).
· Query Performance Optimization in Snowflake/BigQuery.
· Enablement of ML/AI & Feature Store Pipelines.
Why You’ll Love Working Here:
At Fulfillment IQ, we don’t just build supply chain solutions; we build long-term careers.
We believe in giving our people real ownership, real influence, and meaningful responsibility. Whether you’re leading a client relationship, shaping a solution, or building internal capability, your decisions have visible impact.
We operate with a high-trust, high-standards culture that values clear thinking, collaboration, and accountability without unnecessary hierarchy or bureaucracy.
Here’s what makes Fulfillment IQ a rewarding place to work:
Work That Matters
You’ll help solve real operational challenges that directly impact global commerce, customer experience, and supply chain performance.
Career Growth That Matters
We invest in mentorship, leadership development, and continuous learning — supporting both vertical growth and long-term career progression.
Flexibility to Thrive
We support remote and hybrid work models, flexible schedules, and trust our teams to manage their time responsibly.
We Celebrate Impact
From project wins to personal milestones, we recognize contribution, celebrate progress, and value consistency over heroics.
A Collaborative Culture
You’ll work alongside experienced consultants, engineers, product leaders, and commercial professionals who value ownership, transparency, and high standards.
Perks you’ll appreciate:
Health & Wellness
· Comprehensive health and dental coverage for you and your family (region-specific plans)
· Employee wellness programs, where applicable
Time Off
· Competitive paid time off (PTO), sick leave, and public holidays
· Flexible leave policies that respect local labor standards
Retirement & Financial Security
· Retirement savings programs and employer contributions
· Region-specific plans (401(k) in the U.S., CPP and supplementary plans in Canada)
Professional Growth
· Dedicated learning and development budget
· Support for skills development, leadership growth, and career progression
Flexible Work
· Remote and hybrid work options
· Flexible working hours aligned to role and client needs
Additional Perks
· Equipment and workstation allowances
· Internet and business travel reimbursements
· Employee stock options (ESOP), where applicable
· Team events, meetups, and company offsites
Life at Fulfillment IQ:
Fulfillment IQ is a people-first company built on trust, collaboration, and ownership. We are proud to be an equal opportunity employer and are committed to building a diverse, inclusive, and high-performing workplace.
Learn More About Us:
Website: fulfillmentiq.com
LinkedIn: Fulfillment IQ
Spotify: eCom Logistics Podcast
YouTube: eCom Logistics Podcast
Fulfillment IQ Office: Toronto, Ontario, Canada