
Sr. Data Engineer

Job in Toronto, Ontario, C6A, Canada
Listing for: FutureFit AI
Full Time position
Listed on 2026-02-13
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Science Manager, Data Warehousing
Salary/Wage Range: CAD 60,000 – 80,000 per year
Job Description & How to Apply Below

Come join our Data team! High velocity, high intensity, high trust, high bar, high impact, and a will to win. If those words resonate deeply with you, this could be your next career move. We’re seeking someone who leads with humility, pursues audacious goals, and is motivated by meaningful impact on people and the world.

At FutureFit AI, our core mission is to help more people get to better jobs, faster and at lower cost, with a specific focus on those facing barriers to opportunity. Our work helps address growing economic inequality, ensuring that no one is left behind in the future of work. Our AI‑powered platform brings efficiency and insight to workforce development, replacing outdated systems and unlocking human potential at scale.

Your Role

We’re seeking a Sr. Data Engineer to join our team. You will play a key role in evolving our core data platform, which includes data pipelines, machine learning models, and various databases. You’ll combine technical data expertise with strong business intuition to build a foundation of reliable data and strong, performant pipelines.

What You’ll Own
  • Build and maintain the infrastructure that powers our data platform including pipeline orchestration, data warehousing, and machine learning.
  • Develop and maintain data solutions alongside a team of data scientists that enable our product to function at scale and with quality.
  • Uphold best practices in data governance, ensuring accuracy, accessibility, and compliance across data systems.
  • Surface and integrate data from across the platform to address real business needs in product, engineering, and GTM.
  • Continuously evolve our foundational models by identifying and incorporating new, high‑value data sources.
30/60/90 Day Plan
30 days
  • Onboard: learn our stack and product.
  • Learn who our customers are, what their problems are, and how we can leverage data to support them.
  • Gain an understanding of our core data entities and how they drive the product.
  • Contribute to our core data pipelines by adding data quality and data enrichment layers.
60 days
  • Work with data scientists to develop datasets and processes that streamline complex workflows.
  • Contribute to and own aspects of our data catalog by defining and maintaining metrics, dimensions, and lineage.
  • Support surrounding teams in getting value out of our platform’s data through regular reporting and analysis.
90 days
  • Own and automate reporting workflows from data ingestion all the way to building out dashboards and tools.
  • Independently gather reporting and insights requirements from stakeholders.
  • Present findings to stakeholders and provide recommendations that drive data‑driven decisions.
Required Experience
  • Proven ability to translate ambiguous business problems into clear, actionable insights.
  • Hands‑on experience using SQL and Python for analysis in a professional setting.
  • Experience building and maintaining data pipelines, warehouses, and infrastructure.
  • Strong communication skills to convey technical insights to both technical and non‑technical stakeholders.
  • Demonstrated ownership of analytics solutions, ensuring accuracy, reliability, and business alignment.
  • Familiarity with data visualization tools such as Looker, Power BI, or Tableau.
  • Familiarity with modeling structured and unstructured data, including NoSQL databases like MongoDB.
  • A sharp, kind, and open‑minded approach, driven by both excellence and impact.
Preferred Experience
  • Hands‑on experience with modern data tools like dbt and Airflow.
  • Experience with SageMaker or an equivalent machine learning/data science platform.
  • Experience in the workforce development industry.
Our Tech Stack
  • Languages: SQL, Python.
  • Data orchestration and transformation: Airflow, dbt.
  • Data storage and warehousing: PostgreSQL, Redshift, MongoDB (for unstructured data).
  • Machine learning and experimentation: AWS SageMaker.
  • Visualization and reporting: Looker.
  • Infrastructure: AWS ecosystem (S3, Lambda, Glue, Redshift).
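For a flavour of the data‑quality layers mentioned in the 30‑day plan, here is a minimal, illustrative Python sketch. It is not FutureFit AI's actual code; the field names and rules are hypothetical examples of the kind of validation gate a pipeline might apply before loading records downstream.

```python
# Illustrative data-quality gate: split incoming records into valid and
# rejected sets based on required, non-empty fields. Field names are
# hypothetical examples, not FutureFit AI's real schema.

def validate_rows(rows, required_fields):
    """Return (valid, rejected); rejected entries record which fields failed."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            rejected.append({"row": row, "missing": missing})
        else:
            valid.append(row)
    return valid, rejected


# Example: one clean record and one with an empty required field.
jobs = [
    {"title": "Data Engineer", "location": "Toronto"},
    {"title": "", "location": "Remote"},
]
valid, rejected = validate_rows(jobs, ["title", "location"])
```

In a real pipeline a step like this would typically run as an Airflow task or a dbt test, with rejected rows routed to a quarantine table for review rather than silently dropped.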
Your Education

Your alma mater isn’t our focus. Your grit, hunger, and drive are. If you learn continuously, tackle challenges head‑on, and know your strengths and gaps intimately—you’re…
