
Senior Data Engineer

Job in Toronto, Ontario, M5A, Canada
Listing for: Rockstar
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description

Rockstar is recruiting for a fast-growing, mission-driven technology company focused on workforce development. The client is dedicated to building innovative digital solutions that empower individuals and organizations to thrive in the modern economy. Rockstar is supporting this client in their search for a talented Sr. Data Engineer to help evolve their core data platform and drive impactful business outcomes.

The Senior Data Engineer will play a key role in evolving the core data platform, which spans data pipelines, machine learning models, and several databases. The ideal candidate combines hands-on data engineering expertise with strong business intuition, building a foundation of reliable data and strong, performant pipelines.

What You’ll Own

- Data infrastructure:
Building and maintaining the infrastructure that powers the data platform including pipeline orchestration, data warehousing, and machine learning

- Data solutions that drive the product:
Developing and maintaining data solutions alongside a team of data scientists that enable the product to function at scale and with quality

- Data governance and quality:
Upholding best practices in data governance, ensuring accuracy, accessibility, and compliance across data systems

- Cross-platform data sourcing:
Surfacing and integrating data from across the platform to address real business needs in product, engineering, and GTM

- Evolving core data models:
Continuously evolving foundational models by identifying and incorporating new, high-value data sources

30/60/90 Day Plan

30 days:

- Onboarding: learning the stack and the product

- Learning who the customers are, what their problems are, and how data can be leveraged to support them

- Gaining an understanding of core data entities and how they drive the product

- Contributing to core data pipelines by adding data quality and data enrichment layers
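As an illustration of the data quality layer mentioned above, here is a minimal, hypothetical sketch in Python. The field names and validation rules are invented for the example and are not taken from the actual platform; a real implementation would likely live inside an Airflow task or dbt test.

```python
from datetime import datetime

# Hypothetical required schema for an event record.
# These field names are illustrative assumptions, not the platform's real schema.
REQUIRED_FIELDS = {"user_id", "event_type", "occurred_at"}


def validate_record(record: dict) -> list[str]:
    """Return a list of quality issues found in one raw record."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    ts = record.get("occurred_at")
    if ts is not None:
        try:
            # Reject timestamps that are not valid ISO 8601 strings.
            datetime.fromisoformat(ts)
        except (TypeError, ValueError):
            issues.append(f"bad timestamp: {ts!r}")
    return issues


def quality_gate(records: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    """Split a batch into clean rows and rejected rows with reasons."""
    clean, rejects = [], []
    for rec in records:
        issues = validate_record(rec)
        if issues:
            rejects.append((rec, issues))
        else:
            clean.append(rec)
    return clean, rejects
```

A gate like this would typically run before enrichment, so downstream models only ever see rows that passed validation, while rejects are routed to a quarantine table for review.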

60 days:

- Working with data scientists to develop datasets and processes that streamline complex workflows

- Contributing to and owning aspects of the data catalog by defining and maintaining metrics, dimensions, and lineage

- Supporting surrounding teams in getting value out of the platform’s data through regular reporting and analysis

90 days:

- Owning and automating reporting workflows from data ingestion all the way to building out dashboards and tools

- Independently gathering reporting and insights requirements from stakeholders

- Presenting findings to stakeholders and providing recommendations to drive the organization towards making data-driven decisions

Required Experience

- Proven ability to translate ambiguous business problems into clear, actionable insights

- Hands-on experience using SQL and Python for analysis in a professional setting

- Experience building and maintaining data pipelines, warehouses, and infrastructure

- Strong communication skills to convey technical insights to both technical and non-technical stakeholders

- Demonstrated ownership of analytics solutions, ensuring accuracy, reliability, and business alignment

- Familiarity with data visualization tools such as Looker, Power BI, or Tableau

- Familiarity with modeling structured and unstructured data, including NoSQL databases like MongoDB

- A sharp, kind, and open-minded approach, driven by both excellence and impact

Preferred Experience

- Hands-on experience with modern data tools like dbt and Airflow

- Experience with SageMaker or an equivalent machine learning / data science platform

- Experience in the workforce development industry

Our Tech Stack

- Languages:
SQL, Python

- Data orchestration and transformation:
Airflow, dbt

- Data storage and warehousing:
PostgreSQL, Redshift, MongoDB (for unstructured data)

- Machine learning and experimentation: AWS SageMaker

- Visualization and reporting:
Looker

- Infrastructure: AWS ecosystem (S3, Lambda, Glue, Redshift)

Position Requirements
10+ years of work experience