
Senior Data Engineer

Job in Toronto, Ontario, Canada
Listing for: TEEMA
Full Time position
Listed on 2026-03-04
Job specializations:
  • IT/Tech
    Data Engineer, Big Data, Cloud Computing, Data Analyst
Salary/Wage Range: CAD 60,000 – 80,000 per year
Job Description & How to Apply Below

Senior Data Engineer

Salary: $ – $

Division & Section: Technology Services, Office of the Chief Technology Officer

Job Type & Duration: Full-time, 1 Permanent Vacancy

Shift Information: Monday to Friday, 35 hours per week

Affiliation: Non-Union

Why Join the City of Toronto

As a Senior Data Engineer at the City of Toronto, you will have the opportunity to work on cutting‑edge data solutions that directly impact the lives of Toronto's residents. You'll be part of a team driving the city's digital transformation, working on projects that enhance city services and operations through innovative data utilization. You'll work in a collaborative environment that values your expertise and provides opportunities for professional growth.

If you're passionate about leveraging data and AWS technologies to create meaningful change, we encourage you to apply and be part of our mission to build a smarter, more connected Toronto.

Major Responsibilities

Reporting to the Manager, Data Integration & Access, the Senior Data Engineer will join the Enterprise Data Platform team as a vital partner in the design, development, and implementation of our Enterprise Data Platform.

  • AWS Expertise: Utilize a wide range of AWS services to build and maintain scalable, secure, and efficient data infrastructure. Key services include S3, Redshift, Kinesis, EMR, Glue, DataZone, Lake Formation, and CloudFormation.
  • Data Pipeline Development: Design, implement, and maintain robust ETL/ELT processes using tools such as AWS Glue, DBT (Data Build Tool), and Apache Spark.
  • Data Mesh Implementation: Contribute to the implementation of a data mesh architecture, enabling decentralized, domain‑oriented data ownership and management.
  • Infrastructure as Code: Develop and maintain infrastructure as code using Terraform or AWS CloudFormation to automate and streamline the deployment of cloud resources.
  • Data Processing: Utilize Python and Apache Spark for large‑scale data processing, transformation, and analysis.
  • Data Modeling: Design and implement efficient data models to support analytics, machine learning, and reporting needs.
  • Streaming Solutions: Develop and maintain both batch and real‑time data streaming solutions using technologies such as AWS Kinesis.
  • Data Governance: Implement and adhere to data governance policies to ensure data quality, privacy, and compliance with regulations.
  • Platform Enhancement: Work with technologies such as Databricks and Snowflake to enhance the capabilities of the data platform.
  • Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and provide tailored solutions.
  • Documentation and Knowledge Sharing: Create and maintain comprehensive documentation for data processes, pipelines, and models. Share knowledge with team members and contribute to the team's overall growth.
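To give a flavour of the pipeline work described above, here is a minimal batch ETL sketch in plain Python. In practice this logic would run inside an AWS Glue job or a Spark transformation; the dataset, field names, and aggregation here are purely illustrative, not part of the role's actual systems.

```python
import csv
import io

# Hypothetical raw input: service-request counts per ward, as CSV text.
RAW = """ward,requests
 Etobicoke ,12
Scarborough,30
Etobicoke,8
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse CSV text into row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: trim whitespace and cast counts to integers."""
    return [{"ward": r["ward"].strip(), "requests": int(r["requests"])}
            for r in rows]

def load(rows: list[dict]) -> dict[str, int]:
    """Load: aggregate request counts per ward into an in-memory summary."""
    summary: dict[str, int] = {}
    for r in rows:
        summary[r["ward"]] = summary.get(r["ward"], 0) + r["requests"]
    return summary

summary = load(transform(extract(RAW)))
print(summary)  # {'Etobicoke': 20, 'Scarborough': 30}
```

The same extract/transform/load shape scales up directly: swap the in-memory steps for S3 reads, Spark DataFrame transformations, and a Redshift or Lake Formation target.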
What do you bring to the role?
  • Post‑secondary education in Computer Science, Data Science, Information Technology or a related discipline (or an equivalent combination of education and experience).
  • Extensive experience in data engineering, with expertise in AWS technologies, particularly in data‑related services (e.g. S3, Redshift, Kinesis, EMR, Glue, etc.).
  • Experience in Python programming with big data processing frameworks such as Apache Spark.
  • Experience with Infrastructure as Code/IaC (e.g. Terraform or AWS CloudFormation), and ETL/ELT processes and tools (e.g. AWS Glue and DBT).
  • Knowledge of data modeling concepts and techniques.
  • Knowledge of other cloud platforms (e.g. Azure, GCP, etc.) for multi‑cloud strategies.
  • Strong understanding of data governance principles and privacy regulations (e.g., GDPR, CCPA).
  • Experience with data mesh architecture concepts and implementation, and with CI/CD practices and tools will be considered an asset.
  • AWS certifications (e.g. AWS Certified Data Analytics – Specialty, AWS Certified Big Data Specialty) will be considered an asset.
  • Understanding of machine learning workflows and MLOps practices will be considered an asset.
  • Exceptional problem‑solving, communication, and analytical skills, with the ability to explain complex technical concepts to non‑technical…
  • Position Requirements: 10+ years of work experience