
Data Scientist

Job in Toronto, Ontario, C6A, Canada
Listing for: CGI
Full Time position
Listed on 2026-02-15
Job specializations:
  • IT/Tech
    Data Engineer, AI Engineer
Salary/Wage Range or Industry Benchmark: 80,000 – 130,000 CAD yearly
Job Description
Position Description
This role is hybrid and requires you to be at our client's office a minimum of 4 days per week, subject to change at any time.

We're looking for an AI/ML Engineer with hands‑on experience building and deploying production‑grade models, including Generative AI solutions. You will design, train, evaluate, and operationalize models on modern cloud data platforms, implement robust MLOps/LLMOps practices, and collaborate with data, platform, and product teams to drive end‑to‑end delivery.

Your future duties and responsibilities

Model Development:
Design, train, fine‑tune, and evaluate ML and GenAI models (supervised/unsupervised, NLP, CV, and LLM‑based use cases).

Model Deployment:
Package and deploy models to production using containers and CI/CD; implement scalable serving with REST/gRPC, batch, and streaming pipelines.

MLOps/LLMOps:
Establish automated training, evaluation, model registry, feature store integration, monitoring (data drift, model drift, latency, cost), and safe rollback.

GenAI Engineering:
Build prompts, retrieval pipelines (RAG), and model adapters/LoRA; evaluate with quantitative metrics and human‑in‑the‑loop reviews.

Data Engineering & Collaboration:
Ingest, transform, and validate datasets; partner with data engineering on schema design, data contracts, and lineage.

Cloud & Platforms:
Operate on AWS and/or Snowflake for storage, compute, orchestration, and governance; optimize cost/performance.

Observability & Reliability:
Instrument models and data pipelines with logging, tracing, metrics, and alerting; ensure SLAs/SLOs for availability and latency.

Documentation & Compliance:
Produce clear design docs, model cards, and runbooks; adhere to security, privacy, and responsible AI guidelines.

Required Qualifications To Be Successful In This Role

AI/ML:
Proficiency in Python and common ML stacks; strong with TensorFlow and/or PyTorch for training, fine‑tuning, and inference.

Generative AI:
Experience with LLMs or diffusion models; prompt engineering, RAG, evaluation frameworks, and safety/guardrail techniques.

MLOps/LLMOps:
Hands‑on with CI/CD for ML (e.g., GitHub Actions/GitLab CI), model packaging (Docker), model registries, feature stores, and monitoring.

Cloud Data Platforms:
Practical experience with AWS (e.g., S3, ECR, ECS/EKS, SageMaker, Lambda, Step Functions) and Snowflake (Snowpark, warehouses, governance).

Data Pipelines:
Building and operating ETL/ELT; familiarity with orchestration (Airflow/Prefect); schema management and data quality checks.

Collaboration:
Ability to work with cross‑functional teams (Data Eng, Platform, Product, Security) and communicate trade‑offs clearly.

Nice To Have (Preferred)

Streaming:
Kafka for real‑time features, streaming inference, and event‑driven retraining.

Vector/RAG:
Experience with vector databases (e.g., FAISS, Milvus, pgvector) and chunking/indexing strategies.

Infrastructure as Code:
Terraform/CloudFormation; Kubernetes (EKS) for model serving and autoscaling.

Experiment Tracking:
MLflow/Weights & Biases/Comet for experiment lineage and governance.

Testing:
Unit/integration tests for data/model pipelines; regression/benchmark suites for models.

Security & Compliance:
Secrets management, IAM, PII handling, and Responsible AI practices.

Snowflake Advanced:
Snowpark ML, external functions, UDFs for in‑database ML.

Tools & Technologies (Representative)

Languages/Frameworks:
Python, TensorFlow, PyTorch, scikit‑learn, Transformers

MLOps/LLMOps: MLflow, Weights & Biases, Docker, K8s, model registries, feature stores

Cloud/Data: AWS (S3, ECR/ECS/EKS, SageMaker), Snowflake (Snowpark), SQL

Pipelines & Orchestration:
Airflow/Prefect, dbt (optional), REST/gRPC endpoints

Streaming (Optional):
Kafka/Kafka Connect/KSQL

Observability:
Prometheus/Grafana, OpenTelemetry, CloudWatch

What You'll Deliver (Outcomes)

Productionized ML/GenAI services with defined SLAs/SLOs

Automated training and deployment pipelines with traceable experiment lineage

Reliable data and model monitoring (quality, drift, performance, cost)

Clear documentation (architecture, model cards, runbooks) and knowledge transfer

CGI is providing a reasonable estimate of the pay range for this role. The determination of this range includes factors such as skill set level, geographic market, experience and training, and licenses and certifications. Compensation decisions depend on the facts and circumstances of each case. A reasonable estimate of the current range is $80,000–$130,000. This role is an existing vacancy.
