
Data Engineer Senior Consultant

Job in Montreal, Montréal, Province de Québec, Canada
Listing for: NTT
Full Time position
Listed on 2025-12-31
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing

JOB DESCRIPTION

Req: 345525

We are currently seeking a Data Engineer Senior Consultant to join our team in Montreal, Quebec (CA-QC), Canada (CA).

Job Title:

Senior Cloud Data Warehouse Engineer

Location:

Montreal (day 1 onboarding onsite; in-office presence required 3x/week)


Team Overview

The Controls Engineering, Measurement and Analytics (CEMA) department is responsible for Cyber Risk and Control assessment, management, monitoring, and reporting capabilities across Technology, resulting in risk reduction and better oversight of the firm's technology risk landscape. Our work is always client-focused; our engineers are problem-solvers and innovators. We seek exceptional technologists to help deliver solutions on our user-facing applications, data stores, and reporting and metric platforms while being cloud-centric, leveraging multi-tier architectures, and aligning with our DevOps and Agile strategies.

We are in the process of modernizing our technology stack across multiple platforms with the goal of building scalable, front-to-back assessment, measurement and monitoring systems using the latest cloud, web, and data technologies. We are looking for someone with a systematic problem-solving approach, coupled with a sense of ownership and drive. The successful candidate will be able to influence and collaborate globally.

They should be a strong team player, have an entrepreneurial approach, push innovative ideas while appropriately weighing risk, and adapt to a fast-paced, changing environment.

Role Summary

As a Senior Cloud Data Warehouse Engineer, you will be a member of the C3 Data Warehouse team, with a focus on building our next-generation data platform, which sources and stores data from technology systems across the firm in a centralized platform that powers reporting and analytics solutions for the Technology Risk functions within Morgan Stanley. In this role, you will primarily contribute to the development of our cloud data warehouse using Snowflake and Python-based tooling.

You will also be responsible for the design and development of our data warehouse, utilizing Snowflake capabilities such as data sharing, Time Travel, Snowpark, workload optimization across analytic and AI use cases, and the ingestion and storage of structured and unstructured data. In addition, you will work on integrating our Snowflake data warehouse with existing internal platforms for data quality, data cataloging, data discovery, incident logging, and metric generation.

You will be working closely with data warehousing leads, data analysts, ETL developers, infrastructure engineers, and data analytics teams to facilitate the implementation of this data platform and data pipeline framework.

KEY RESPONSIBILITIES:

• To design, develop, and manage our Snowflake data warehouse.

• To contribute towards the establishment of best practices for the optimal and efficient usage of Snowflake with tooling such as Airflow, dbt, and Spark.

• To assist with the testing and deployment of our data pipeline framework utilizing standard testing frameworks and CI/CD tooling.

• To monitor the performance of queries and data loads and perform tuning as necessary.

• To provide assistance and guidance during the QA & UAT phases to quickly confirm the validity of potential issues and to determine the root cause and best resolution of verified issues.
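As a flavor of the testing responsibility above, a pipeline step might run basic validation on a batch before staging it to the warehouse. This is only an illustrative sketch; the function and column names are hypothetical, not part of the actual C3 framework.

```python
import pandas as pd


def validate_load(df: pd.DataFrame, required_cols: list[str]) -> pd.DataFrame:
    """Basic pre-load checks a warehouse pipeline step might run."""
    # Fail fast if the batch is missing expected columns.
    missing = [c for c in required_cols if c not in df.columns]
    if missing:
        raise ValueError(f"missing columns: {missing}")
    # Drop fully-empty rows rather than staging them.
    cleaned = df.dropna(how="all")
    if cleaned.empty:
        raise ValueError("no rows to load")
    return cleaned


# Example batch with made-up risk-metric columns.
records = pd.DataFrame({"risk_id": [1, 2], "score": [0.4, 0.9]})
validated = validate_load(records, ["risk_id", "score"])
```

Checks like these are the kind of thing that standard testing frameworks and CI/CD tooling would exercise before deployment.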

SKILLS / QUALIFICATIONS:

• Bachelor’s degree in Computer Science, Software Engineering, Information Technology, or related field required.

• At least 10 years of experience in data development and solutions in highly complex data environments with large data volumes.

• At least 7 years of SQL / PLSQL experience with the ability to write ad-hoc and complex queries to perform data analysis.

• At least 5 years of experience with developing data solutions on Snowflake.

• At least 3 years of experience building data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc.

• At least 3 years of experience developing solutions in a hybrid data environment (on-prem and cloud).

• Hands-on experience with Python is a must.

• Hands…
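The SQL and Python requirements above often come together in ad-hoc analysis: querying the warehouse with SQL and continuing in Pandas. A minimal self-contained sketch, using an in-memory SQLite database to stand in for the warehouse (the table and column names are invented for illustration):

```python
import sqlite3

import pandas as pd

# In-memory SQLite stands in for the warehouse connection.
conn = sqlite3.connect(":memory:")

# Hypothetical control-findings table.
pd.DataFrame(
    {"app": ["a", "a", "b"], "finding_count": [3, 1, 5]}
).to_sql("control_findings", conn, index=False)

# Ad-hoc aggregate query of the kind the role calls for,
# returned directly as a DataFrame for further analysis.
summary = pd.read_sql(
    "SELECT app, SUM(finding_count) AS total "
    "FROM control_findings GROUP BY app ORDER BY app",
    conn,
)
```

Against Snowflake, the same pattern would use the Snowflake connector or Snowpark instead of SQLite, but the SQL-to-DataFrame workflow is the same.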

Position Requirements
10+ Years work experience