
Intermediate Data Engineer | GCP | Hybrid

Job in Toronto, Ontario, M5A, Canada
Listing for: Randstad Canada
Full Time position
Listed on 2025-12-27
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Big Data, Data Science Manager
Job Description & How to Apply Below
Position: Intermediate Data Engineer | GCP | Hybrid | 2-5 yrs
The Intermediate Data Engineer in the Wealth Data Engineering team, part of a global engineering organization, will be a key player in designing and implementing critical data solutions to meet the operational data needs of a large Wealth Management business. This role requires hands-on expertise in Big Data technologies, specifically Hadoop (Cloudera) and modern cloud data services like Google Cloud Platform (GCP), working collaboratively with enterprise data teams and solution architects to deliver end-to-end data solutions.

If you're interested, please apply to this job posting with your updated resume, or send your updated resume to . Thank you.

Advantages
Compensation & Benefits: Competitive rewards program, including a bonus, flexible vacation, personal days, sick days, and comprehensive benefits that start on day one.
Culture: A commitment to Diversity, Equity, Inclusion & Allyship, with a focus on creating an inclusive and accessible environment for all employees.
Professional Development: Opportunities for upskilling through online courses, cross-functional development, and tuition assistance.
Work Environment: A dynamic ecosystem with opportunities for team collaboration.

Responsibilities
Data Pipeline Development:
Lead the development efforts for ingesting and transforming data from diverse sources. This includes hands-on coding, scripting, and specification writing to ensure end-to-end delivery of data into the Enterprise Data Lake environment.
Scalable Architecture:
Design, build, and operationalize distributed, reliable, and scalable data pipelines to ingest and process data from multiple sources.
Cloud Platform Expertise (GCP):
Utilize Google Cloud Platform (GCP) data services such as Dataproc, Dataflow, Cloud SQL, BigQuery, and Cloud Spanner, combined with technologies like Spark, Apache Beam/Composer, dbt, Confluent Kafka, and Cloud Functions.
Ingestion Patterns:
Design and implement versatile data ingestion patterns that support batch, streaming, and API interfaces for both data ingress and egress.
Technical Leadership:
Guide a team of data engineers, developing custom code and frameworks using best practices (Java, Python, Scala, BigQuery, dbt, SQL) to meet demanding performance requirements.
Workflow Management:
Build and manage data pipelines with a deep understanding of workflow orchestration, task scheduling, and dependency management.
Technical Guidance:
Provide end-to-end technical expertise on effectively using cloud infrastructure to build solutions, creatively applying platform services to solve business problems, and communicating these approaches to various stakeholders.
Operational Excellence:
Provide guidance on implementing application logging, notification, job monitoring, and performance monitoring.

Qualifications

Experience:

2+ years of experience in data engineering, including performance optimization for large OLTP applications.
Big Data:
Strong knowledge of Hadoop concepts, including HDFS, Hive, Pig, Flume, and Sqoop, with working experience in HQL (Hive Query Language).
Cloud Data Services (GCP):
Knowledge of primary managed data services within GCP, including Dataproc, Dataflow (Java/Python for streaming/batch jobs), BigQuery/dbt, Cloud Spanner, and Cloud Pub/Sub.
Databases & Streaming:
Knowledge of Google Cloud Platform databases (Cloud SQL, Spanner, PostgreSQL), relational/NoSQL databases, and data streaming technologies such as Kafka and Spark Streaming.
Software Development:
Knowledge of Java microservices and Spring Boot, with working knowledge of developing and scaling Java REST services using frameworks like Spring.
Architecture & Operations:
Strong architecture knowledge with experience in providing technical solutions for cloud infrastructure. Knowledge of Infrastructure as Code (IaC) practices and frameworks like Terraform.

Soft Skills:

Good communication and problem-solving skills with the ability to effectively convey ideas to business and technical teams.
Nice-To-Have:
Understanding of the Wealth business line and the data domains required for building end-to-end solutions.


Randstad Canada is committed to fostering a workforce reflective of all peoples of Canada. As a result, we are committed to developing and implementing strategies to increase the equity, diversity and inclusion within the workplace by examining our internal policies, practices, and systems throughout…