
Data Engineer to build and maintain data pipelines on the Databricks Lakehouse Platform for the financial services industry

Job in Toronto, Ontario, M5A, Canada
Listing for: S.i. Systems
Full Time position
Listed on 2026-02-16
Job specializations:
  • Software Development
    Data Engineer, Data Science Manager
Job Description
Position: Data Engineer to build and maintain data pipelines on the Databricks Lakehouse Platform for the financial services industry

We are looking for 2 Senior Data Engineers and 1 Data Engineer to build and maintain data pipelines on the Databricks Lakehouse Platform for the financial services industry

Full Time Permanent Role

On site 3 times a week (near King Station)

Salary Range: $,-$,/year

Must Haves

Skills:

  • Bachelor’s degree in Computer Science, Engineering, or a related field
  • 5-7 years of experience in data engineering
  • 3+ years of hands-on experience with Databricks platform
  • Strong programming skills in Python
  • Experience with Spark and distributed computing
  • Working knowledge of AWS services (S3, Glue, Lambda)
  • Experience with Delta Lake and Lakehouse architecture
  • Familiarity with data modelling and SQL
  • Understanding of ETL/ELT principles and patterns (see the batch ETL sketch after these lists)
  • Experience with version control systems (Git)
  • Experience with CI/CD for data pipelines
  • Familiarity with Agile development methodologies
  • Experience with real-time data processing is a plus
Nice to Haves:

  • AI/ML
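
For candidates unfamiliar with the stack, here is a minimal, illustrative sketch of the kind of batch ETL/ELT work the Must Haves describe: reading raw files from S3 and writing a Delta table on Databricks. The S3 paths, column names, and the target schema/table name are hypothetical placeholders, not details from this posting.

```python
# Illustrative sketch only: paths, columns, and table names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

# On Databricks, `spark` is pre-created in notebooks; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Read raw JSON files landed in an S3 raw zone
raw = spark.read.json("s3://example-raw-zone/transactions/")

# Light cleansing/conformance step typical of an ELT silver layer
cleaned = (
    raw.dropDuplicates(["transaction_id"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("ingested_at", F.current_timestamp())
)

# Persist as a Delta table so downstream readers get ACID guarantees and time travel
(cleaned.write
        .format("delta")
        .mode("overwrite")
        .saveAsTable("finance.silver_transactions"))
```

In practice a pipeline like this would typically be split across bronze/silver/gold layers and scheduled as a Databricks job rather than run ad hoc.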
Job Duties in Brief:

  • Participate in and contribute to all team activities such as Sprint Planning, Sprint Execution, and Daily Scrum
  • Develop and maintain data pipelines using Databricks Lakehouse and Delta Lake
  • Implement ETL/ELT workflows using Spark (Python) in the Databricks environment
  • Work with AWS services (S3, Glue) for data lake storage and catalog management
  • Create and optimize Spark jobs for efficient data processing and cost management
  • Build and maintain data quality checks and monitoring systems
  • Configure and manage Databricks notebooks and jobs
  • Implement proper security and access controls using Unity Catalog
  • Participate in code reviews and documentation efforts
  • Stay current with Databricks features and data engineering best practices
  • Support real-time data processing using Structured Streaming when required (see the streaming sketch below)
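
As referenced in the last duty above, the following is a minimal sketch of what incremental, real-time ingestion with Structured Streaming can look like on Databricks, assuming Auto Loader (cloudFiles) as the file source. The source path, checkpoint locations, and target table name are hypothetical placeholders.

```python
# Illustrative sketch only: source, checkpoint, and table names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incrementally pick up new files from S3 with Databricks Auto Loader (cloudFiles)
events = (
    spark.readStream
         .format("cloudFiles")
         .option("cloudFiles.format", "json")
         .option("cloudFiles.schemaLocation", "s3://example-checkpoints/events_schema/")
         .load("s3://example-raw-zone/events/")
)

# Append continuously into a Delta table; the checkpoint makes the stream restartable
(events.writeStream
       .format("delta")
       .option("checkpointLocation", "s3://example-checkpoints/events/")
       .trigger(availableNow=True)  # drain available files, then stop (batch-style run)
       .toTable("finance.bronze_events"))
```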