
Databricks Senior Engineer

Job in Toronto, Ontario, M5A, Canada
Listing for: Slalom
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Job Description

Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value.

At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.

Job Title:
Databricks Senior Engineer

So, what will I do?

  • Be a technical leader within teams, working with minimal oversight and direction to deliver innovative solutions on the Databricks Platform using core cloud data lakehouse methodologies, tools, distributed processing engines, event streaming platforms, and other modern data-related technologies.
  • Be part of the Databricks Center of Excellence.
  • Build the next generation of data platforms and work with some of the most forward-thinking organizations in data and analytics.
  • Work under the direction of a Solution Architect to help design and implement components of our clients’ data platform solution.
  • Participate in design sessions, break down complex development tasks, and complete development items on time.
  • Contribute to various COE initiatives to develop Databricks solution accelerators and bring an innovative mindset.

And, what will I bring?

  • As a Senior Engineer in the Databricks practice, you will bring a curious mindset and a passion for exploring innovative solutions to address our clients’ most pressing data challenges. You are a self-starter who excels at breaking down complex problems and eagerly shares insights with your team and the broader Builder community.
  • Key Responsibilities:
  • Develop and implement data solutions using Databricks, with hands-on experience in specific Databricks platform features such as Delta Lake, Uniform (Iceberg), Delta Live Tables (Lakeflow Declarative Pipelines), and Unity Catalog (see the illustrative pipeline sketch after this list).
  • Collaborate with cross-functional teams to design and optimize data pipelines for both batch and streaming data, ensuring data quality and efficiency.
  • Stay up to date with emerging technologies and the latest Databricks platform features, and continuously improve your skills in Databricks and other relevant tools.
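
To make the kind of declarative pipeline and data quality work described above more concrete, here is a minimal, illustrative Python sketch of a Delta Live Tables (Lakeflow Declarative Pipelines) table definition with an expectation; it only runs inside a Databricks pipeline, and the table and column names (bronze_orders, order_id, amount) are hypothetical examples rather than anything specified in this posting:

    import dlt
    from pyspark.sql.functions import col

    # Declarative table: reads a streaming bronze source, filters out null keys,
    # and drops rows that violate the "valid_amount" expectation.
    @dlt.table(name="silver_orders", comment="Cleaned orders with basic quality checks")
    @dlt.expect_or_drop("valid_amount", "amount > 0")
    def silver_orders():
        return (
            dlt.read_stream("bronze_orders")       # hypothetical upstream table
            .where(col("order_id").isNotNull())    # basic null filtering
        )
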
  • Requirements:

  • 5+ years of data engineering experience, with at least 2 years of hands-on data pipeline design and development experience with Databricks, including specific platform features like Delta Lake, Uniform (Iceberg), Delta Live Tables (Lakeflow Declarative Pipelines), and Unity Catalog.
  • Proficiency in designing and building robust, scalable, YAML-configuration-driven data pipelines with batch, micro-batch, and streaming data ingestion and processing patterns using tools like Auto Loader, demonstrating a strong understanding of modern data engineering practices (a minimal PySpark sketch of this pattern follows the requirements list).

  • Experience building complex job/workflow orchestration patterns using Databricks Jobs/Workflows or other orchestration tools like Airflow, dbt, etc.
  • Exposure to building robust data quality and pipeline audit/observability solutions with Databricks native features and/or other data quality tools/frameworks like Great Expectations, Collibra, dbt, etc.
  • Proficiency in Big Data Platforms: Apache Spark, Presto, Amazon EMR.
  • Experience with Cloud Data Warehouses: Amazon Redshift, Snowflake, Google BigQuery.
  • Strong programming skills using SQL, stored procedures, and object-oriented programming languages (like Java, Python, PySpark, etc.).
  • Familiarity with building DevOps CI/CD pipelines using Databricks Asset Bundles with automated validation and testing.
  • Exposure to Infrastructure as Code (IaC) tools like Terraform is a big plus.
  • Familiarity with NoSQL databases and container management systems.
  • Exposure to AI/ML tools (like MLflow), prompt engineering, and modern data and AI agentic workflows.
  • An ideal candidate will have Databricks Data Engineering Associate and/or Professional certification completed with multiple…
  • Position Requirements
    10+ years of work experience
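
As a rough illustration of the batch and streaming ingestion patterns referenced in the requirements above (and not part of the posting itself), the following PySpark sketch uses Auto Loader to incrementally load new JSON files into a Delta table on Databricks; all paths and the target table name are hypothetical placeholders that would normally come from YAML pipeline configuration:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Auto Loader source: incrementally discover and read new JSON files
    # from a landing path, inferring and tracking the schema.
    raw = (
        spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/mnt/schemas/orders")   # hypothetical path
        .load("/mnt/landing/orders")                                  # hypothetical path
    )

    # Micro-batch style write to a Delta table: process all files available
    # at start time, then stop (suitable for scheduled jobs).
    (
        raw.writeStream
        .format("delta")
        .option("checkpointLocation", "/mnt/checkpoints/orders")      # hypothetical path
        .trigger(availableNow=True)
        .toTable("bronze.orders")                                     # hypothetical target table
    )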