Job Overview
As a Data Architect, you will be responsible for leading the Azure architecture, design, and delivery of data models and data products that enable innovative, customer-centric digital experiences. You will work as part of a cross-discipline agile team whose members help each other solve problems across all business areas. You will be a thought leader and subject matter expert on the team's data lake, data warehousing, and data modeling activities, and use your influence to ensure that the team produces best-in-class data solutions that leverage repeatable, maintainable, and well-documented design patterns.
You will employ best practices in development, security, accessibility, and design to achieve the highest quality of service for our customers.
Resume Due Date: Monday, April 14, 2025 (5:00 PM EST)
Job: 25-053
Number of Vacancies: 1
Level: Senior
Duration: 12 Months
Hours of Work: 35 hours per week
Location: 889 Brock Road, Pickering
Work Mode: Hybrid – 4 days remote
- Lead the architecture and design, and oversee the implementation, of modular and scalable ELT/ETL data pipelines and data infrastructure on Azure and Databricks, leveraging the wide range of data sources across the organization
- Design curated common data models that offer an integrated, business-centric single source of truth for business intelligence, reporting, and downstream system use
- Work closely with infrastructure and cyber teams to ensure data is secure in transit and at rest
- Create and enforce code templates that guide the delivery of data pipelines and transformations for structured, semi-structured, and unstructured data sets
- Develop modeling guidelines that ensure model extensibility and reuse by employing industry-standard disciplines for building facts, dimensions, bridge tables, aggregates, slowly changing dimensions, and other dimensional and fact optimizations
- Establish standards for database system fields, including primary and natural key combinations that optimize join performance across a multi-domain, multi-subject-area physical model (structured zone) and semantic model (curated zone)
- Transform and map data into more valuable and understandable semantic-layer sets for consumption, transitioning from system-centric language to business-centric language
- Collaborate with business analysts, data scientists, data engineers, data analysts and solution architects to develop data pipelines to feed our data marketplace
- Introduce new technologies to the environment through research and POCs, and prepare POC code designs that can be implemented and productionized by developers
- Work with tools in the Microsoft stack: Azure Data Factory, Azure Data Lake, Azure SQL Databases, Azure Data Warehouse, Azure Synapse Analytics Services, Azure Databricks, Microsoft Purview, and Power BI
- Work within the agile Scrum work management framework in the delivery of products and services, including contributing to feature & user story backlog item development and utilizing related Kanban/Scrum toolsets
- Document as-built architecture and designs within the product description
- Design data solutions that enable batch, near-real-time, event-driven, and/or streaming approaches depending on business requirements
- Design & advise on orchestration of data pipeline execution to ensure data products meet customer latency expectations, dependencies are managed, and datasets are as up-to-date as possible, with minimal disruption to end-customer use
- Ensure that designs are implemented with proper attention to data security, access management, and data cataloging requirements
- Approve pull requests related to production deployments
- Demonstrate solutions to business customers to ensure customer acceptance and solicit feedback to drive iterative improvements
- Assist in troubleshooting issues for datasets produced by the team (Tier 3 support), on an as-required basis
- Guide data modelers, business analysts, and data scientists in building models optimized for KPI delivery, actionable feedback/writeback to operational systems, and enhanced predictability of machine learning models and experiments
- Develop Bicep or Terraform templates to manage Azure infrastructure as code
- Perform hands-on data engineering work to build data ingestion and data transformation pipelines
EDUCATION
- Requires extensive knowledge of designing a data model to solve a business problem, specifying a data pipeline design pattern to bring data into a data warehouse, optimizing data structures to achieve required performance, designing low-latency and/or event-driven patterns of data processing, and creating a common data model to support current and future business needs
- This knowledge is considered to be normally acquired through the completion of a four-year university education in computer science, computer/software engineering, or other relevant…
To Search, View & Apply for jobs on this site that accept applications from your location or country, tap here to make a Search: