
Snowflake Data Architect

Job in Markham, Ontario, Canada
Listing for: E-Solutions
Full Time position
Listed on 2026-01-02
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description & How to Apply Below

Snowflake Data Architect
Location: Markham, ON
Long Term Contract

Working model: Hybrid. Work-from-office days: Tue and Wed mandatory, plus a flexible day (any of Mon, Thu, or Fri)

Rate: DOE

Primary Skills

Architect and implement advanced data solutions using Snowflake on AWS, ensuring scalable, secure, and high-performance data environments

Extensive experience in data architecture and engineering, with a proven track record in large-scale data transformation programs, ideally in insurance or financial services

Proven experience in architecting and implementing advanced data solutions using Snowflake on AWS

Expertise in designing and orchestrating data acquisition pipelines using AWS Glue for ETL/ELT, Snowflake Openflow, and Apache Airflow for workflow automation, enabling seamless ingestion of data from diverse sources

Deep expertise in Snowflake, with hands-on experience delivering Snowflake as an enterprise capability

Hands-on experience with AWS Glue for ETL/ELT, Apache Airflow for orchestration, and dbt for transformation, preferably deployed on AWS ECS

Proficiency in SQL, data modelling, and ETL/ELT processes

Proven experience with dbt to manage and automate complex data transformations within Snowflake, ensuring modular, testable, and version-controlled transformation logic

Experience implementing lakehouse solutions (Medallion architecture) for financial or insurance carriers

Experience optimizing and tuning Snowflake environments for performance, cost, and scalability, including query optimization and resource management

Experience architecting and leading migration of workloads from Cloudera to Snowflake

Experience evaluating data technology platforms, including data governance suites and data security products

Develop robust data models and data pipelines to support data transformation, integrating multiple data sources and ensuring data quality and integrity

Document architecture, data flows, and transformation logic to ensure transparency, maintainability, and knowledge sharing across teams

Strong knowledge of data lifecycle management, data retention, and data modelling, plus working knowledge of cloud computing and modern development practices

Secondary Skills

Familiarity with data mesh principles, data product delivery, and modern data warehousing paradigms

Experience designing Streamlit apps and defining new capabilities and data products leveraging Snowflake ML and MLOps capabilities

SnowPro Advanced certification preferred

Knowledge of scripting/programming languages (Python, Java)

Experience with data governance, metadata management, and data quality frameworks (e.g., Collibra, Informatica)

Experience in the insurance domain

Experience converting policy data from legacy to modern platforms

Exposure to enterprise data warehouse solutions such as Cloudera and AWS Redshift, and to Informatica toolsets (IDMC, PowerCenter, BDM)
