Snowflake Data Architect
Location: Markham, ON
Long Term Contract
Working model: Hybrid. Work From Office days: Tue and Wed mandatory, plus one flexible day (any of Mon, Thu, or Fri)
Rate: DOE
Primary Skills
Architect and implement advanced data solutions using Snowflake on AWS, ensuring scalable, secure, and high-performance data environments
Extensive experience in data architecture and engineering, with a proven track record in large-scale data transformation programs, ideally in insurance or financial services
Proven experience in architecting and implementing advanced data solutions using Snowflake on AWS
Expertise in designing and orchestrating data acquisition pipelines using AWS Glue for ETL/ELT, Snowflake Openflow, and Apache Airflow for workflow automation, enabling seamless ingestion of data from diverse sources
Deep expertise in Snowflake, with hands-on experience delivering Snowflake as an enterprise capability
Hands-on experience with AWS Glue for ETL/ELT, Apache Airflow for orchestration, and dbt for transformation, preferably deployed on AWS ECS
Proficiency in SQL, data modelling, and ETL/ELT processes
Proven experience with dbt to manage and automate complex data transformations within Snowflake, ensuring modular, testable, and version-controlled transformation logic
Experience implementing lakehouse solutions (Medallion architecture) for financial or insurance carriers
Experience optimizing and tuning Snowflake environments for performance, cost, and scalability, including query optimization and resource management
Experience architecting and leading migration of workloads from Cloudera to Snowflake
Experience evaluating data technology platforms, including data governance suites and data security products
Develop robust data models and data pipelines to support data transformation, integrating multiple data sources and ensuring data quality and integrity
Document architecture, data flows, and transformation logic to ensure transparency, maintainability, and knowledge sharing across teams
Strong knowledge of data lifecycle management, data retention, and data modelling, with working knowledge of cloud computing and modern development practices
Secondary Skills
Familiarity with data mesh principles, data product delivery, and modern data warehousing paradigms
Experience designing Streamlit apps and defining new capabilities and data products leveraging Snowflake ML and MLOps capabilities
SnowPro Advanced certification preferred
Knowledge of scripting languages (Python, Java)
Experience with data governance, metadata management, and data quality frameworks (e.g., Collibra, Informatica)
Experience in the insurance domain
Experience converting policy data from legacy to modern platforms
Exposure to enterprise data warehouse solutions such as Cloudera and AWS Redshift, and Informatica toolsets (IDMC, PowerCenter, BDM)