Azure Databricks Data Architect | Owl Sure, A Business Unit of Value Momentum
Seniority Level: Architect
Industry: IT Services and IT Consulting
Job Function: Data Architecture | Cloud Engineering | Information Technology
About Owl Sure
Owl Sure enables Financial Services, Healthcare, and Life Insurance organizations to achieve value for their budgets, accelerate speed-to-market, and scale operations. Through thoughtfully designed solutions and highly automated managed services, we deliver reliable outcomes backed by deep technical expertise and a continuously optimized delivery platform.
Job Title:
Azure Databricks Data Architect
Role Summary
The Azure Databricks Data Architect is responsible for designing and implementing modern, cloud-native data platforms that support enterprise-scale analytics, BI, and data-driven decision-making.
This role leads end-to-end architecture for data modernization initiatives—from ingestion to consumption—leveraging Azure services and Databricks to build scalable, secure, and cost-effective solutions. The architect will also drive DataOps and MLOps frameworks, CI/CD automation, and internal capability development.
Key Responsibilities
Data Architecture & Modernization
Design modern, scalable data architectures for cloud-native platforms.
Lead architecture and implementation of end-to-end data solutions on Microsoft Azure.
Define ingestion-to-consumption frameworks supporting:
Modern Data Warehouses
BI & Reporting
Advanced Analytics & Insights
Conduct data strategy sessions focused on scalability, flexibility, and performance optimization.
Design data models aligned with enterprise data governance standards.
Platform Engineering & Implementation
Build cost-efficient infrastructure using:
Azure Databricks
Azure Data Factory
Develop data pipelines using Databricks, PySpark, and modern ELT/ETL patterns.
Implement CI/CD pipelines for Databricks using Azure DevOps.
Lead migrations from on-premises platforms (Spark, Hadoop) to Databricks-based architectures.
Data Engineering & Processing
Implement advanced data processing frameworks (Spark, PySpark).
Design and optimize cloud-native databases and columnar storage architectures.
Work with data processing technologies such as ETL tools, Kafka, and ELT frameworks.
Enable Data Mesh and Data Product architecture where applicable.
DataOps & MLOps
Build and operationalize DataOps and MLOps frameworks.
Establish automation standards for testing, deployment, and monitoring.
Ensure governance, security, and compliance within data platforms.
Stakeholder & Team Collaboration
Collaborate with Data Engineering, Data Management, BI, and Analytics teams.
Partner with customers on enterprise data modernization initiatives.
Create training plans and learning materials to upskill internal associates.
Develop reusable, domain-specific accelerators and industry-focused data solutions.
Required Skills & Experience
Experience
10+ years of overall IT experience.
Minimum of 4 years implementing cloud-native end-to-end data solutions.
Proven experience delivering Modern Data Warehouse and analytics platforms.
Experience working in complex enterprise IT environments.
Technical Skills
Strong expertise in Azure-based modern data architecture.
Hands-on experience with:
Azure Databricks
PySpark
Azure Data Factory
Experience migrating on-prem platforms (Spark, Hadoop) to cloud-native architectures.
Strong understanding of data warehouse concepts and columnar database design.
CI/CD implementation using Azure DevOps.
Experience with ETL tools, Kafka, and ELT frameworks.
Proficiency in data governance and cloud-native architecture patterns.
Methodology & Collaboration
Understanding of Agile/Scrum methodologies.
Ability to collaborate across cross-functional teams (Data Engineering, BI, Analytics).
Strong stakeholder communication and architecture documentation skills.
Preferred Qualifications
Experience designing and implementing Data Mesh or Data Product architectures.
Exposure to MLOps frameworks and advanced analytics platforms.
Domain experience in enterprise data modernization programs.
Why Join Owl Sure?
Lead enterprise-scale cloud data transformation programs.
Work with global clients across Insurance, Healthcare, and Financial Services.
Shape next-generation data platforms leveraging Azure and Databricks.
Contribute to a growing data and analytics practice with strong innovation focus.