Job Title:
Architect - Databricks (Cloud Data)
Primary skills: Databricks, PySpark, and solution architecture.
Secondary skills: Unity Catalog, Starburst, Azure DevOps, CI/CD, and orchestration layers.
Mode of work: Work from office
Experience:
13+ Years of Experience
Responsibilities:
· Lead and architect the migration from native Spark to Databricks.
· Build data governance solutions using tools like Unity Catalog and Starburst.
· Develop orchestration workflows using Databricks and ADF.
· Implement CI/CD pipelines for Databricks in Azure DevOps.
· Process near-real-time data through Auto Loader and DLT pipelines.
· Design secure, scalable infrastructure for Delta Lake and Spark SQL.
· Optimize Databricks environments for cost-effectiveness.
· Extract complex business logic from on-prem solutions like SSIS, Informatica, Vertica, etc., into PySpark.
· Build analytical dashboards on top of the Databricks Lakehouse.
Requirements - Must Have:
· 13+ years of overall experience, with at least 4+ years as a solution architect.
· Experience with Databricks, PySpark, and modern data platforms.
· Expertise in cloud-native data platform architecture and solution design.
· Familiarity with Agile/Scrum methodologies.
· Experience in pre-sales and in the P&C insurance domain is an advantage.