
Cloud Databricks Architect

Job in Bengaluru, 560001, Bangalore, Karnataka, India
Listing for: ARA Resources Pvt. Ltd.
Full Time position
Listed on 2026-02-14
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Job Description
Location: Bengaluru

About ARA's Client

ARA's Client is a global leader in end-to-end data and analytics solutions, with nearly two decades of experience helping enterprises unlock value from data-driven capabilities. Operating at the scale of a global consulting firm with the agility of a niche specialist, ARA's Client partners with customers worldwide on large-scale data modernization and analytics transformation initiatives.

With a strong presence in India and a growing global footprint, ARA's Client is known for its high-performance culture, reusable analytics accelerators, and focus on continuous learning and innovation.

Role Summary

Design and optimize cloud-native data architectures on platforms like Databricks and Snowflake, enabling scalable data engineering, advanced analytics, and AI/ML solutions aligned with business needs.

Key Responsibilities

- Design and implement Lakehouse architectures using Databricks, Delta Lake, and Apache Spark.
- Lead the development of data pipelines, ETL/ELT processes, and data integration strategies.
- Collaborate with business and technical teams to define data architecture standards, governance, and security models.
- Optimize performance and cost-efficiency of Databricks clusters and jobs.
- Provide technical leadership and mentorship to data engineers and developers.
- Integrate Databricks with cloud platforms (Azure, AWS, or GCP) and enterprise systems.
- Evaluate and recommend tools and technologies to enhance the data ecosystem.
- Ensure compliance with data privacy and regulatory requirements.
- Contribute to proposal and pre-sales activities.

Must-Have Qualifications

- Expertise in data engineering, data architecture, or analytics.
- Hands-on experience with Databricks and Apache Spark.
- Hands-on experience with Snowflake.
- Strong proficiency in Python, SQL, and PySpark.
- Deep understanding of Delta Lake, Lakehouse architecture, and data mesh principles.
- Deep understanding of Data Governance and Unity Catalog.
- Experience with cloud platforms (Azure preferred; AWS or GCP acceptable).

Good-to-Have Skills

- Good understanding of CI/CD pipelines
- Working experience with GitHub
- Experience delivering data engineering solutions that balance architecture requirements, implementation effort, and customer-specific needs across other tools

Qualifications:

- Bachelor's degree in Computer Science, Engineering, or a related field
- Demonstrated continued learning through one or more technical certifications or similar means
- 10+ years of relevant experience with ETL tools
- Relevant experience in the Retail domain