Qualifications:
Experience: 12-18 years
Location: PAN India
Education: Bachelor's or Master's degree from a Tier 1 college
About the Company
ARA's client is a global leader in technology consulting and digital transformation, helping organizations architect and run applications at enterprise scale. With deep expertise across cloud solutions, business consulting, and engineering, the client helps Fortune-level companies modernize applications, optimize processes, and unlock data-driven value across industries.
About the Role
As a Data Platform Engineer/Lead, you will help shape the data platform blueprint and design across all relevant platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, and engaging in discussions to refine and enhance the overall data architecture strategy. You will be involved at every stage of the data platform lifecycle, ensuring that all components work together to support the organization's data needs and objectives.
Responsibilities
- Expected to be a Subject Matter Expert with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and discussions to gather requirements and feedback from stakeholders.
- Continuously evaluate and improve data architecture practices to enhance efficiency and effectiveness.
Required Skills
- Design and build complex pipelines using Delta Lake, Auto Loader, and Delta Live Tables (DLT), and deploy them using Databricks Asset Bundles; see the first sketch after this list.
- Proven experience as a Data Architect and Data Engineer leading enterprise-scale Lakehouse initiatives.
- Expert-level understanding of modern Data & Analytics Architecture patterns including Data Mesh, Data Products, and Lakehouse Architecture.
- Excellent programming and debugging skills in Python.
- Strong experience with PySpark for building scalable and modular ETL/ELT pipelines.
- Architect data ingestion and transformation using DLT Expectations, modular Databricks Functions, and reusable pipeline components.
- Must have hands-on expertise in at least one major cloud platform: AWS, GCP, or Azure.
- Lead implementation of Unity Catalog: create catalogs, schemas, role-based access policies, lineage visibility, and data classification tagging (PII, PHI, etc.); see the second sketch after this list.
- Guide organization-wide governance via Unity Catalog setup: workspace linkage, SSO, audit logging, external locations, and Volume access.
- Enable cross-platform data access using Lakehouse Federation, querying live from externally hosted databases.
- Leverage and integrate Databricks Marketplace to consume high-quality third-party data and publish internal data assets securely.
- Experience with cloud-based services relevant to data engineering, data storage, data processing, data warehousing, real-time streaming, and serverless computing.
- Govern and manage Delta Sharing for securely sharing datasets with external partners or across tenants.
- Design and maintain PII anonymization, tokenization, and masking strategies using Databricks functions and Unity Catalog policies to meet GDPR/HIPAA compliance; see the third sketch after this list.
- Architect Power BI, Tableau, and Looker integration with Databricks for live reporting and visualization over governed datasets.
- Build Databricks SQL Dashboards to enable stakeholders with real-time insights, KPI tracking, and alerts.
- Hands-on experience applying performance optimization techniques.
- Lead cross-functional initiatives across data science, analytics, and platform teams to deliver secure, scalable, and value-aligned data products.
- Provide thought leadership on adopting advanced features like Mosaic AI, Vector Search, Model Serving, and Databricks Marketplace publishing.
- Working knowledge of dbt (Data Build Tool) is a plus.
- Strong background in data modelling and data warehousing concepts is required.
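For context on the pipeline-building skills above, here is a minimal sketch of a Delta Live Tables pipeline that ingests files with Auto Loader and enforces a data-quality Expectation. The table names, source path, and quality rule are hypothetical, and `spark` is the session a DLT pipeline provides implicitly:

```python
# Ingest raw files incrementally with Auto Loader, then enforce a
# data-quality Expectation on the cleaned table (bad rows are dropped).
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders landed incrementally via Auto Loader.")
def orders_raw():
    return (
        spark.readStream.format("cloudFiles")      # Auto Loader source
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/landing/orders/")     # hypothetical path
    )

@dlt.table(comment="Validated orders ready for downstream consumers.")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # DLT Expectation
def orders_clean():
    return (
        dlt.read_stream("orders_raw")
        .withColumn("ingested_at", F.current_timestamp())
    )
```

In practice a pipeline like this would be versioned and deployed through a Databricks Asset Bundle rather than created by hand.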
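The second sketch illustrates the Unity Catalog governance work, issued through `spark.sql` from a workspace attached to a Unity Catalog metastore; every catalog, schema, table, column, and group name here is hypothetical:

```python
# Unity Catalog governance: create objects, grant role-based access,
# and tag a column as PII for classification and lineage purposes.
statements = [
    "CREATE CATALOG IF NOT EXISTS finance",
    "CREATE SCHEMA IF NOT EXISTS finance.payments",
    # Role-based access for an analyst group
    "GRANT USE CATALOG ON CATALOG finance TO `analysts`",
    "GRANT USE SCHEMA ON SCHEMA finance.payments TO `analysts`",
    "GRANT SELECT ON SCHEMA finance.payments TO `analysts`",
    # Data classification tagging on a sensitive column
    "ALTER TABLE finance.payments.customers "
    "ALTER COLUMN email SET TAGS ('classification' = 'PII')",
]
for stmt in statements:
    spark.sql(stmt)
```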
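The third sketch shows one way the PII masking requirement can be met with a Unity Catalog column mask, again with hypothetical names; the mask reveals the value only to members of a privileged group:

```python
# Define a masking function and attach it as a Unity Catalog column
# mask, so non-privileged readers see a redacted value.
spark.sql("""
    CREATE OR REPLACE FUNCTION finance.payments.mask_email(email STRING)
    RETURN CASE
        WHEN is_account_group_member('pii_readers') THEN email
        ELSE '***REDACTED***'
    END
""")
spark.sql("""
    ALTER TABLE finance.payments.customers
    ALTER COLUMN email SET MASK finance.payments.mask_email
""")
```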