Data Engineer – AWS Databricks
Job in Santa Clara, Santa Clara County, California, 95053, USA
Listed on 2026-02-21
Listing for: novasoft
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Big Data, Data Science Manager
Job Description & How to Apply Below
Location: Santa Clara, CA
Experience: 10+ Years
Employment Type: Full-Time / Contract
We are seeking a highly experienced AWS Databricks Data Engineer to join our data engineering team in Santa Clara. The ideal candidate will have deep expertise in Databricks, AWS, PySpark, SQL, and large-scale data pipeline development. This role focuses on designing and optimizing modern cloud-based data platforms that support analytics, BI, and enterprise reporting use cases.
You will collaborate with cross-functional teams and business stakeholders to deliver scalable, secure, and high-performance data solutions built on a Lakehouse architecture.
Key Responsibilities:
- Design and maintain scalable ETL/ELT pipelines using Databricks on AWS
- Develop high-performance data transformations using PySpark and SQL
- Implement and optimize Lakehouse (Medallion) architecture for batch data processing
- Integrate data from S3, databases, and AWS-native services
- Optimize Spark workloads for performance, cost, and scalability
- Implement data governance and access controls using Unity Catalog
- Deploy and manage jobs using Databricks Workflows and CI/CD pipelines
- Collaborate with business and analytics teams to deliver reliable, production-ready datasets
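To make the pipeline responsibilities above concrete, here is a minimal sketch of the kind of silver-layer upsert a Medallion (bronze → silver) job might run against Delta Lake. The table and column names are hypothetical, chosen only for illustration; in practice the generated statement would be executed via `spark.sql(...)` inside a Databricks job.

```python
# Hypothetical sketch of building an idempotent Delta Lake MERGE (upsert)
# for a silver-layer table fed from bronze. All table/column names are
# illustrative assumptions, not taken from the listing.

def build_silver_merge(target: str, source: str,
                       key_cols: list[str], update_cols: list[str]) -> str:
    """Return a MERGE INTO statement that upserts source rows into target."""
    on_clause = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    all_cols = key_cols + update_cols
    insert_cols = ", ".join(all_cols)
    insert_vals = ", ".join(f"s.{c}" for c in all_cols)
    return (
        f"MERGE INTO {target} AS t\n"
        f"USING {source} AS s\n"
        f"ON {on_clause}\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

# Example: upsert cleansed orders from a bronze staging table.
sql = build_silver_merge(
    target="silver.orders",
    source="bronze.orders_staged",
    key_cols=["order_id"],
    update_cols=["status", "updated_at"],
)
print(sql)
```

Because the MERGE keys on the business identifier, re-running the job after a failure does not duplicate rows, which is the usual reason batch Medallion jobs are written as upserts rather than appends.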
Skills:
- Strong expertise in Databricks:
- Delta Lake
- Unity Catalog
- Lakehouse Architecture
- Workflows
- Delta Live Tables (DLT) pipelines
- Table Triggers
- Databricks Runtime
- Advanced proficiency in PySpark and SQL
- Experience designing and rebuilding batch-heavy data pipelines
- Strong knowledge of Medallion Architecture
- Expertise in performance tuning and Spark optimization
- Experience with Databricks Workflows & orchestration
- Familiarity with Genie enablement concepts (working understanding required)
- Experience with CI/CD and Git-based development
- Strong AWS fundamentals:
- IAM
- Networking basics
- S3
- Glue Catalog
- Experience with Spark Structured Streaming
- Knowledge of real-time or near real-time data solutions
- Advanced Databricks Runtime configurations
- Experience with GitLab CI/CD pipelines
- Exposure to scalable enterprise data architectures
- Databricks Certified Data Engineer (Associate/Professional)
- AWS Data Engineer or AWS Solutions Architect Certification