Data Architect
Job in San Rafael, Marin County, California, 94911, USA
Listing for: Incedo Inc.
Full Time position
Listed on 2025-12-21
Job specializations:
- IT/Tech
- Data Engineer
Job Description
Senior Talent Acquisition Specialist at Incedo - Hiring for Life Sciences
Role Overview
We are seeking a Databricks Data Architect to support the design, implementation, and optimization of cloud-native data platforms built on the Databricks Lakehouse architecture. This is a hands-on, engineering-driven role requiring deep experience with Apache Spark, Delta Lake, and scalable data pipeline development, combined with early-stage architectural responsibilities.
The role involves close onsite collaboration with client stakeholders, translating analytical and operational requirements into robust, high-performance data architectures, while adhering to best practices for data modeling, governance, reliability, and cost efficiency.
Key Responsibilities
- Design, develop, and maintain batch and near-real-time data pipelines using Databricks, PySpark, and Spark SQL
- Implement Medallion (Bronze/Silver/Gold) Lakehouse architectures, ensuring proper data quality, lineage, and transformation logic across layers (a PySpark sketch follows this list)
- Build and manage Delta Lake tables, including schema evolution, ACID transactions, time travel, and optimized data layouts (see the Delta Lake sketch after this list)
- Apply performance optimization techniques such as partitioning strategies, Z-Ordering, caching, broadcast joins, and Spark execution tuning (illustrated after this list)
- Support dimensional and analytical data modeling for downstream consumption by BI tools and analytics applications
- Assist in defining data ingestion patterns (batch, incremental loads, CDC, and streaming where applicable)
- Troubleshoot and resolve pipeline failures, data quality issues, and Spark job performance bottlenecks
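To make the Medallion responsibility above concrete, here is a minimal PySpark sketch of a bronze-to-silver promotion. The catalog, table, and column names are hypothetical placeholders, not part of this posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Bronze layer: raw records kept as ingested, so they can be replayed.
bronze = spark.read.table("lakehouse.bronze.orders_raw")  # hypothetical table

# Silver layer: deduplicated, typed, validated records.
silver = (
    bronze
    .dropDuplicates(["order_id"])                         # dedupe on the business key
    .withColumn("order_ts", F.to_timestamp("order_ts"))   # enforce a proper timestamp type
    .filter(F.col("order_id").isNotNull())                # basic data-quality gate
)

silver.write.format("delta").mode("overwrite").saveAsTable("lakehouse.silver.orders")
```

Keeping bronze untouched and applying all cleansing in the silver step preserves lineage: any silver table can be rebuilt from the raw layer.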
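The Delta Lake features named above (schema evolution, time travel) might look like the following sketch; the table path and source directory are assumptions for illustration.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-features").getOrCreate()

path = "/mnt/lake/silver/orders"  # hypothetical Delta table location

# Schema evolution: append a batch that carries an extra column;
# mergeSchema tells Delta to add the new column instead of failing.
new_batch = spark.read.parquet("/mnt/landing/orders_with_discounts")  # placeholder source
(new_batch.write
    .format("delta")
    .mode("append")
    .option("mergeSchema", "true")
    .save(path))

# Time travel: re-read an earlier version of the table for audits or debugging.
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)
print(v0.count())
```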
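And a rough illustration of the tuning techniques in the list, assuming a Databricks environment where the OPTIMIZE ... ZORDER BY SQL command is available; table and key names are again placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("perf-tuning").getOrCreate()

# Z-Ordering: co-locate rows that share a filter key so selective
# reads skip more files (Databricks/Delta Lake OPTIMIZE command).
spark.sql("OPTIMIZE lakehouse.silver.orders ZORDER BY (customer_id)")

# Broadcast join: ship the small dimension table to every executor
# instead of shuffling the large fact table.
orders = spark.read.table("lakehouse.silver.orders")
customers = spark.read.table("lakehouse.silver.customers")  # small dimension
enriched = orders.join(broadcast(customers), "customer_id")

# Caching: keep a hot intermediate result in memory across actions.
enriched.cache()
enriched.count()
```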
Preferred Qualifications
- Exposure to Databricks Unity Catalog, data governance, and access control models
- Experience with Databricks Workflows, Apache Airflow, or Azure Data Factory for orchestration
- Familiarity with streaming frameworks (Spark Structured Streaming, Kafka) and/or CDC patterns (a minimal streaming example follows this list)
- Understanding of data quality frameworks, validation checks, and observability concepts
- Experience integrating Databricks with BI tools such as Power BI, Tableau, or Looker
- Awareness of cost optimization strategies in cloud-based data platforms
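For the streaming familiarity mentioned above, a bare-bones Spark Structured Streaming job that lands Kafka events into a bronze Delta table could look like this; the broker address, topic, checkpoint path, and table name are all placeholder assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-to-bronze").getOrCreate()

# Read a Kafka topic as an unbounded stream (placeholder broker and topic).
events = (spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load())

# Kafka delivers key/value as binary; decode the payload before use.
decoded = events.select(F.col("value").cast("string").alias("json_payload"))

# Land the raw stream in a bronze Delta table; the checkpoint location
# gives the job exactly-once bookkeeping across restarts.
query = (decoded.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/chk/orders_bronze")
    .outputMode("append")
    .toTable("lakehouse.bronze.orders_stream"))

query.awaitTermination()
```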