
Sr. Specialist Solutions Architect - AWS Partner Solution Architect

Job in Berkeley, Alameda County, California, 94709, USA
Listing for: Databricks
Full-time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description
Overview

As a Specialist Solutions Architect (SSA) - Data Intelligence Platform, you will guide partners in understanding our platform and articulate integrations with AWS native services to build big data solutions on Databricks that span a wide variety of use cases. This partner-facing role, in which you will work with and support our field Solution Architects and partner teams, requires hands-on production experience with AWS, SQL, and Apache Spark, as well as expertise in other data technologies.

SSAs help partners build the capabilities to design and successfully implement essential workloads while aligning their technical roadmap with expanded usage of the Databricks Intelligence Platform. As a deep go-to expert reporting to Field Engineering leadership, you will continue to strengthen your technical skills through mentorship, learning, and internal training programs, and establish yourself in an area of specialty - whether that be data governance, data science, machine learning, streaming, performance tuning, industry expertise, or another domain.

The Impact You Will Have

• Drive adoption and grow knowledge of Databricks products and accelerators on AWS by energizing the ecosystem of system integration partners, AWS technical field consultants, and the Databricks direct field team

• Provide tutorials and training to improve partner community adoption (including workshops, hackathons, and conference presentations)

• Translate field trends, AWS priorities, and Databricks product strategy into a cohesive story that clearly calls out where we can leverage both platforms to build customer value, and deliver that story to the field

• Provide technical leadership to guide strategic partners to successful implementations on big data projects, ranging from architectural design to data engineering to model deployment

• Demonstrate thought leadership by translating customer adoption patterns from the field and collaborating with product teams to consider integrations with AWS

• Become a technical expert in an area such as the open Lakehouse, big data streaming, or data ingestion and workflows

• Assist Solution Architects with aspects of the technical sale as they work alongside partners, including customizing proof-of-concept content and architectures

• Contribute to the Databricks Community

What We Look For

• 5+ years of experience in a technical role, with expertise in at least one of the following on AWS:

• Software Engineering/Data Engineering: data ingestion; streaming technologies such as Spark Streaming and Kafka; performance tuning; and troubleshooting and debugging Spark or other big data solutions.

• Data Applications Engineering: building use cases that put data to work, such as risk modeling, fraud detection, and partner lifetime value.

• Data Science or Machine Learning Ops: designing and building production infrastructure, model management, and deployment of advanced analytics that drive measurable business value (i.e., getting models running in production).

• Must be able to work collaboratively and independently to achieve outcomes that support go-to-market priorities, and have the interpersonal savvy to influence both partners and internal stakeholders without direct authority

• Deep specialty expertise in at least one of the following areas:

• Expertise in data governance systems and solutions that may span technologies such as Unity Catalog, Alation, Collibra, Purview, etc.

• Experience with high-performance, production data processing systems (batch and streaming) on distributed infrastructure.

• Experience building large-scale real-time stream processing systems; expertise in high-volume, high-velocity data ingestion, change data capture, data replication, and data integration technologies.

• Experience migrating and modernizing Hadoop jobs to public cloud data lake platforms, including data lake modeling and cost optimization.

• Expertise in cloud data formats such as Delta Lake and declarative ETL frameworks such as Delta Live Tables (DLT).

• Expertise in building GenAI solutions, such as retrieval-augmented generation (RAG), fine-tuning, or pre-training for custom model creation.

• Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent practical experience.

• Maintain…
Position Requirements
5+ years of work experience