Solutions Architect - Databricks, AWS, Snowflake
Listed on 2026-02-07
IT/Tech
AI Engineer, Data Engineer
Solution Architect – Data & AI Platforms (Databricks, AWS, Snowflake)
- Primary location is our Atlanta, GA office, with an expectation of 4 days per week in the office and 1 day remote.
- This is a client‑facing solutioning role with approximately 50% travel for technical workshops, stakeholder discussions, and on‑site collaboration with customers.
This role focuses on designing secure, scalable, and high‑performance data and AI platforms across AWS, Snowflake, and Databricks that power advanced analytics, machine learning, and agentic AI solutions end‑to‑end. You will work closely with clients, engineering teams, and business stakeholders to define modern data and AI architectures, shape roadmaps, and guide implementation from early discovery through production.
As a Solution Architect in our Data & AI Solutions Team, you will be a key technical leader driving adoption of Databricks and cloud‑native AI capabilities. You will lead architectural assessments, propose target‑state designs for data, ML, and LLM/RAG workloads, and support deal cycles with clear, outcome‑oriented solution options, including value, cost, and risk trade‑offs. You will also contribute to the development and continuous improvement of our Data & AI platform offerings and go‑to‑market accelerators.
Your expertise in Spark, Databricks, and the AWS data stack will enable you to mentor others, standardize AI‑ready patterns, and help clients unlock business value from their data and models.
- Lead technology assessments and data/AI platform architecture designs across AWS, Snowflake, and Databricks, including batch, streaming, Lakehouse, feature stores, and ML/LLM serving patterns.
- Design, document, and validate end‑to‑end data and ML/LLM solutions covering ingestion, modeling, ETL/ELT, feature engineering, and performance optimization.
- Architect data and vector foundations and orchestration patterns that enable LLM, RAG, and agentic AI use cases on Databricks and cloud‑native services.
- Partner with business stakeholders to translate analytic, ML, and AI use cases into technical solutions, implementation roadmaps, and non‑functional requirements such as scalability, resilience, availability, and observability.
- Define and champion data and AI governance, security, and compliance practices (including model/data lineage, access control, and responsible‑AI guardrails) in collaboration with security, networking, and platform teams.
- Provide technical advisory services in deal cycles for data, ML, and GenAI initiatives, including discovery workshops, solution optioning, estimates, and proposal content tied to business value and ROI.
- Guide delivery teams during implementation of data, ML, and LLM/RAG solutions through backlog refinement, design reviews, and risk/issue resolution, ensuring adherence to reference architectures and standards.
- Mentor data engineers, ML engineers, and junior architects, promoting architectural excellence and best practices in modern data and AI engineering.
- Develop and maintain reference architectures, standards, and reusable accelerators for data platforms, MLOps, and GenAI workloads.
- Contribute to the design and evolution of our Data & AI platform offerings, including reusable blueprints, accelerators, and best‑practice implementation guides for analytics, ML, and LLM solutions.
- 8+ years in data architecture or engineering roles, with hands‑on implementation experience on at least one major cloud platform (preferably AWS).
- Deep, practical experience with Spark and the Databricks Lakehouse platform (Delta, cluster configuration, performance tuning).
- Strong proficiency with AWS data services such as S3, Glue, Redshift, Lambda, and related security/networking concepts.
- Practical expertise designing and operating Snowflake environments, including roles, warehouses, and cost/performance optimization.
- Strong SQL and solid Python skills for data transformation, orchestration, and solution prototyping.
- Proven track record delivering large‑scale, high‑volume data platforms that support analytics, ML, or AI workloads in production.
- Familiarity with modern data engineering tooling (for example dbt, Airflow or similar orchestrators, CI/CD for data solutions).
- Nice to have: working literacy in Power BI or Tableau to ensure data models are BI‑friendly and support downstream reporting needs.
- Excellent client‑facing communication skills and the ability to lead technical discussions with architects, engineers, data scientists, and business leaders.
- Alignment with consulting values: people‑first mindset, high character and hard work, and a bias toward ownership and clarity.