
Databricks Solutions Architect

Job in Chicago, Cook County, Illinois, 60290, USA
Listing for: Medium
Full Time position
Listed on 2025-12-01
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Data Science Manager, AI Engineer
Job Description & How to Apply Below

A BIT ABOUT WAVICLE

Wavicle Data Solutions is a founder‑led, high‑growth consulting firm helping organizations unlock the full potential of cloud, data, and AI. We’re known for delivering real business results through intelligent transformation—modernizing data platforms, enabling AI‑driven decision‑making, and accelerating time‑to‑value across industries.

At the heart of our approach is WIT—the Wavicle Intelligence Framework. WIT brings together our proprietary accelerators, delivery models, and partner expertise into one powerful engine for transformation. It’s how we help clients move faster, reduce costs, and create lasting impact—and it’s where your ideas, skills, and contributions can make a real difference.

Our work is deeply rooted in strong partnerships with AWS, Databricks, Google Cloud, and Azure, enabling us to deliver cutting‑edge solutions built on the best technologies the industry has to offer.

With over 500 team members across 42 cities in the U.S., Canada, and India, Wavicle offers a flexible, digitally connected work environment built on collaboration and growth.

We Invest in Our People Through
  • Competitive compensation and bonuses
  • Unlimited paid time off
  • Health, retirement, and life insurance plans
  • Long‑term incentive programs
  • Meaningful work that blends innovation and purpose

If you’re passionate about solving complex problems, exploring what’s next in AI, and being part of a team that values delivery excellence and career development—you’ll feel right at home here.

The Opportunity

Wavicle Data Solutions is hiring a Databricks Solutions Architect who will lead the design and implementation of scalable, optimized solutions that leverage the latest Databricks features.

This individual will work closely with customers, understanding their needs and business drivers, and helping them adopt and optimize Databricks for their analytics, data science, and AI/ML workloads. They will provide thought and technical leadership, ensure best practices, and align customer strategies with Databricks’ offerings.

They will also be part of a team helping the company identify and build points of view for the market and determine our Databricks Go‑to‑Market strategy.

What You Will Get To Do
  • Solution Design:
    Develop data architecture solutions and reference architectures for customers using Databricks.
  • Customer Engagement:
    Work closely with clients to understand their business goals and technical needs, ensuring optimal use of the Databricks platform.
  • Pre‑Sales Support:
    Provide technical expertise during the pre‑sales process, conducting workshops and proof‑of‑concept (POC) projects.
  • Technical Leadership:
    Lead complex projects involving Databricks integration with cloud environments such as AWS, Azure, or GCP.
  • Performance Optimization:
    Help customers optimize performance for large‑scale data processing, streaming, and machine learning workflows on Databricks.
  • Training and Mentorship:
    Guide customers and internal teams on best practices in data engineering, machine learning, and AI using Databricks.
  • Collaboration:
    Work cross‑functionally with engineering, product, and sales teams to deliver successful implementations.
  • Market Strategy:
    Help the company identify market needs, define a Go‑to‑Market strategy, and execute on that strategy.

What You Bring To The Team

Required Qualifications
  • Experience:
    10+ years of experience in data engineering, data architecture, or related fields. Hands‑on experience with Databricks is a must.
  • Cloud Experience:
    Proficiency in cloud platforms (AWS, Azure, GCP), with strong knowledge of cloud‑native technologies and services.
  • Programming Skills:
    Expertise in Python, Scala, and SQL for large‑scale data processing.
  • Data Engineering:
    Experience in designing and implementing data pipelines, ETL processes, DevOps, SecOps, and data lakes.
  • Machine Learning:
    Familiarity with ML workflows and libraries such as TensorFlow, PyTorch, and scikit‑learn.
  • Big Data Tools:
    Strong experience with big data tools like Apache Spark, Hadoop, and Delta Lake.
  • Communication:
    Excellent communication and interpersonal skills to effectively engage with both technical and non‑technical stakeholders.
Pr…