Databricks Architect
Chicago, Cook County, Illinois, 60290, USA
Listed on 2026-01-01
Listing for: AAAI Press
Full Time
Job specializations:
- IT/Tech: Data Engineer, Data Science Manager
Job Description & How to Apply Below
Databricks Architect
Job Description Summary
Cushman & Wakefield is seeking a highly skilled Architect – Data Engineering to play a pivotal role in designing scalable, secure, and high‑performing data solutions across the enterprise. Reporting to the Director of Data Engineering & Analytics, this role will focus on architecting the next generation of data platforms, driving the adoption of modern technologies, and ensuring alignment with enterprise goals and governance standards.
As part of Cushman & Wakefield’s broader mission to deliver exceptional value through real estate services, this role serves as a hands‑on architect who blends technical depth with strategic insight to shape data engineering, analytics, and AI/ML capabilities, unlocking the power of data to inform decisions, optimize performance, and create competitive advantage for clients and internal stakeholders.
Key Responsibilities
• Design and lead end‑to‑end architecture for modern, cloud‑native data engineering, AI/ML and analytics platforms across the full data lifecycle, including ingestion, storage, transformation, and consumption.
• Architect high‑performance data solutions using Azure Databricks, Power BI, Tableau, Python, and other relevant technologies.
• Collaborate with technology leadership and engineering teams to align solutions with enterprise strategy, business goals, and innovation roadmaps.
• Define and enforce standards for data quality, metadata, lineage, and governance in partnership with data governance and MDM teams using tools like Profisee.
• Provide architectural guidance for AI/ML integrations, including data preparation, feature engineering, and model deployment support.
• Conduct design reviews, architectural assessments, and performance tuning to ensure system reliability, scalability, and maintainability.
• Develop and maintain reusable patterns, frameworks, and coding standards in Python, PySpark, and SQL.
• Collaborate with product managers, engineering leads, analysts, and data scientists to deliver high‑impact, cross‑functional solutions.
• Drive the evaluation and adoption of emerging technologies in cloud data platforms, streaming analytics, and intelligent automation.
• Mentor data engineers and oversee best practices in solution design, code quality, documentation, and continuous improvement.
• Support DevOps and DataOps initiatives by integrating CI/CD pipelines, Git workflows, and automated testing into engineering workflows.
Qualifications
• Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related field.
• 10+ years of progressive experience in data engineering and data modeling, including at least 5 years in an architecture‑focused or lead technical role.
• Proven experience architecting enterprise‑grade cloud data platforms using Azure.
• Experience building a Databricks Delta Lake‑based Lakehouse from the ground up using DLT, PySpark jobs, Databricks Workflows, Unity Catalog, and the Medallion architecture.
• Strong understanding of data security principles and best practices.
• Experience with Databricks monitoring and performance tuning tools.
• Hands‑on experience integrating with data governance and master data management platforms such as Profisee.
• Solid understanding of DevOps and Infrastructure‑as‑Code practices, including CI/CD pipelines, Docker/Kubernetes, and automated deployment frameworks.
Preferred Qualifications
• Familiarity with modern data architecture frameworks, including data mesh, data fabric, and data lakehouse.
• Industry experience in commercial or retail real estate, capital projects, or other highly data‑driven domains.
• Experience with Database Lifecycle Management (DLM) tools and a strong understanding of CI/CD pipelines, branching strategies, and collaborative DevOps practices.
• Proficiency in Python, Scala, or similar programming languages commonly used in large‑scale data engineering.
• Understanding of ML/AI model lifecycle architecture, including data preparation, model training, and production deployment best practices.
• Certification: Databricks Certified Data Engineer Associate
Cushman & Wakefield also provides…