Big Data Architect IT Consultant Databricks
Listed on 2026-01-09 - IT/Tech
Data Engineer, Data Analyst, Cloud Computing, Database Administrator
Overview
Role: Big Data Architect IT Consultant Master
Client: State of DC
Location: Washington, D.C.
This role will provide expertise to support the development of a Big Data / Data Lake system architecture that supports enterprise data operations for the District of Columbia government, including IoT / Smart City projects, enterprise data warehouse, open data portal, and data science applications. The architecture includes Databricks, Microsoft Azure platform tools (including Data Lake, Synapse), Apache tools (including Hadoop, Hive, Impala, Spark, Sedona, Airflow) and data pipeline/ETL tools (including Streamsets, Apache NiFi, Azure Data Factory).
The platform will be designed for District-wide use and integration with other OCTO Enterprise Data tools such as Esri, Tableau, MicroStrategy, API Gateways, and Oracle databases and integration tools.
Bachelor’s degree in Information Technology or related field or equivalent experience
Skills
- Experience implementing Big Data storage and analytics platforms such as Databricks and Data Lakes
- Knowledge of Big Data and Data Architecture and Implementation best practices — 5 Years
- Knowledge of architecture and implementation of networking, security and storage on cloud platforms such as Microsoft Azure — 5 Years
- Experience with deployment of data tools and storage on cloud platforms such as Microsoft Azure — 5 Years
- Knowledge of Data-centric systems for the analysis and visualization of data, such as Tableau, MicroStrategy, ArcGIS, Kibana, Oracle — 10 Years
- Experience querying structured and unstructured data sources including SQL and NoSQL databases — 5 Years
- Experience modeling and ingesting data into and between various data systems through the use of Data Pipelines — 5 Years
- Experience with implementing Apache data products such as Spark, Sedona, Airflow, Atlas, NiFi, Hive, Impala — 5 Years
- Experience with API / Web Services (REST/SOAP) — 3 Years
- Experience with complex event processing and real-time streaming data — 3 Years
- Experience with deployment and management of data science tools and modules such as Jupyter Hub — 3 Years
- Experience with ETL, data processing, analytics using languages such as Python, Java or R — 3 Years
- Experience with Cloudera Data Platform — 3 Years
- 16+ years planning, coordinating, and monitoring project activities
- 16+ years leading projects, ensuring they are in compliance with established standards/procedures
- Bachelor’s degree in IT or related field or equivalent experience — Required