Databricks Subject Matter Expert (SME), U.S. Citizenship Required

Job in Ashburn, Turner County, Georgia, 31714, USA
Listing for: Ignite IT
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description
Position: Databricks Subject Matter Expert (SME), U.S. Citizenship Required
Location: Ashburn

Description

Ready to Ignite your career and work alongside some of the brightest and most innovative professionals in cutting-edge data and cloud technologies? Join us and unleash your potential in an Agile, mission-driven environment supporting the men and women dedicated to safeguarding the American people and enhancing the Nation’s safety, security, and prosperity.

As a Databricks Subject Matter Expert, you will lead modernization efforts across enterprise data platforms, migrating legacy environments into scalable cloud-native architectures. You will design, build, and optimize advanced data pipelines and analytics solutions using Databricks, PySpark, and modern cloud tools—driving real-time insights and powering mission‑critical applications. This role blends strategic architecture, hands‑on engineering, and collaboration across cross‑functional teams to deliver high‑impact outcomes for national security missions.

Your Role

  • Modernize the data warehouse environment by migrating the platform to Databricks.
  • Work with database developers and administrators across multiple product teams.
  • Serve as a data and technology expert across a broad and diverse set of mission critical applications.
  • Design, develop, and maintain robust and scalable data warehouse architectures and ETL/ELT data pipelines using Databricks, PySpark, Python, and SQL (see the illustrative sketch after this list).
  • Automate ETL/ELT data pipelines using Continuous integration, Continuous Deployment (CI/CD) tools and technologies.
  • Evaluate existing data sets and reporting architectures to identify strategic gaps and apply modern technologies to creatively achieve superior mission outcomes.
  • Analyze project‑related problems and create innovative solutions involving technology, analytic methodologies, and advanced solution components.
  • Optimize and troubleshoot data pipelines and warehouse performance to ensure efficient and reliable data processing.
  • Actively participate in Agile Scrum sprint planning, artifact creation, sprint testing, regression testing, demonstrations, retrospectives and solution releases.
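
As a purely illustrative aside, below is a minimal, hypothetical PySpark sketch of the kind of batch ETL/ELT step referenced in the list above, as it might run in a Databricks notebook. The storage path, table names, and column names are placeholders assumed for the example, not details from this posting.

    # Minimal illustrative sketch; all paths, tables, and columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example_orders_etl").getOrCreate()

    # Extract: read raw files from a hypothetical cloud landing zone (S3/ADLS).
    raw = spark.read.parquet("s3://example-bucket/landing/orders/")

    # Transform: deduplicate, derive a date column, and drop invalid rows.
    cleaned = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("amount") > 0)
    )

    # Load: write to a Delta table so downstream SQL and analytics tools can query it.
    (cleaned.write
            .format("delta")
            .mode("overwrite")
            .saveAsTable("analytics.orders_clean"))

In a Databricks workspace the SparkSession is already provided as spark, so the builder call is only needed when running similar code outside a notebook.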

Must be a U.S. citizen with the ability to pass a CBP background investigation; criteria include but are not limited to:

  • 3 year check for felony convictions
  • 1 year check for illegal drug use
  • 1 year check for misconduct such as theft or fraud

Required:

  • 7+ years of professional experience working on complex data challenges in the areas of data architecture and engineering
  • 3–5 years of Databricks experience.
  • Alternative/equivalent technologies such as Snowflake, Google BigQuery, or Microsoft Azure Synapse Analytics will also be considered.
  • Proven expertise with Databricks, including extensive hands‑on experience with PySpark, Python, SQL, Kafka, and Databricks notebooks.
  • Strong experience with data modeling techniques (e.g., dimensional modeling, data vault) and database design.
  • Experience building and optimizing data pipelines for batch and/or streaming data (see the streaming sketch after this list).
  • Experience with cloud platforms (e.g., AWS, Azure, GCP) and services related to data storage and processing (e.g., S3, ADLS).
  • Experience automating ELT data pipelines using Continuous Integration, Continuous Deployment (CI/CD) tools and technologies.
  • Strong software development background using Agile or DevOps methods and deep familiarity with cloud‑native technologies.
  • Candidates with one or more of the above skillsets are encouraged to apply.
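
As a second hypothetical illustration, the streaming experience mentioned above might involve a minimal Structured Streaming job such as the sketch below, reading a Kafka topic into a Delta table. The broker address, topic, checkpoint path, and target table are assumed placeholders, not details from this posting.

    # Hypothetical sketch; broker, topic, paths, and table names are placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example_event_stream").getOrCreate()

    # Read a Kafka topic as a streaming DataFrame.
    events = (
        spark.readStream
             .format("kafka")
             .option("kafka.bootstrap.servers", "broker.example.com:9092")
             .option("subscribe", "events")
             .load()
    )

    # Kafka delivers binary key/value columns; cast the payload to a string here
    # (a real pipeline would typically parse JSON or Avro against a schema).
    parsed = events.select(
        F.col("key").cast("string"),
        F.col("value").cast("string").alias("payload"),
        "timestamp",
    )

    # Write continuously to a Delta table, with checkpointing for fault tolerance.
    query = (
        parsed.writeStream
              .format("delta")
              .option("checkpointLocation", "/tmp/checkpoints/events")
              .toTable("analytics.events_raw")
    )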

Desired:

  • 5–10 years of DHS, DoD, or IC experience working in complex data environments, including the architecture and optimization of data schemas, terabyte‑scale ETL, etc.
  • 5–10 years of experience applying a range of analytical techniques including statistical, geospatial, link, temporal, and predictive analysis, for DHS, DoD, or IC agencies.
  • 3–5 years of experience building and implementing artificial intelligence, neural networks, deep learning, or machine learning capabilities in software applications in a national security or academic environment.
  • Exposure to implementing or migrating to Cloud environments like Amazon Web Services (AWS) or Microsoft Azure.
  • Previous experience as an Enterprise‑level Data Architect, Data Engineer, Data Scientist, or Data Analyst.
  • Ability to apply advanced principles, theories, and…