Databricks Solutions Engineer

Job in Reston, Fairfax County, Virginia, 22090, USA
Listing for: ICF
Full Time position
Listed on 2025-12-25
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager, Data Analyst, Cloud Computing
Job Description

Job Location

This position is fully remote with up to 10% travel to the DC Metropolitan area for client meetings.

What you’ll be doing
  • Enable secure, scalable, and efficient data exchange between the federal client and external data-sharing partners using Databricks Delta Sharing (see the sketch after this list).
  • Support the design and development of data pipelines and ETL routines in an Azure cloud environment for a range of source system types, including RDBMS, API, and unstructured data sources, using CDC, incremental, and batch loading techniques.
  • Conduct data profiling, transformation, and quality assurance on structured, semi-structured, and unstructured data.
  • Identify underlying issues and translate them into technical requirements.
  • Assist in building and optimizing data lakes, feature stores, and data warehouse structures to support analytics and machine learning.
  • Prepare, structure, and validate data for data science and MLOps workflows, ensuring it meets the quality and format requirements for modeling.
  • Help monitor and maintain the flow of data across BI dashboards, analytics environments, and machine learning pipelines.
  • Engage directly with clients and stakeholders to understand data needs and translate them into scalable solutions.
  • Collaborate with UX designers, business analysts, developers, and end users to define data and reporting requirements.
  • Work with external data partners to determine their data product needs, using the Databricks platform to enable rapid prototyping and extensible use cases.
  • Meet with government employees at executive levels, platform stakeholders, and vendor partners.
  • Work within Agile teams to support iterative development, backlog grooming, and sprint-based delivery.
  • Provide mentorship to junior team members.
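For illustration only (not part of the posting): below is a minimal PySpark sketch of the Delta Sharing and incremental-load work described above, assuming a Databricks notebook where the spark session is provided and Unity Catalog is enabled. All table, share, and recipient names are hypothetical placeholders.

    from pyspark.sql import functions as F

    # Incremental (CDC-style) read: pull only changes from a source Delta table
    # via Change Data Feed; assumes the table has delta.enableChangeDataFeed = true.
    changes = (
        spark.read.format("delta")
        .option("readChangeFeed", "true")
        .option("startingVersion", 42)            # hypothetical checkpointed version
        .table("bronze.source_system.orders")     # hypothetical source table
    )

    # Basic profiling / quality gate before promoting the data.
    cleaned = (
        changes
        .filter(F.col("_change_type").isin("insert", "update_postimage"))
        .filter(F.col("order_id").isNotNull())
        .withColumn("ingested_at", F.current_timestamp())
    )

    # Land the curated data in a Delta table for analytics, ML, and sharing.
    cleaned.write.format("delta").mode("append").saveAsTable("main.silver.orders_curated")

    # Expose the curated table to an external partner via Delta Sharing.
    spark.sql("CREATE SHARE IF NOT EXISTS partner_share")
    spark.sql("ALTER SHARE partner_share ADD TABLE main.silver.orders_curated")
    spark.sql("CREATE RECIPIENT IF NOT EXISTS partner_org")
    spark.sql("GRANT SELECT ON SHARE partner_share TO RECIPIENT partner_org")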
What you must have
  • Bachelor’s degree in Computer Science, Information Systems, Data Analytics, or a related discipline.
  • 5+ years of experience in data engineering, data security practices, data platforms, and analytics.
  • 3+ years of Databricks platform expertise (SME-level proficiency), including:
    • Databricks, Delta Lake, and Delta Sharing
    • Deep experience with distributed computing using Apache Spark
    • Knowledge of Spark runtime internals and optimization
    • Ability to design and deploy performant end-to-end data architectures
  • 4+ years of ETL pipeline development, building robust, scalable data pipelines.
  • Candidate must be able to obtain and maintain a Public Trust
  • Candidate must reside in the U.S., be authorized to work in the U.S., and all work must be performed in the U.S.
  • Candidate must have lived in the U.S. for three (3) full years out of the last five (5) years
Technologies you’ll use
  • Databricks on Azure for data engineering and ML pipeline support.
  • SQL, Python, Spark, Tableau.
  • Git and CI/CD tools (e.g., Jenkins, Code Build).
  • Jira, Confluence, SharePoint.
  • Mural, Miro, or other collaboration/whiteboarding tools.
What we’d like you to have
  • Databricks Professional or specialty certifications.
  • Hands-on experience with Azure services such as Synapse, Data Factory, or Databricks.
  • Familiarity with data visualization tools such as Tableau, Power BI, or similar.
  • Machine Learning and Analytical Skills, including (see the sketch after this list):
    • MLOps - Working knowledge of ML deployment and operations
    • Data Science Methodologies - Statistical analysis, modeling, and interpretation
    • Big Data Technologies - Experience beyond Spark with distributed systems
  • Experience with deployment pipelines, including Git-based version control, CI/CD, and DevOps practices using Terraform for IaC.
  • Emergency management domain knowledge is a plus.
  • Advanced proficiency in data engineering and analytics using Python; expert-level SQL skills for data manipulation and analysis; experience with Scala preferred but not required (Python expertise can substitute).
  • Proven experience breaking down complex ideas into manageable components
  • Demonstrable experience developing rapid POCs and Prototypes
  • History of staying current with evolving data technologies and methodologies
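For illustration only: a minimal sketch of the MLOps item above, using MLflow (bundled with Databricks but not named in this posting) to track and register a toy model. The dataset, parameters, and registered model name are hypothetical placeholders, and registering a model assumes a registry-backed tracking server such as Databricks.

    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression

    # Toy training data standing in for a prepared, validated feature set.
    X, y = make_classification(n_samples=200, n_features=5, random_state=0)

    with mlflow.start_run(run_name="orders_demo_model"):
        model = LogisticRegression(max_iter=200).fit(X, y)
        mlflow.log_param("max_iter", 200)
        mlflow.log_metric("train_accuracy", model.score(X, y))
        # Registration assumes a model registry behind the tracking server;
        # the registered name is a hypothetical placeholder.
        mlflow.sklearn.log_model(
            model,
            artifact_path="model",
            registered_model_name="orders_demo_classifier",
        )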
Professional Skills
  • Strong analytical thinking, attention to detail, and willingness to learn new tools and technologies.
  • Consulting experience with the ability to work directly with clients and executive-level stakeholders and manage…