Databricks Engineer
Job in College Park, Prince George's County, Maryland, 20741, USA
Listed on 2026-02-14
Listing for: KamisPro
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Data Science Manager, Data Analyst
Job Description & How to Apply Below
This is a long-term contract (approximately 12 months), hybrid and based in Adelphi, MD. A background check will be required.
The ideal candidate is a detail-oriented, analytical problem solver who enjoys tackling complex data challenges. They communicate clearly and collaborate effectively with cross-functional teams to deliver meaningful, data-driven solutions. They are adaptable, service-oriented, and curious, with a passion for modern data technologies and continuous improvement. Highly organized and proactive, they manage multiple priorities while maintaining a strong focus on quality, scalability, and innovation.
Key Responsibilities:
- Implement and optimize data models within Databricks to support efficient querying, analytics, and reporting.
- Design, develop, and maintain scalable ETL/ELT pipelines, with a strong emphasis on dimensional modeling and data quality.
- Partner with engineering teams and business stakeholders to gather requirements and deliver reliable, production-ready analytics solutions.
- Develop, optimize, and maintain SQL queries, notebooks, and scripts for data ingestion, transformation, and processing.
- Ensure data accuracy, consistency, and integrity through validation, monitoring, and cleansing processes.
- Create and maintain clear documentation for data pipelines, data models, and analytics solutions.
- Monitor, troubleshoot, and optimize data pipelines and workloads to ensure performance, reliability, and scalability.
- Stay current with industry trends, including AI-driven analytics, semantic modeling, and emerging data engineering best practices.
Qualifications:
- Hands-on experience designing, implementing, and operating solutions in Databricks.
- Strong understanding of ETL/ELT architectures, data ingestion patterns, and data pipeline orchestration.
- Proficiency in Python and/or Spark for large-scale data processing.
- Experience designing and implementing dimensional data models in lakehouse or modern data platform environments.
- Familiarity with AI-driven analytics platforms, semantic modeling concepts, and exposure to NLP techniques.
- Experience working in SAFe Agile or other scaled Agile frameworks.
- Solid understanding of data governance, security, and compliance best practices in global, multi-provider data environments.
- Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field preferred; Master’s degree is a plus.
- Databricks and Azure certifications strongly preferred.