
Databricks Test Engineer – Project Delivery Specialist

Job in Cincinnati, Hamilton County, Ohio, 45208, USA
Listing for: Relha LLC
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech: Data Engineer
Salary/Wage Range or Industry Benchmark: USD 60,000 - 80,000 per year
Job Description & How to Apply Below

Are you an experienced, passionate pioneer in technology who wants to work in a collaborative environment? If so, consider an opportunity with Deloitte under our Project Delivery Model (PDM), a talent model tailored specifically for long-term, onsite client service delivery. As a Databricks Test Engineer, you will be able to share new ideas and collaborate on projects as a consultant without the extensive demands of travel.

Act as the primary QA engineer for the Finance data domain, validating Databricks Medallion (Bronze/Silver/Gold) pipelines and ensuring delivered data products meet functional and data-quality expectations.

  • Design, build, and maintain automated test frameworks (unit/integration/data-quality/regression) using Python to expand coverage across PySpark and Databricks workflows (an illustrative sketch follows this list).
  • Execute performance benchmarking and scalability testing for PySpark jobs (runtime, cost, and stability), partnering with data engineers to diagnose bottlenecks and verify improvements.
  • Communicate regularly with Engagement Managers, project team members, and representatives from various functional and/or technical teams, escalating any matters that require additional attention.
  • Independently and collaboratively lead client engagement work streams focused on the improvement, optimization, and transformation of processes, including implementing leading-practice workflows, addressing deficits in quality, and driving operational outcomes.
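For illustration only, here is a minimal sketch of the kind of automated data-quality test described in the first bullet above, written in Python with PySpark and pytest-style assertions. The table and column names (silver.finance_transactions, transaction_id, amount) are hypothetical placeholders, not details from this posting.

# Illustrative data-quality checks for a hypothetical Silver-layer Finance table.
# All table and column names below are assumptions used for demonstration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("silver-dq-checks").getOrCreate()


def test_silver_transactions_quality():
    df = spark.table("silver.finance_transactions")  # hypothetical table name

    # Completeness: the primary key must never be null.
    null_keys = df.filter(F.col("transaction_id").isNull()).count()
    assert null_keys == 0, f"{null_keys} rows have a null transaction_id"

    # Uniqueness: no duplicate keys should survive the Bronze -> Silver dedup step.
    total = df.count()
    distinct = df.select("transaction_id").distinct().count()
    assert total == distinct, f"{total - distinct} duplicate transaction_id values"

    # Accuracy: amounts are assumed non-negative in this example domain.
    bad_amounts = df.filter(F.col("amount") < 0).count()
    assert bad_amounts == 0, f"{bad_amounts} rows have negative amounts"

In a real framework, checks like these would typically be parameterized per table and wired into CI, as described in the qualifications below.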
Qualifications
  • 5+ years of QA/testing experience for Databricks data platforms, including strong working knowledge of Medallion architecture and PySpark-based pipelines.
  • Strong Python development skills, with experience building automated tests and test harnesses for data pipelines (including assertions, fixtures/test data management, and CI integration).
  • Deep understanding of data products and data-quality dimensions (completeness, accuracy, timeliness, reconciliation) and the ability to translate requirements into executable test cases.
  • Experience performing performance testing/benchmarking for distributed data processing (profiling, metrics/SLAs, and result reporting); see the benchmark sketch after this list.
  • Bachelor’s degree, preferably in Computer Science, Information Technology, Computer Engineering, or related IT discipline; or equivalent experience.
  • Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve.
  • Limited immigration sponsorship may be available.
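As a companion illustration to the performance-testing qualification above, the following sketch times a single PySpark aggregation against an assumed runtime SLA. The job body, table names, and the 30-minute threshold are assumptions for demonstration, not requirements from this posting.

# Illustrative runtime benchmark for a hypothetical Gold-layer aggregation job.
import time

from pyspark.sql import SparkSession

RUNTIME_SLA_SECONDS = 30 * 60  # assumed SLA; real targets come from the engagement

spark = SparkSession.builder.appName("gold-aggregation-benchmark").getOrCreate()


def run_gold_aggregation():
    # Placeholder for the pipeline step under test (hypothetical tables).
    df = spark.table("silver.finance_transactions")
    (df.groupBy("account_id")
       .sum("amount")
       .write.mode("overwrite")
       .saveAsTable("gold.account_balances"))


def test_gold_aggregation_runtime():
    start = time.monotonic()
    run_gold_aggregation()
    elapsed = time.monotonic() - start
    assert elapsed < RUNTIME_SLA_SECONDS, (
        f"Gold aggregation took {elapsed:.0f}s, "
        f"exceeding the {RUNTIME_SLA_SECONDS}s SLA"
    )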


All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.
