
Databricks Engineer

Job in Dublin, Franklin County, Ohio, 43016, USA
Listing for: Infoverity
Full Time position
Listed on 2025-12-02
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description & How to Apply Below

Infoverity is seeking an experienced Databricks Engineer to join our dynamic consulting team, helping clients architect, build, and optimize modern data platforms. In this role, you will design and implement scalable data solutions, leveraging cloud technologies, specifically Databricks, to drive innovation in AI/ML, data warehousing, and data integration. As a key player at Infoverity, you will collaborate with cross-functional teams, including data scientists, business analysts, and solution architects, to deliver high-impact data solutions tailored to our clients’ needs.

Key Responsibilities
  • Data Engineering & Architecture:
    Design and implement scalable, cloud-based data pipelines to ingest, transform, and store data from various sources.
  • Data Warehousing & Integration:
    Develop and optimize ETL/ELT workflows, ensuring efficient data movement between systems, with a focus on Databricks.
  • AI/ML Enablement:
    Work alongside data science teams to support feature engineering, model training, and ML operations (MLOps) in cloud environments.
  • Data Modeling & Governance:
    Develop data models, schemas, and best practices to ensure data integrity, consistency, and security.
  • Performance Optimization:
    Monitor, troubleshoot, and optimize query performance, ensuring scalability and efficiency.
  • Consulting & Client Engagement:
    Work directly with clients to understand business needs, recommend best practices, and deliver tailored data solutions.
  • Data Architecture Design:
    Design and implement scalable, high-performance data architectures using Databricks.
  • Requirement Gathering:
    Lead requirement-gathering sessions with clients and internal teams to understand business needs and define best practices for solutions including data workflows.
  • Cloud Collaboration:
    Collaborate with cloud platform teams to optimize data storage and retrieval in environments like AWS S3, Azure Data Lake, and Delta Lake, among others.
  • Workflow Optimization:
    Translate complex data processes, such as those in Alteryx and Tableau, into optimized Databricks workflows using PySpark and SQL.
  • Automation Development:
    Develop reusable automation scripts to streamline workflow migrations and improve operational efficiency.
  • Development Support:
    Provide hands-on development and troubleshooting support to ensure smooth implementation and optimal performance.
  • Governance & Best Practices:
    Partner with cross-functional teams to establish data governance frameworks, best practices, and standardized reporting processes.
  • Training & Support:
    Deliver training, documentation, and ongoing support to empower users and enhance organizational data literacy.
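The pipeline responsibilities above follow the familiar extract-transform-load shape. As a rough, framework-agnostic sketch only (not part of the posting), here is what that shape looks like in plain Python; in a Databricks setting the same steps would typically be expressed with PySpark DataFrames and Delta Lake tables, and every name below is hypothetical:

```python
# Minimal ETL sketch with hypothetical names. A real Databricks pipeline
# would replace these lists/dicts with PySpark DataFrames and Delta tables.

def extract(raw_rows):
    """Ingest: parse raw CSV-like strings into records."""
    records = []
    for line in raw_rows:
        order_id, amount, region = line.split(",")
        records.append(
            {"order_id": order_id, "amount": float(amount), "region": region.strip()}
        )
    return records

def transform(records):
    """Transform: drop invalid rows and aggregate amount per region."""
    totals = {}
    for rec in records:
        if rec["amount"] <= 0:
            continue  # filter out invalid amounts
        totals[rec["region"]] = totals.get(rec["region"], 0.0) + rec["amount"]
    return totals

def load(totals, sink):
    """Load: write aggregates to a destination (here, an in-memory dict)."""
    sink.update(totals)
    return sink

raw = ["1,100.0,EU", "2,-5.0,EU", "3,50.0,US"]
warehouse = {}
load(transform(extract(raw)), warehouse)
# warehouse -> {"EU": 100.0, "US": 50.0}
```

The negative-amount row is dropped in `transform`, mirroring the data-quality checks a production pipeline would apply before loading into a warehouse.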
Requirements

Required Qualifications

  • A minimum of 2 to 3 years of experience in data engineering, data architecture, and data integration.
  • Strong expertise in Databricks, Snowflake, and/or Microsoft Fabric.
  • Proficiency in SQL, Python, Spark, and distributed data processing frameworks.
  • Experience with cloud platforms (Azure, AWS, or GCP) and their native data services.
  • Hands-on experience with ETL/ELT development, data pipelines, and data warehousing.
  • Knowledge of AI/ML workflows, including feature engineering and ML model deployment.
  • Strong understanding of data governance, security, and compliance best practices.
  • Excellent problem-solving, communication, and client-facing consulting skills.
  • Ability to work independently and as part of a team.
Preferred Qualifications
  • Certifications in Databricks, Snowflake, Microsoft Fabric, or a cloud platform (Azure, AWS, GCP).
  • Experience with Apache Airflow, dbt, Delta Lake, or similar data orchestration tools.
  • Familiarity with DevOps, CI/CD pipelines, and Infrastructure as Code (IaC) tools like Terraform.
Seniority level
  • Mid-Senior level
Employment type
  • Full-time
Job function
  • Information Technology
Industries
  • IT Services and IT Consulting
