
Data Engineer

Job in Austin, Travis County, Texas, 78716, USA
Listing for: Interactive Resources - iR
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech: Data Engineer, Cloud Computing
Job Description & How to Apply Below

Interactive Resources - iR provided pay range

This range is provided by Interactive Resources - iR. Your actual pay will be based on your skills and experience — talk with your recruiter to learn more.

Base pay range

$/yr - $/yr

Additional compensation types

Annual Bonus

About the Role

We are seeking a highly skilled Databricks Data Engineer with strong expertise in modern data engineering, Azure cloud technologies, and Lakehouse architectures. This role is ideal for someone who thrives in dynamic environments, enjoys solving complex data challenges, and can lead end-to-end delivery of scalable data solutions.

What We're Looking For
  • 8+ years designing and delivering scalable data pipelines in modern data platforms
  • Deep experience in data engineering, data warehousing, and enterprise‑grade solution delivery
  • Ability to lead cross‑functional initiatives in matrixed teams
  • Advanced skills in SQL, Python, and ETL/ELT development, including performance tuning
  • Hands‑on experience with Azure, Snowflake, and Databricks, including system integrations
Key Responsibilities
  • Design, build, and optimize large‑scale data pipelines on the Databricks Lakehouse platform
  • Modernize and enhance cloud‑based data ecosystems on Azure, contributing to architecture, modeling, security, and CI/CD
  • Use Apache Airflow and similar tools for workflow automation and orchestration
  • Work with financial or regulated datasets while ensuring strong compliance and governance
  • Drive best practices in data quality, lineage, cataloging, and metadata management
Primary Technical Skills
  • Develop and optimize ETL/ELT pipelines using Python, PySpark, Spark SQL, and Databricks Notebooks
  • Design efficient Delta Lake models for reliability and performance
  • Implement and manage Unity Catalog for governance, RBAC, lineage, and secure data sharing
  • Build reusable frameworks using Databricks Workflows, Repos, and Delta Live Tables
  • Create scalable ingestion pipelines for APIs, databases, files, streaming sources, and MDM systems
  • Automate ingestion and workflows using Python and REST APIs
  • Support downstream analytics for BI, data science, and application workloads
  • Write optimized SQL/T‑SQL queries, stored procedures, and curated datasets
  • Automate DevOps workflows, testing pipelines, and workspace configurations
Additional Skills
  • Orchestration: Apache Airflow (a plus)
  • MDM: Profisee (nice‑to‑have)
Soft Skills
  • Strong analytical and problem‑solving mindset
  • Excellent communication and cross‑team collaboration
  • Detail‑oriented with a high sense of ownership and accountability
Benefits
  • Medical insurance
  • Vision insurance
  • 401(k)
  • Paid maternity leave
  • Paid paternity leave
Seniority level

Mid‑Senior level

Employment type

Full‑time

Job function

Information Technology
