Senior Data Engineer
Listed on 2025-12-02
Software Development
Data Engineer, Software Engineer
Arbiter is on a bold mission to end healthcare fragmentation by unifying patients, providers, and payers on a single intelligent care orchestration platform. Our shared data spine integrates clinical, financial, and policy information to ensure patients receive the right care, in the right place, at the right time. By connecting the people and systems behind every care decision, Arbiter is transforming how healthcare is delivered.
Today’s fragmented healthcare system delays access to the care patients need and burdens providers with inefficiencies. Arbiter uses AI to close gaps in care, optimize referrals, accelerate authorizations, and improve site-of-care navigation. Backed by leaders from top healthcare and technology organizations, we partner with hospitals, health systems, payers, and enterprise clients to build the connected infrastructure healthcare has been missing.
Our Engineering Culture & Values
We are a high-performing group of engineers dedicated to delivering innovative, high-quality solutions to our clients and business partners. We believe in:
Engineering Excellence: Taking immense pride in our technical craft and the products we build, treating both with utmost respect and care.
Impact-Driven Development: Firmly committed to engineering high-quality, fault-tolerant, and highly scalable systems that evolve seamlessly with business needs, minimizing disruption.
Collaboration Over Ego: Valuing exceptional work and groundbreaking ideas above all else. We seek talented individuals who are accustomed to working in a fast-paced environment and are driven to ship often to achieve significant impact.
Continuous Growth: Fostering an environment of continuous learning, mentorship, and professional development, where you can deepen your expertise and grow your career.
As a Senior Data Engineer, you will play a pivotal role in shaping our data ecosystem, enabling Arbiter's intelligent operating system:
Architect & Build: Design, develop, and maintain robust, scalable, and high-performance data processing systems (batch and/or real-time streaming) that power critical business functions, AI agents, and advanced analytics.
Technical Leadership: Lead complex data engineering initiatives from conception to deployment, ensuring data pipelines are reliable, efficient, testable, maintainable, and adhere to best practices for data ingestion from EMRs, claims, payer files, and payer policies.
Data Modeling & Governance: Drive the design of our enterprise data models for optimal storage, retrieval, and analytical performance, ensuring alignment with product, business, and regulatory requirements, including tracking RAF performance and HCC code detection.
Platform & Tooling: Champion and contribute to the development of core data platform tooling, frameworks, and standards that enhance developer productivity and data quality across the organization, supporting our AI agents and auditable systems.
Cross-Functional Collaboration: Partner closely with product managers, data scientists, software engineers, and non-technical stakeholders to understand data needs, deliver impactful solutions, and provide expert data insights that drive the intelligent operating system.
Mentorship & Growth: Actively participate in mentoring junior data engineers, contributing to our team's growth through technical guidance, code reviews, and knowledge sharing.
Hiring & Onboarding: Play an active role in interviewing and onboarding new team members, helping to build a world-class data engineering organization.
8+ years of deep, hands‑on experience in Data Engineering, Data Infrastructure, or building Data Engineering Tools and Frameworks, ideally within a high‑growth tech environment.
Exceptional expertise in data structures, algorithms, and distributed systems.
Mastery in Python for large-scale data processing; experience with other languages like Java or Scala is a plus.
Extensive experience designing, building, and optimizing complex, fault‑tolerant data pipelines (both batch and real‑time streaming).
Profound understanding and hands‑on experience with cloud‑native data platforms, especially…