Senior Data Engineer
Listed on 2026-02-16
IT/Tech
Data Engineer, AI Engineer, Cloud Computing
About Bestow
Life insurance is one of the world's most important products. It's also one of the hardest to build, distribute, and modernize. Bestow exists to change that.
Bestow is a leading vertical technology platform serving some of the largest and most innovative life insurers. Our platform unifies the fragmented, legacy value chain, enabling carriers to launch products in weeks instead of years. Carriers choose us to scale and operate at unprecedented speed, powered by AI and automation.
Bestow isn't selling policies. We're building the infrastructure that helps an entire industry move faster, reach more people, and deliver on its promise.
Backed by leading investors (Goldman Sachs, Hedosophia, NEA, Valar, 8VC) and trusted by major carriers, Bestow is powered by a team that moves with precision, purpose, and heart. If you want to help reimagine a centuries-old industry with lasting impact, join us.
Bestow offers flexible remote/hybrid work, meaningful benefits, equity, and substantial growth opportunities.
Bestow participates in the E-Verify Program.
The Senior Data Engineer at Bestow sits at the intersection of execution and architecture. You will not only build robust pipelines but also design scalable data models, architect real-time data sharing systems, mentor junior engineers, and own critical components of the data platform. You will bridge the gap between "making it work" and "making it scale." This role reports to the Senior Manager of Analytics.
ABOUT THE TEAM
The Bestow Data Engineering team plays a significant role within the organization, working across the entire company to provide scalable data solutions within the platform and for integrations with external partners. The team works closely with the internal analytics, data science, and engineering teams to improve platform integrations and data architecture and to serve data science predictions.
WHAT YOU’LL DO
Build robust solutions for transferring data between first- and third-party applications and our data warehouse.
Envision and design industry-standard patterns for data exchange with our enterprise clients through a mix of traditional push delivery, cloud-based, and event-driven (e.g., API, gRPC) data sharing methods.
Support implementation and data integration as new partners roll onto the Bestow platform, and recommend improvements to the platform's configurability.
Make decisions as a team. What you build will be maintained and improved by others, so there is a shared responsibility to collaborate closely and make defensible design decisions.
Champion test-first design principles, proactively writing tests before code to maintain high coverage and pipeline reliability.
Develop hardened, repeatable (CI/CD) data models and pipelines to enable reporting, modeling, and machine learning.
Ensure data quality through automated monitoring and alerting, and occasionally serve in an on-call rotation.
Leverage Google Cloud Platform (GCP) tools (e.g., Cloud Run, Cloud Functions, Vertex AI, App Engine, Cloud Storage, IAM) and services (e.g., Astronomer-managed Apache Airflow) to bring data workloads to production (see the sketch after this list).
Drive and support MLOps to improve data science monitoring and governance.
Enable and support generative AI (e.g., LLM) pipelines that allow internal teams to prototype quickly, and support the architecture and rollout of GenAI products and features into the marketplace.
Collaborate with product, engineering, and data teams and other stakeholders to deliver informed solutions to platform and client needs.
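For illustration, here is a minimal sketch of the kind of daily pipeline this role would own, assuming Airflow 2.4+ with the TaskFlow API; the DAG name and task bodies are hypothetical placeholders, and a real pipeline would pull from a partner system and load into BigQuery with retries, tests, and alerting.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2026, 1, 1), catchup=False)
def partner_ingest():
    @task
    def extract() -> list[dict]:
        # Hypothetical: pull the latest records from a partner system.
        return [{"policy_id": "p-1", "status": "active"}]

    @task
    def load(rows: list[dict]) -> None:
        # Hypothetical: write the rows to the warehouse (e.g., BigQuery).
        print(f"loading {len(rows)} rows")

    load(extract())


partner_ingest()
```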
WHAT YOU’LL BRING
6+ years working in a data engineering role supporting inbound and outbound data products for internal and external customers.
4+ years of demonstrated expertise designing end-to-end data pipelines on cloud platforms (such as GCP, AWS, or Azure) with requirements from multiple stakeholders.
4+ years of Python experience writing efficient, testable, and readable code
2+ years of experience building streaming data ingestion pipelines (see the sketch after this list).
1+ years of machine learning (ML) support and implementation, or MLOps experience.
Advanced SQL expertise with columnar databases (BigQuery, Snowflake, Amazon Redshift) and performance tuning.
Demonstrated experience with AI coding assistants; AI tools are heavily ingrained in Bestow culture.
Cloud Native:
Deep experience with cloud services (GCP preferred: Cloud Run, Pub/Sub, BigQuery) and containerization (Docker/Kubernetes).
Orchestration:
Expert-level knowledge of Apache Airflow (DAG optimization, custom operators). Experience building CI/CD pipelines for data processing using tools such as Docker, CircleCI, dbt, git, etc.
Infrastructure as Code:
Proven experience managing infrastructure using Terraform or Pulumi. Experience creating alerts and monitoring pipelines that contribute to overall data governance.
Familiarity with standard IT security practices such as identity and access management (IAM), data protection, encryption,…
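For the streaming requirement above, a minimal sketch of a Pub/Sub consumer, assuming the google-cloud-pubsub client library; the project and subscription names are hypothetical, and a production pipeline would batch writes, handle dead-lettering, and export metrics for alerting.

```python
import json

from google.cloud import pubsub_v1


def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    # Decode the event payload; in practice this would stage rows for a
    # warehouse load rather than print them.
    event = json.loads(message.data.decode("utf-8"))
    print(f"received event: {event}")
    message.ack()


subscriber = pubsub_v1.SubscriberClient()
# Hypothetical project and subscription names.
subscription = subscriber.subscription_path("my-project", "policy-events")

# subscribe() returns a future; callbacks run on a background thread pool.
future = subscriber.subscribe(subscription, callback=handle)
try:
    future.result()  # block until cancelled or an error occurs
except KeyboardInterrupt:
    future.cancel()
```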