
Data Engineer II

Job in Coos Bay, Coos County, Oregon, 97458, USA
Listing for: Bestow
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, AI Engineer
Job Description & How to Apply Below

ABOUT BESTOW

Life insurance is one of the world's most important products. It's also one of the hardest to build, distribute, and modernize. Bestow exists to change that.

Bestow is a leading vertical technology platform serving some of the largest and most innovative life insurers. Our platform unifies the fragmented, legacy value chain, enabling carriers to launch products in weeks instead of years. Carriers choose us to scale and operate at unprecedented speed, powered by AI and automation.

Bestow isn't selling policies. We're building the infrastructure that helps an entire industry move faster, reach more people, and deliver on its promise.

Backed by leading investors (Goldman Sachs, Hedosophia, NEA, Valar, 8VC) and trusted by major carriers, Bestow is powered by a team that moves with precision, purpose, and heart. If you want to help reimagine a centuries-old industry with lasting impact, join us.

Bestow offers flexible remote/hybrid work, meaningful benefits, equity, and substantial growth opportunities.

Bestow participates in the E-Verify Program.

ABOUT THE TEAM

The Bestow Data Engineering team works across the entire company, providing scalable data solutions within the platform and for integrations with external partners. The team works closely with internal analytics and data science teams to improve data architecture, rapidly prototype, and serve data science predictions. Data engineers also partner with stakeholders and members of product and engineering to design and launch new systems for extracting, transforming, and storing data.

You’ll be called upon to improve Bestow’s data reliability, efficiency, and quality. You’ll be expected to scale your solutions to the cloud environment of a SaaS company, iterate quickly, and make pragmatic choices about which tools and technologies to adopt.

WHAT YOU’LL DO
  • Build robust solutions for transferring data between first- and third-party applications and our data warehouse.
  • Make decisions as a team. The things you build will be maintained and improved by others, so there is a shared responsibility to make defensible design choices and collaborate closely.
  • Develop hardened, repeatable (CI/CD) data models and pipelines to enable reporting, modeling, and machine learning.
  • Improve data availability to our enterprise clients through a mix of traditional push delivery, cloud, and event-driven (e.g., API, gRPC) data sharing methods.
  • Ensure data quality through automated monitoring and alerting, and occasionally serve in an on-call rotation.
  • Leverage Google Cloud (GCP) tools (e.g., Cloud Run, Cloud Functions, Vertex AI, App Engine, Cloud Storage, IAM) and services (e.g., Astronomer-managed Apache Airflow) to bring data workloads to production.
  • Drive and support MLOps to improve Data Science monitoring and governance.
  • Enable and support Generative AI (e.g., LLM) pipelines, allowing internal teams to quickly prototype. Support the architecture and rollout of GenAI products and features into the marketplace.
  • Collaborate with product, engineering, stakeholders, and data teams to deliver informed solutions to platform and client needs.
WHO YOU ARE
  • 4+ years in a data engineering role building inbound and outbound data products for internal and external customers.
  • 2+ years of demonstrated expertise designing end-to-end data pipelines in cloud platforms (such as GCP, AWS, or Azure) with requirements from multiple stakeholders.
  • 2+ years of experience writing efficient, testable, and readable code in Python or a similar language.
  • 2+ years of experience building streaming data ingestion pipelines.
  • 1+ year of ML (Machine Learning) support and implementation, or MLOps.
  • Deep SQL experience with columnar databases such as Google BigQuery, Snowflake, or Amazon Redshift.
  • Demonstrated experience with AI coding assistants; AI tools are heavily ingrained in Bestow culture.
  • Experience building CI/CD pipelines for data processing using tools such as Docker, CircleCI, dbt, git, etc.
  • Able to manage infrastructure using IaC tools such as Terraform or Pulumi.
  • Experience with common data…