
Senior Data Engineer

Job in Coos Bay, Coos County, Oregon, 97458, USA
Listing for: Apollo
Full Time position
Listed on 2025-12-31
Job specializations:
  • IT/Tech
    Data Engineer
Job Description & How to Apply Below

Apollo.io is the leading go-to-market solution for revenue teams, trusted by over 500,000 companies and millions of users globally, from rapidly growing startups to some of the world's largest enterprises. Founded in 2015, the company is one of the fastest-growing companies in SaaS, having raised approximately $250 million to date at a $1.6 billion valuation. Apollo.io provides sales and marketing teams with easy access to verified contact data for over 210 million B2B contacts and 35 million companies worldwide, along with tools to engage and convert these contacts in one unified platform.

By helping revenue professionals find the most accurate contact information and automating the outreach process, Apollo.io turns prospects into customers. Apollo raised a Series D in 2023 and is backed by top-tier investors, including Sequoia Capital, Bain Capital Ventures, and more, and counts the former President and COO of HubSpot, JD Sherman, among its board members.

Overview

As a Senior Data Engineer, you will play a key role in designing and building the foundational data infrastructure and APIs that power our analytics, machine learning, and product features. You'll be responsible for developing scalable data pipelines, managing cloud-native data platforms, and creating high-performance APIs using FastAPI to enable secure, real-time access to data services. This is a hands-on engineering role with opportunities to influence architecture, tooling, and best practices across our data ecosystem.

Daily Adventures and Responsibilities
  • Architect and build robust, scalable data pipelines (batch and streaming) to support a variety of internal and external use cases
  • Develop and maintain high-performance APIs using FastAPI to expose data services and automate data workflows
  • Design and manage cloud-based data infrastructure, optimizing for cost, performance, and reliability
  • Collaborate closely with software engineers, data scientists, analysts, and product teams to translate requirements into engineering solutions
  • Monitor and ensure the health, quality, and reliability of data flows and platform services
  • Implement observability and alerting for data services and APIs (think logs, metrics, dashboards)
  • Continuously evaluate and integrate new tools and technologies to improve platform capabilities
  • Contribute to architectural discussions, code reviews, and cross-functional projects
  • Document your work, champion best practices, and help level up the team through knowledge sharing
Competencies
  • Excellent communication skills for working with engineering, product, and business owners to define key business questions and build data sets that answer them.
  • Self-motivated and self-directed
  • Inquisitive, able to ask questions and dig deeper
  • Organized and diligent, with great attention to detail
  • Acts with the utmost integrity
  • Genuinely curious and open; loves learning
  • Critical thinking and proven problem-solving skills required
  • Proven experience leveraging AI tools to enhance software development processes, including code generation, debugging, and productivity optimization.
  • Fluency in integrating AI-driven solutions into workflows, and a willingness to stay current with emerging AI technologies.
Skills & Relevant Experience

Required:

  • 5+ years of experience in platform engineering, data engineering, or a data-facing role
  • Experience building data applications
  • Deep knowledge of the data ecosystem, with an ability to collaborate cross-functionally
  • Bachelor's degree in a quantitative field (Physical/Computer Science, Engineering, or Mathematics/Statistics)

Preferred:

  • Experience using the Python data stack
  • Experience deploying and managing data pipelines in the cloud
  • Experience working with technologies such as Airflow, Hadoop, and Spark
  • Understanding of streaming technologies such as Kafka and Spark Streaming
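As a toy illustration of the streaming pattern behind technologies like Kafka and Spark Streaming (consume events, transform each one, emit windowed aggregates), here is a sketch using only the standard library; the function name and rolling-average aggregation are illustrative choices, not anything named in the posting:

```python
from collections import deque
from typing import Callable, Iterable, Iterator

def stream_pipeline(
    events: Iterable[float],
    transform: Callable[[float], float],
    window_size: int = 3,
) -> Iterator[float]:
    """Consume an event stream one record at a time, apply a per-event
    transform, and emit a rolling average over a sliding window --
    the basic consume/transform/aggregate shape of a streaming job,
    reduced to plain Python.
    """
    window: deque = deque(maxlen=window_size)  # bounded sliding window
    for event in events:
        window.append(transform(event))
        yield sum(window) / len(window)
```

Because it is a generator, the pipeline processes records incrementally rather than materializing the whole input, which mirrors how streaming frameworks bound memory use regardless of stream length.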

The listed pay range reflects the base salary range, except for sales roles, where the range provided is the role's On-Target Earnings (OTE) range, meaning it includes both the sales commission/sales bonus targets and the annual base salary for the role. This pay range may be inclusive of several career levels at Apollo…

Position Requirements
10+ years of work experience