
Senior Software Engineer, Data Acquisition Product & Engineering - Remote

Remote / Online - Candidates ideally in California, USA
Listing for: People Data Labs, Inc.
Full Time, Remote/Work from Home position
Listed on 2025-12-10
Job specializations:
  • Software Development
    Data Engineer, Software Engineer
Salary/Wage Range or Industry Benchmark: USD 125,000 - 150,000 per year
Job Description & How to Apply Below
Position: Senior Software Engineer, Data Acquisition Product & Engineering - Remote - Full Time
Location: California

Note for all engineering roles: with the rise of fake applicants and AI-enabled candidate fraud, we have built in additional measures throughout the process to identify such candidates and remove them.

About Us

People Data Labs (PDL) is the provider of people and company data. We do the heavy lifting of data collection and standardization so our customers can focus on building and scaling innovative, compliant data solutions. Our sole focus is on building the best data available by integrating thousands of compliantly sourced datasets into a single, developer-friendly source of truth. Leading companies across the world use PDL’s workforce data to enrich recruiting platforms, power AI models, create custom audiences, and more.

We are looking for individuals who can balance extreme ownership with a "one-team, one-dream" mindset. Our customers are trying to solve complex problems, and we can only help them achieve their goals by working as a team. Our Data Engineering & Acquisition Team ensures our customers have standardized, high-quality data to build upon.

You will be crucial in accelerating our efforts to build standalone data products that enable data teams and independent developers to create innovative solutions at massive scale. In this role, you will work with a team to continuously improve our existing datasets and to pursue new ones. If you are looking to be part of a team discovering the next frontier of data-as-a-service (DaaS), with a high level of autonomy and opportunity for direct contributions, this might be the role for you.

We like our engineers to be thoughtful, quirky, and willing to fearlessly try new things. Failure is embraced at PDL as long as we continue to learn and grow from it.

What You Get to Do
  • Contribute to the architecture and improvement of our data acquisition and processing platform, increasing reliability, throughput, and observability
  • Use and develop web crawling technologies to capture and catalog data on the internet (see the sketch after this list)
  • Build, operate, and evolve large-scale distributed systems that collect, process, and deliver data from across the web
  • Design and develop backend services that manage distributed job orchestration, data pipelines, and large-scale asynchronous workloads
  • Structure and model captured data, ensuring high quality and consistency across datasets
  • Continuously improve the speed, scalability, and fault-tolerance of our ingestion systems
  • Partner with data product and engineering teams to design and implement new data products powered by the data you help collect, and to enhance existing ones
  • Learn and apply domain-specific knowledge in web crawling and data acquisition, with mentorship from experienced teammates and access to existing systems
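
To make the day-to-day concrete, here is a minimal sketch of the kind of asynchronous fetch-and-catalog loop this work involves, written in Python with aiohttp. It is illustrative only, not PDL's actual stack: the seed URLs, the record fields, and the concurrency limit are all hypothetical.

```python
# Hypothetical sketch: bounded-concurrency async crawl that fetches pages
# and catalogs simple per-URL records. Not production code.
import asyncio
import aiohttp

SEED_URLS = [  # hypothetical seed list
    "https://example.com/a",
    "https://example.com/b",
]

async def fetch(session: aiohttp.ClientSession, sem: asyncio.Semaphore, url: str) -> dict:
    # The semaphore caps in-flight requests so we don't overwhelm a host.
    async with sem:
        async with session.get(url, timeout=aiohttp.ClientTimeout(total=10)) as resp:
            body = await resp.text()
            return {"url": url, "status": resp.status, "bytes": len(body)}

async def crawl(urls: list[str], max_concurrency: int = 10) -> list[dict]:
    sem = asyncio.Semaphore(max_concurrency)
    async with aiohttp.ClientSession() as session:
        tasks = [fetch(session, sem, u) for u in urls]
        # return_exceptions=True keeps one failed fetch from sinking the batch.
        results = await asyncio.gather(*tasks, return_exceptions=True)
    return [r for r in results if isinstance(r, dict)]

if __name__ == "__main__":
    for record in asyncio.run(crawl(SEED_URLS)):
        print(record)
```

A production crawler would add robots.txt handling, retries with backoff, deduplication, and durable storage, but the basic shape (bounded concurrency over async I/O) is the same.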
The Technical Chops You’ll Need
  • 7+ years of professional experience building or operating backend or infrastructure systems at scale
  • Solid programming experience in Python, Go, Rust, or a similar language, including experience with async/await, coroutines, or concurrency frameworks
  • Strong grasp of software architecture and backend fundamentals; you can reason clearly about concurrency, scalability, and fault tolerance
  • Solid understanding of the browser rendering pipeline and of web application architecture (auth, cookies, HTTP request/response)
  • Familiarity with network architecture and debugging (HTTP, DNS, proxies, packet capture and analysis)
  • Solid understanding of distributed systems concepts: parallelism, asynchronous programming, back pressure, and message-driven design (see the back-pressure sketch after this list)
  • Experience designing or maintaining resilient data ingestion, API integration, or ETL systems
  • Proficiency with Linux / Unix command-line tools and system resource management
  • Familiarity with message queues, orchestration, and distributed task systems (Kafka, SQS, Airflow, etc.)
  • Experience evaluating and monitoring data quality, ensuring consistency, completeness, and reliability across releases
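
As an illustration of the back-pressure item above: in Python's asyncio, a bounded queue makes producers wait whenever consumers fall behind, so buffering stays flat under load. This is a standard-library sketch with made-up job names and timings, not code from PDL's systems.

```python
# Standard-library sketch of back pressure via a bounded asyncio.Queue.
import asyncio
import random

async def produce(queue: asyncio.Queue, n_items: int) -> None:
    for i in range(n_items):
        # put() suspends when the queue is full: that pause is the back
        # pressure, propagating the consumer's pace upstream instead of
        # buffering without bound.
        await queue.put(f"job-{i}")
    await queue.put(None)  # sentinel: signal end of stream

async def consume(queue: asyncio.Queue) -> int:
    done = 0
    while (item := await queue.get()) is not None:
        await asyncio.sleep(random.uniform(0.001, 0.005))  # simulate slow I/O
        done += 1
    return done

async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue(maxsize=8)  # the bound is the pressure point
    _, done = await asyncio.gather(produce(queue, 100), consume(queue))
    print(f"processed {done} jobs with at most 8 buffered at any time")

asyncio.run(main())
```

The same principle, letting the consumer's pace set the pipeline's rate, carries over to larger systems built on tools like Kafka or SQS.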
People Thrive Here Who Can
  • Work independently in a fast-paced, remote-first environment, proactively unblocking themselves and collaborating asynchronously
  • Communicate clearly and thoughtfully in writing (Slack, docs, design proposals)
  • Write and maintain technical design…
Position Requirements
10+ years of work experience