Ruby Engineer - Web Scraping; Remote
Town of Poland, Jamestown, Chautauqua County, New York, 14701, USA
Listed on 2026-02-16
Ruby Engineer (Web Scraping)
We are looking for engineers with deep technical skills in browser automation, CDP internals, anti-bot evasion, concurrency, or infrastructure. If you’re not sure you fit but you’re a strong problem solver, apply anyway.
About SearchAPI
SearchAPI is a real-time SERP API delivering structured data from 100+ search engines and sources, including Google Search, Google Shopping, Google Jobs, Bing, Baidu, YouTube, Amazon, and many more. We power production workloads for Fortune 500 companies and fast-moving startups who need reliable search data. We’re a lean, profitable, bootstrapped team. No VC pressure, no bloat. Just engineers shipping real products to real customers.
Join Us
- Real Impact: Small team, massive scale. Your code runs in production serving billions of requests, powering tools you’ve probably used.
- 100+ APIs: Google, Bing, Baidu, YouTube, Amazon, and growing.
- Open Source First: LangChain, Haystack, Flowise, Langflow, Dify integrations.
- Fortune 500 Customers: Our API powers production workloads, not just pilots and experiments.
- Bootstrapped and Profitable: We answer to customers, not investors.
SearchAPI Values
- We do everything the Rails Way. If you don’t like DHH’s style, this may not be the place for you.
- We embrace the one‑person framework.
- We hire Managers of One. We trust you to figure it out.
- We contribute to open source.
- Remote‑only. Async‑first. Results‑driven.
- Transparency. No politics.
Tech Stack
- Ruby on Rails 8.1
- Ruby 3.4
- Hotwire (StimulusJS + Turbo)
- Tailwind CSS (Tailwind UI components)
- PostgreSQL
- Redis
- Sidekiq
- Terraform + AWS
We use Cursor, Claude, ChatGPT, Intercom, GitHub, Chrome Developer Tools, and Slack daily. We ship multiple times a day with CI/CD.
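To make the stack concrete, here is a minimal sketch (not SearchAPI's actual code) of how a scraping task could be queued through Sidekiq and Redis; ScrapeJob, the "scraping" queue, SerpFetcher, SerpParser, and SearchResult are all assumed names for illustration.

```ruby
require "sidekiq"

# Hypothetical background job illustrating the Rails + Sidekiq + Redis part of
# the stack. Every class and queue name here is made up for illustration.
class ScrapeJob
  include Sidekiq::Job
  sidekiq_options queue: "scraping", retry: 3

  def perform(engine, query)
    html    = SerpFetcher.fetch(engine, query)   # hypothetical HTTP/browser client
    results = SerpParser.parse(engine, html)     # hypothetical per-engine parser
    SearchResult.insert_all(results) if results.any?
  end
end

# Enqueue from anywhere in the app; Sidekiq stores the job in Redis and a
# worker process runs it asynchronously.
ScrapeJob.perform_async("google", "ruby web scraping")
```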
What You’ll Do
- Fix broken parsers under time pressure (see the parser sketch after this list).
- Add new elements to existing search engines.
- Build and ship new search engine integrations.
- Reverse engineer website protections and anti‑bot systems.
- Debug browser automation issues (CDP, fingerprinting, evasion).
- Create and improve documentation pages.
- Develop landing pages and admin dashboard features.
- Review PRs and help test.
- Talk to customers directly. Help them figure things out, brainstorm solutions, identify what’s missing. Learn the APIs and websites we scrape yourself so you can actually help.
- Rotate on customer support. Everyone does it, including senior engineers.
- Proactively update customers on progress and ship what they need.
- Improve browser automation and debug performance at scale.
- Optimize concurrency: fibers, ractors, threads.
- Handle complex challenges: TLS fingerprinting, JA3, WebRTC, CDP internals.
- Improve API monitoring: logging, real‑time analytics, anomaly detection.
- Architect and build new systems from scratch.
- Lead technical decisions and mentor engineers.
What to Expect
- Things break without warning. Search engines change layouts, anti-bot systems evolve, proxies fail. You’ll debug production issues under pressure.
- No hand‑holding. We don’t assign tasks or write detailed specs. You identify problems and fix them.
- Customer‑facing. You’ll rotate on support. You’ll talk to customers. You’ll ship what they need and tell them when it’s done.
- Small team, big scope. There’s no one to hand things off to. You own it end‑to‑end.
- Written communication is everything. Remote‑first means if you can’t write clearly, you’ll struggle.
- Fast pace. We ship daily. We expect results, not activity.
Who You Are
- Grinder. You don’t wait for instructions. You find work that matters and do it.
- Results‑driven. You ship. You measure. You iterate.
- Great writer. Code, PRs, docs, customer messages. Writing is thinking.
- Strong work ethic. Startup pace. We work hard because that’s how you win against incumbents.
- Specialist with range. We need depth in hard technical areas: CDP, anti‑bot, concurrency, browser internals. You can collaborate across the stack, but you go deep where it matters.
- Passionate about Ruby & Ruby on Rails.
- Deep knowledge of web fundamentals: HTTP, TLS, CSS selectors, XPath, JavaScript.
- Experience with browser automation, scraping, and data extraction.
- Fluent in English, written and spoken.
- BSc or higher in CS or…