Sr SDET
Listed on 2025-12-28
Software Development
DevOps, Software Engineer, Software Testing, Cloud Engineer - Software
The Mission:
Why We Exist, What We Do, And Where You Fit
Our client helps cities and counties modernize their plan review processes. Our software platform replaces manual, paper-heavy workflows with a faster, more transparent system for reviewing and approving building plans, empowering agencies to reduce permit turnaround times and improve service to their communities.
We are at a critical moment:
With a fresh investment from a private equity partner, we know our product delivers, our customers love us, and we're sitting at ~$5M ARR. Now, it's time to scale—and that means transforming how we ensure quality. We're evolving into an AI-first plan review platform, and we need testing infrastructure that can keep pace with rapid innovation while maintaining the reliability our government customers depend on.
We're hiring a Sr SDET to own the quality automation strategy and execution across our platform. You will be a player-coach: spending 70% of your time building and maintaining automated test frameworks and test suites, and 30% defining testing strategy, mentoring other SDETs, and pioneering AI-powered testing approaches. You'll establish what "quality excellence" means for a fast-scaling GovTech SaaS company, build robust automation that gives the team confidence to ship quickly, and lay the foundation for a world-class quality engineering function.
This is a hands‑on, high-impact role where testing expertise meets AI innovation and DevOps culture. You'll work in a fully remote, agile environment with a modern stack (Angular, TypeScript, Node.js, REST/GraphQL, MySQL/PostgreSQL, Kubernetes on AWS) where everyone is hands‑on in AWS. You'll need to balance comprehensive test coverage with speed, bring automation discipline to a growing team, and help us leverage AI to revolutionize how we build and execute tests. This role is perfect for an experienced quality engineer who thrives on building robust systems, loves exploring cutting‑edge testing tools and AI capabilities, and gets energized by shipping reliable software faster.
What Success Looks Like
Objective #1: Assess, stabilize, and establish foundations (First 30 days)
- Conduct comprehensive assessment of existing test automation: review current frameworks, test coverage, execution patterns, flakiness issues, and gaps in the testing pyramid (unit, integration, E2E)
- Interview developers, product managers, and engineering leadership to understand pain points with current quality processes, release confidence levels, and where bugs are slipping through
- Set up your development environment and gain hands‑on familiarity with the platform architecture, key user workflows, and critical integration points that require test coverage
- Review the last 3‑6 months of production incidents and support tickets to identify patterns in what types of defects are escaping to production and where testing could have caught them
- Establish a quick win by creating an automated test suite covering the top-priority scenarios
- Present an initial assessment to leadership: the current state of test automation, the biggest quality risks, your recommended approach for balancing immediate stabilization with long‑term strategy, and proposed priorities for the first 90 days
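One concrete slice of the assessment above is quantifying flakiness. As a minimal sketch (the data shape is an assumption, not taken from the client's tooling): given each test's recent pass/fail history, a test that has done both is flagged as flaky.

```typescript
// Hypothetical flakiness scan over historical test runs.
// Assumed input shape: a map from test name to its recent outcomes.
type Outcome = "pass" | "fail";

function findFlakyTests(history: Record<string, Outcome[]>): string[] {
  return Object.entries(history)
    // A test that has both passed and failed across recent runs is flaky.
    .filter(([, runs]) => runs.includes("pass") && runs.includes("fail"))
    .map(([name]) => name)
    .sort();
}

// Example: only "alpha" mixes passes and failures.
const flaky = findFlakyTests({
  alpha: ["pass", "fail", "pass"],
  beta: ["pass", "pass"],
  gamma: ["fail", "fail"],
});
```

In practice the history would come from CI run artifacts (e.g. JUnit XML or a reporter's JSON output) rather than an in-memory map.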
Objective #2: Build the testing foundation and achieve velocity (By Month 3)
- Design and implement a comprehensive Playwright‑based E2E testing framework with clear patterns for test organization, page object models, test data management, and environment configuration
- Establish testing standards and best practices: how tests should be written, naming conventions, assertion patterns, and guidelines for when to write unit vs integration vs E2E tests
- Build a core library of automated tests covering critical user journeys (plan submission, review workflows, approval processes) and key integration points, achieving measurable improvement in test coverage metrics
- Integrate automated tests into the CI/CD pipeline with clear quality gates, ensuring tests run on every PR and preventing regressions from reaching production
- Create test automation documentation including framework architecture, how to write and run tests, troubleshooting…
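The page object model mentioned above can be sketched as follows. This is an illustrative pattern only: the route, selectors, and class names are hypothetical, and the page object depends on a minimal `PageLike` interface so it works against a real Playwright `Page` or a lightweight fake in unit tests.

```typescript
// Minimal Page-like interface (a subset of what Playwright's Page offers).
interface PageLike {
  goto(url: string): Promise<void>;
  fill(selector: string, value: string): Promise<void>;
  click(selector: string): Promise<void>;
}

// Hypothetical page object for a plan-submission flow: selectors live here,
// so tests read as user actions rather than DOM manipulation.
class PlanSubmissionPage {
  constructor(private readonly page: PageLike) {}

  async open(): Promise<void> {
    await this.page.goto("/plans/new"); // hypothetical route
  }

  async submitPlan(title: string): Promise<void> {
    await this.page.fill("#plan-title", title); // hypothetical selector
    await this.page.click("button[type=submit]");
  }
}

// A tiny fake that records calls, standing in for a real browser page.
class FakePage implements PageLike {
  calls: string[] = [];
  async goto(url: string) { this.calls.push(`goto:${url}`); }
  async fill(sel: string, v: string) { this.calls.push(`fill:${sel}=${v}`); }
  async click(sel: string) { this.calls.push(`click:${sel}`); }
}

async function demo(): Promise<string[]> {
  const fake = new FakePage();
  const plans = new PlanSubmissionPage(fake);
  await plans.open();
  await plans.submitPlan("New Library Annex");
  return fake.calls;
}
```

Keeping selectors inside page objects means a UI change touches one class instead of every E2E test that exercises that screen.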