Data Platform Engineer

Job in Austin, Travis County, Texas, 78716, USA
Listing for: Wild Ducks
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech: Data Engineer, Data Science Manager
Salary Range: 80,000 – 100,000 USD per year
Job Description & How to Apply Below

Wild Ducks is building a real-time, event-driven forecasting platform with complex multi-tenant datasets, high-throughput ingestion, and strict SLA and correctness requirements. We’re looking for an engineer to own our data architecture end-to-end: modeling, schema evolution, performance, partitioning, RLS, observability, and alignment with our event system and forecasting engine.

This role sits at the intersection of database engineering, data modeling, platform reliability, and product architecture. If you’re motivated by building durable data systems that scale, stay clean over time, and power critical forecasting logic, you’ll thrive here.

What You’ll Own
  • Own the structure, evolution, and quality of our Postgres schemas across all domains (forecasting, demand ingestion, eventing, TSL, geolocation, etc.).
  • Design high-integrity, multi-tenant-safe data models using UUID PKs, real foreign keys, RLS, partitioning, and row-level auditability.
  • Work with engineering leadership to shape the canonical data model behind demand, installed base, forecasting, TSL, and analytics workloads.
  • Ensure schemas align with business invariants and support future growth without fragmentation.
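As a flavor of the modeling work above, here is a minimal multi-tenant table sketch with a UUID primary key, a real foreign key, and RLS. Table, column, and policy names (`forecast`, `tenant`, `app.tenant_id`) are illustrative, not Wild Ducks' actual schema:

```sql
-- Hypothetical multi-tenant table: UUID PK, real FK, tenant scoping.
CREATE TABLE forecast (
    id         uuid PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id  uuid NOT NULL REFERENCES tenant (id),
    created_at timestamptz NOT NULL DEFAULT now(),
    payload    jsonb NOT NULL
);

-- Row-level security: a session only sees rows for the tenant it has
-- set in a session variable (one common RLS convention).
ALTER TABLE forecast ENABLE ROW LEVEL SECURITY;
CREATE POLICY tenant_isolation ON forecast
    USING (tenant_id = current_setting('app.tenant_id')::uuid);
```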
Performance, Reliability & Partitioning
  • Design and maintain table partitioning strategies (time-based and tenant-based) for high-throughput workloads.
  • Diagnose and resolve performance bottlenecks: query tuning, index planning, materialized views, caching layers.
  • Build and maintain observability around database health, query performance, storage growth, and index efficiency.
  • Partner with ingestion, forecasting, and orchestration teams to ensure all pipelines read/write safely with proper RLS, locking discipline, and idempotency.
  • Improve and harden cross-service data flows (Django ORM, SQLAlchemy, Temporal activities, Pulsar event persistence).
  • Maintain the event → DB → projection/read-model lifecycle and ensure correctness across domains.
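For the partitioning work above, a time-based range partition is the typical Postgres shape for high-throughput ingestion. This sketch uses hypothetical names; note that the partition key must be included in the primary key:

```sql
-- Hypothetical time-partitioned ingestion table.
CREATE TABLE demand_event (
    id          uuid NOT NULL DEFAULT gen_random_uuid(),
    tenant_id   uuid NOT NULL,
    occurred_at timestamptz NOT NULL,
    payload     jsonb NOT NULL,
    PRIMARY KEY (id, occurred_at)  -- partition key must be part of the PK
) PARTITION BY RANGE (occurred_at);

-- One partition per month; new partitions are created ahead of time.
CREATE TABLE demand_event_2026_02 PARTITION OF demand_event
    FOR VALUES FROM ('2026-02-01') TO ('2026-03-01');
```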
Governance, Integrity & Tooling
  • Define and enforce our schema migration standards (zero-downtime migrations, NOT VALID → VALID foreign keys, safe column evolution).
  • Build internal tools that keep the data platform clean (schema diffing, RLS linters, migration validation pipelines).
  • Partner with the CTO to establish data governance, retention, backup, and archiving policies.
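The NOT VALID → VALID pattern mentioned above is standard Postgres practice for adding a foreign key without a long table scan under lock. A sketch, with hypothetical table and constraint names:

```sql
-- Step 1: add the constraint without checking existing rows
-- (only a brief lock; new writes are checked immediately).
ALTER TABLE forecast
    ADD CONSTRAINT forecast_tenant_fk
    FOREIGN KEY (tenant_id) REFERENCES tenant (id) NOT VALID;

-- Step 2: validate existing rows later, under a weaker lock
-- (SHARE UPDATE EXCLUSIVE), so reads and writes continue.
ALTER TABLE forecast VALIDATE CONSTRAINT forecast_tenant_fk;
```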
Strategic & Cross-Functional Work
  • Work closely with product and engineering teams to design data models that match domain needs (parts, routing, forecasting, TSL).
  • Support analytics and insights teams by defining readable, durable read models.
  • Participate in architectural reviews and major feature design to ensure the data layer is scalable and future-proof.
Who You Are – Core Traits
  • You think in systems, invariants, and lifecycles, not just tables.
  • You are obsessive about correctness, consistency, and clarity in data structures.
  • You communicate clearly and collaboratively with engineers across backend, eventing, forecasting, and DevOps.
  • You enjoy both designing new models and untangling old ones.
Experience That Helps
  • Deep familiarity with Postgres: querying, indexing, performance tuning, partitioning, RLS, WAL, explain plans.
  • Experience designing multi-tenant schemas with strong security boundaries.
  • Knowledge of event-driven systems and how data behaves around Pulsar/Kafka, outbox patterns, and projection models.
  • Comfort with Python (Django, SQLAlchemy) or equivalent backend frameworks.
  • Experience building or evolving data architectures for SaaS, logistics, or forecasting platforms.
  • Experience supporting high-throughput ingestion systems.
  • Experience designing data warehouse adjacencies (future ClickHouse, OLAP layers) is a plus.
  • Bonus: familiarity with forecasting concepts (demand, installed base, TSL, planning nodes) or willingness to learn.
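The outbox pattern mentioned above is the usual way to keep Pulsar/Kafka events consistent with database state: the event row is written in the same transaction as the state change, and a separate relay publishes it. A minimal sketch with hypothetical names and a placeholder UUID:

```sql
-- Hypothetical outbox table; a relay process publishes unpublished rows
-- to Pulsar and stamps published_at.
CREATE TABLE outbox (
    id           bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    aggregate_id uuid NOT NULL,
    event_type   text NOT NULL,
    payload      jsonb NOT NULL,
    created_at   timestamptz NOT NULL DEFAULT now(),
    published_at timestamptz
);

-- State change and event land in one transaction: either both commit
-- or neither does, so the event stream never diverges from the data.
BEGIN;
UPDATE installed_base SET quantity = quantity + 1
 WHERE id = '00000000-0000-0000-0000-000000000001';
INSERT INTO outbox (aggregate_id, event_type, payload)
VALUES ('00000000-0000-0000-0000-000000000001',
        'installed_base.updated', '{"delta": 1}');
COMMIT;
```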
What You’ll Work With
  • Postgres 14+ with multi-tenant RLS
  • Django ORM, custom SQL, Temporal workflows
  • Redis, caching layers, materialized views
  • GKE Autopilot, Terraform-managed infra
  • Tools for schema evolution (Sqitch, Alembic-like tooling, Django migrations)
How We Work
  • Hybrid: remote-friendly with periodic in-person design sessions.
  • No silos: you work across forecasting, ingestion, eventing, and infra.
  • High trust, high ownership: your work defines the foundation of our system for years ahead.
  • Tight collaboration with platform, data science, and product teams.