Senior Data Analytics Engineer (m/f/d)
Mission, Johnson County, Kansas, 66201, USA
Listed on 2026-02-12
Software Development
Data Engineer, Software Engineer
Intro
At Adsquare, our mission is driven by our core focus:
Passion – Solving complex challenges with great people, tech, and data.
Niche – Location Intelligence for Programmatic Advertisers.
Our core values are integral to everything we do:
Drive: We turn ambition into action.
Resilience: We adapt, persevere, and grow stronger.
No BS: We value honesty, transparency, and clear communication.
Humble: We choose modesty over vanity and let results speak for themselves.
Moral Compass: We do the right thing with fairness, integrity, and respect.
We seek candidates who not only bring top-tier technical expertise but also embody these values in every aspect of their work.
Your Profile
We are looking for a Senior Data Analytics Engineer who approaches data with a software engineering mindset. You will join our Data Solutions squad to architect, build, and maintain production-grade data platforms.
This is not a Data Analyst role. While you will understand the business context, your primary focus is technical: designing scalable architectures, writing clean and testable Python/SQL code, automating deployments, and optimizing cloud infrastructure. You will act as a technical standard-bearer, ensuring our pipelines are reliable, cost‑effective, and maintainable.
We are looking for a candidate with strong analytics engineering experience or a strong background in data-focused backend development.
Key Responsibilities
End-to-End Pipeline Engineering: Design and deploy robust transformation pipelines for high-volume data (TB scale). You are responsible for the full lifecycle: ingestion, transformation, testing (unit/integration), deployment, and monitoring.
Architecture & Optimization: Take ownership of data modeling and architectural decisions. You will actively refactor legacy code to improve performance and reduce cloud compute costs (e.g., optimizing Athena/Snowflake/Redshift clustering or AWS Glue jobs).
Software Engineering Best Practices: Elevate the team's technical standards by implementing CI/CD workflows, containerization (Docker), and rigorous automated testing. You ensure that "it works on my machine" is never an excuse.
Data Quality & Observability: Rather than just building business dashboards, you build infrastructure monitoring. You will implement alerts and checks (e.g., dbt tests, Great Expectations) to catch data quality issues before they reach stakeholders.
Technical Leadership: Mentor junior engineers through code reviews and technical planning. You will be the voice of engineering rigor within the squad.
4+ years of experience specifically in Analytics Engineering or Data Engineering.
Advanced Python proficiency beyond scripting: You write modular, object-oriented, production-ready code. You use relevant libraries and frameworks for testing, and you understand exception handling and logging.
Deep expertise in SQL & dbt: You don't just write queries; you build scalable data models (Jinja templating, macros, incremental strategies) and understand query execution plans to optimize performance.
Software Engineering Fundamentals: Proven experience with Git workflows, CI/CD pipelines (e.g., GitHub Actions, GitLab CI), and containerization (Docker). You understand how to deploy code reliably.
AWS Cloud Native Experience: Hands-on experience building serverless architectures using AWS Lambda, Step Functions, Glue, and Athena.
Testing Mindset: You can demonstrate experience implementing unit tests and integration tests for data pipelines. You do not rely on manual checks or stakeholder validation for quality assurance.
Data Warehouse Ops: Deep understanding of warehousing architecture (Snowflake, Redshift, or BigQuery), including partitioning, clustering, and cost governance.
- Experience with Infrastructure as Code (Terraform) to manage cloud resources.
- Experience with orchestration tools like Airflow, Dagster, or Prefect.
- Knowledge of big data processing frameworks (Spark/PySpark).
- Familiarity with dashboarding tools (Streamlit, Preset, Tableau, etc.). Note: this is helpful for debugging and monitoring, but not the core function of the role.
- B.S. or M.S. in Computer Science, Engineering, Mathematics, or another relevant field.
On top of a competitive package…
- We are open to flexible work models: we work in a hybrid mode, and remotely from anywhere in the world for up to 3 months per year
- To encourage education and professional growth, we offer an individual yearly budget of €1,200
- You are entitled to 30 vacation days per year
- We offer an Urban Sports Club membership and a company pension scheme
- Regular team events and company events organised by our People team (Trust us, they know how to throw a party!)
- We equip you with the latest hardware and provide you with all the tools you need to thrive