Data Engineer — Analytics & Migration Validation; Hands-On SQL
Listed on 2026-02-16
IT/Tech
Data Engineer, Data Analyst
Remote/Hybrid
4–7+ years
Hands-On SQL
About Building Blocks
Whether delivering outstanding client work or helping build the company from within, our diverse, talented team shares a single mission: to create beautiful experiences that positively impact people's lives and the businesses we serve.
Our culture is more purpose than job, blending the pursuit of genuine human connection with a meticulous approach to the things we make, all with an optimistic spirit. We're a company young enough to be fun and rebellious, yet bold and driven enough to make a real impact on the industry.
Role Overview
We're looking for a hands-on Data Engineer to own the data layer of customer go-lives: ensuring migrations are validated, analytics pipelines are hardened, and business dashboards are powered by accurate, performant data. You'll be responsible for validating and signing off on end-to-end data migrations, building high-quality SQL models, and implementing automated data quality checks to catch issues early.
This is a highly technical, impact-driven role focused on migration testing, SQL performance tuning, and data quality automation, aligned with AWS and industry best practices.
Key Responsibilities
- End-to-End Migration Validation: Design and execute functional and performance validation for data migrations, including parity, nullability, PK/FK, duplication, and sampling checks, with complete documentation and sign-off aligned to AWS migration testing guidelines (see the first sketch after this list).
- Advanced SQL Development: Write and optimize analytical SQL (CTEs, window functions, incremental loads). Use EXPLAIN plans to tune query performance and ensure indexes and statistics support BI workloads (see the second sketch after this list).
- Automated Data Quality Frameworks: Implement and maintain data validation frameworks using Great Expectations, Deequ, or similar tools. Automate validation and publish Data Docs to ensure transparency across teams (see the third sketch after this list).
- Modeling & Documentation (dbt): If using dbt, build models with tests, exposures, and documentation to ensure traceability between dashboards and upstream data sources.
- Orchestration & Reliability: Productionize data validation and transformation jobs within Airflow DAGs, ensuring well-defined SLAs, alerts, and reliable pipeline operations (see the fourth sketch after this list).
- (Optional) Cloud Data Engineering: Build incremental pipelines and optimize batch processing for Snowflake (Streams & Tasks) or PostgreSQL, ensuring performance and cost efficiency.
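To make the migration-validation work concrete, here is a minimal sketch of the kind of parity, duplication, and nullability checks involved, assuming a PostgreSQL warehouse reachable via psycopg2; the schema, table, and column names (legacy.orders, analytics.orders, order_id, customer_id) are hypothetical and not taken from this posting.

```python
# Minimal migration-parity sketch. Table, schema, and DSN values are hypothetical.
import psycopg2

PARITY_CHECKS = {
    # Row-count parity between the legacy source and the migrated target.
    "row_count_parity": """
        SELECT (SELECT COUNT(*) FROM legacy.orders)    AS source_rows,
               (SELECT COUNT(*) FROM analytics.orders) AS target_rows
    """,
    # Duplicate primary keys introduced during the copy.
    "duplicate_pks": """
        SELECT order_id, COUNT(*) AS n
        FROM analytics.orders
        GROUP BY order_id
        HAVING COUNT(*) > 1
    """,
    # Nullability check on a column that must be populated after migration.
    "null_customer_ids": """
        SELECT COUNT(*) AS null_rows
        FROM analytics.orders
        WHERE customer_id IS NULL
    """,
}

def run_checks(dsn: str) -> dict:
    """Run each validation query and collect raw results for the sign-off report."""
    results = {}
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        for name, sql in PARITY_CHECKS.items():
            cur.execute(sql)
            results[name] = cur.fetchall()
    return results

if __name__ == "__main__":
    print(run_checks("postgresql://user:password@host:5432/warehouse"))
```

In practice, row-level sampling and checksum comparisons would be layered on top, with results archived alongside the sign-off documentation.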
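Likewise, a sketch of the analytical-SQL side: a CTE with a window function deduplicating an incremental load, with its PostgreSQL plan inspected via EXPLAIN. The staging schema, column names, and watermark parameter are again illustrative assumptions.

```python
# Sketch: deduplicate an incremental load with a window function and inspect its plan.
# Schema, column names, and the watermark parameter are illustrative.
import psycopg2

INCREMENTAL_DEDUP = """
    WITH ranked AS (
        SELECT o.*,
               ROW_NUMBER() OVER (PARTITION BY order_id
                                  ORDER BY updated_at DESC) AS rn
        FROM staging.orders o
        WHERE o.updated_at >= %(watermark)s   -- incremental window
    )
    SELECT * FROM ranked WHERE rn = 1         -- keep the latest version of each key
"""

def explain(dsn: str, watermark: str) -> None:
    """Print the execution plan to confirm index usage and sane row estimates."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + INCREMENTAL_DEDUP,
                    {"watermark": watermark})
        for (line,) in cur.fetchall():
            print(line)
```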
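For the data-quality bullet, the snippet below is deliberately not the Great Expectations or Deequ API; it is a hand-rolled sketch of the shape of the declarative, SQL-backed rules such frameworks automate, run, and report on. Rule names, tables, and thresholds are invented.

```python
# Hand-rolled sketch of declarative data-quality rules (NOT the Great Expectations
# or Deequ API). Each rule is a query returning a count of violating rows.
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    sql: str                  # query returning a single violation count
    max_violations: int = 0   # tolerance before the rule fails

RULES = [
    Rule("orders_pk_not_null",
         "SELECT COUNT(*) FROM analytics.orders WHERE order_id IS NULL"),
    Rule("order_total_non_negative",
         "SELECT COUNT(*) FROM analytics.orders WHERE total_amount < 0"),
]

def evaluate(cursor) -> list[tuple[str, bool, int]]:
    """Return (rule_name, passed, violation_count) tuples for a published report."""
    report = []
    for rule in RULES:
        cursor.execute(rule.sql)
        violations = cursor.fetchone()[0]
        report.append((rule.name, violations <= rule.max_violations, violations))
    return report
```

In a real setup the same rules would typically live as Great Expectations expectation suites (with Data Docs published for transparency) or as Deequ checks on Spark.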
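Finally, a sketch of how such a validation job might be productionized in Airflow, assuming an Airflow 2.4+ deployment; the DAG id, schedule, SLA, and alert address are invented for illustration.

```python
# Sketch: wrap the validation job in an Airflow DAG with retries, an SLA, and
# failure alerting. Assumes Airflow 2.4+; all names and values are illustrative.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def run_migration_checks(**context):
    """Placeholder: call the validation logic sketched above and raise on any
    failed check so Airflow marks the run as failed and alerting fires."""
    ...

with DAG(
    dag_id="migration_validation",
    start_date=datetime(2026, 1, 1),
    schedule="0 6 * * *",              # daily, before business hours
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=10),
        "sla": timedelta(hours=1),     # alert if validation runs long
        "email_on_failure": True,
        "email": ["data-alerts@example.com"],
    },
) as dag:
    validate = PythonOperator(
        task_id="run_migration_checks",
        python_callable=run_migration_checks,
    )
```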
Requirements
- Experience: 4–7+ years as a Data Engineer or Analytics Engineer.
- SQL Expertise: Advanced proficiency in SQL and strong RDBMS fundamentals (PostgreSQL required), with proven experience in query tuning using EXPLAIN/ANALYZE.
- Migration Validation: Hands-on experience designing and executing data migration validation (parity, integrity, and performance testing).
- Tooling Knowledge: Experience with one or more of the following: dbt, Great Expectations or Deequ/PyDeequ, Airflow.
- Version Control: Comfortable with Git-based workflows and CI/CD integration.
- Experience with Snowflake (Streams, Tasks, cost optimization, and warehouse tuning).
- Exposure to BI tools such as Looker, Power BI, Tableau, or Metabase.
- Working knowledge of Python for lightweight data transformations and validation frameworks.