Purpose:
Build and maintain scalable, reliable data pipelines and analytics-ready models. This is a hands-on, customer-facing role requiring strong problem-solving, ownership, and the ability to work independently with US-based customers.
Key Responsibilities:
- Build ELT pipelines using BigQuery or Snowflake, dbt, and cloud services
- Write production-grade Python code for ingestion, transformation, and automation
- Design analytics-ready data models
- Work directly with customers to gather and clarify requirements
- Ensure data quality via testing, validation, and monitoring
- Debug pipeline failures, data issues, and performance bottlenecks
- Participate in code reviews and follow engineering best practices
- Maintain 3–5 hours of overlap with US time zones (EST/CST/PST)
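As a rough illustration of the ingestion, transformation, and data-quality work described above, here is a minimal Python sketch. The record fields (`order_id`, `amount`) and validation rules are hypothetical, not taken from any actual customer schema:

```python
from datetime import datetime, timezone

def transform_orders(raw_rows):
    """Normalize raw order records and separate out quality failures.

    Hypothetical example: the field names and rules are illustrative,
    not from any specific pipeline.
    """
    clean, rejected = [], []
    for row in raw_rows:
        # Data-quality gate: required keys must be present and non-null.
        if not row.get("order_id") or row.get("amount") is None:
            rejected.append(row)
            continue
        clean.append({
            "order_id": str(row["order_id"]).strip(),
            "amount": round(float(row["amount"]), 2),
            # Load timestamp for downstream monitoring/auditing.
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        })
    return clean, rejected

clean, rejected = transform_orders([
    {"order_id": " A-1 ", "amount": "19.991"},
    {"order_id": None, "amount": 5},
])
print(len(clean), len(rejected))  # → 1 1
```

In a real pipeline the rejected rows would typically be routed to a quarantine table and surfaced through monitoring rather than silently dropped.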
Required Qualifications:
- 2–4 years of data engineering or analytics engineering experience
- Strong SQL (joins, CTEs, window functions)
- Hands-on experience with BigQuery or Snowflake
- Strong Python skills, applied in production pipelines
- Experience with dbt (models and tests)
- Prior customer-facing or stakeholder-facing experience
- Strong ownership and accountability mindset
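For illustration, the SQL skills listed above (CTEs, window functions) can be exercised in a self-contained way using Python's built-in sqlite3 module; the `orders` table and its columns are invented for this sketch, and window functions require SQLite 3.25 or newer:

```python
import sqlite3

# In-memory database with a small invented orders table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES ('a', 10), ('a', 30), ('b', 20);
""")

# CTE + window function: top order per customer by amount.
rows = conn.execute("""
    WITH ranked AS (
        SELECT customer,
               amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer ORDER BY amount DESC
               ) AS rn
        FROM orders
    )
    SELECT customer, amount FROM ranked WHERE rn = 1
    ORDER BY customer
""").fetchall()
print(rows)  # → [('a', 30.0), ('b', 20.0)]
```

The same ranking pattern carries over to BigQuery and Snowflake with minor dialect differences (e.g. `QUALIFY` in Snowflake).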