
Data Engineering Manager

Job in New York, New York County, New York, 10261, USA
Listing for: Angi
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description & How to Apply Below
Location: New York

At Angi®, we’ve had one simple mission for 30 years: get jobs done well. We make it happen by connecting homeowners with reliable pros who have the skills they need — and connecting pros with homeowners who have the jobs they want.

Angi at a glance:

  • A new homeowner turns to Angi every second
  • Our network has 150,000+ skilled pros in 50+ service categories
  • + projects brought to Angi (and counting)

Angi® is defining the future of the home services industry, creating an environment where homeowners, service professionals and employees benefit from more jobs done well.

For homeowners, our platform makes it easier than ever to find a skilled pro for repairs, renovations, and everything in between. For pros, we're a business partner who helps them find the work they want and grow their business.

We believe home is the most important place on earth, and so we work hard for our homeowners and our pros — making a real impact on families, communities, and livelihoods.

Angi is an amazing place to call home. We can’t wait to welcome you.

About the team:

Angi is looking for a Data Engineering Manager with strong leadership, technical depth, and delivery experience to help shape the future of our global data platform. You'll lead the Data Processing & Storage (DPS) team within the Data Platform Products domain under the Technical Foundations Zone, the group responsible for building, scaling, and evolving the core data infrastructure that powers analytics, experimentation, and machine learning across Angi.

What you’ll do:

  • Lead and mentor a team of experienced data engineers, supporting their growth, performance, and technical development.
  • Design and deliver large-scale, reliable, and cost-efficient data pipelines and platforms that power analytics, reporting, and data-driven products.
  • Own and evolve the core compute data infrastructure built on Trino, ensuring scalable, performant, and cost-optimized query processing across the data lakehouse environment.
  • Oversee major data migrations from our warehouse systems to modern, cloud-native lakehouse data platforms with minimal disruption.
  • Drive the evolution of our data lakehouse and warehouse ecosystems, optimizing storage, compute, and orchestration for performance and cost efficiency.
  • Collaborate closely with Product, Analytics, and Engineering teams to deliver data infrastructure that enables trustworthy and timely insights.
  • Champion best practices in data modeling, governance, and observability to ensure data quality, discoverability, and reliability across the organization.
  • Contribute to infrastructure and automation, leveraging tools like Terraform, Docker, Kubernetes, and CI/CD pipelines to ensure consistency, scalability, and resilience.
  • Stay technically engaged, participating in architectural design, reviewing implementations, and guiding the team through complex technical decisions.
  • Leverage AI tools (e.g., Copilot, Cursor, Claude Code) to improve coding speed and debug faster.

Who you are:

  • 8+ years of professional experience in data engineering or software engineering, with a deep understanding of distributed data systems and modern data architecture.
  • 2+ years of experience managing engineering teams, including performance management and leading senior engineers.
  • Proven experience building and operating large-scale, cloud-based data platforms, ideally in AWS (S3, Glue, IAM, CLI, etc.) or equivalent environments.
  • Hands-on experience with SQL and Python, and a strong understanding of ETL/ELT workflows and data lifecycle management.
  • Deep experience with data warehouse and data lakehouse technologies such as Snowflake, Redshift, BigQuery, and Trino as the primary compute infrastructure.
  • Proficiency with workflow orchestration tools (Airflow, Dagster, Prefect, or Flyte) and data integration tools (Fivetran, Stitch, Airbyte, Meltano, or Glue).
  • Experience with infrastructure-as-code (Terraform or CloudFormation) and deployment automation.
  • Comfortable with Docker and Kubernetes, deploying and scaling data-intensive workloads in containerized environments.
  • Excellent communication and collaboration skills, able to work effectively across technical and business teams.
  • Preferred…