Engineer II, Big Data Engineering
Listed on 2025-12-07
IT/Tech
Data Engineer, Big Data
Overview
Engineer II (Big Data Engineering)
New York City, NY
Boston, MA
Los Angeles, CA
Broomfield, CO
Hybrid Schedule (M/F remote, T/W/TH in-office)
At Magnite, we cultivate an environment of continuous growth and collaboration. Our work impacts what millions of people read, watch, and buy, and we’re looking for people to help us tackle that responsibility with creativity and focus. Magnite (NASDAQ: MGNI) is the world’s largest independent sell-side advertising platform. Publishers use our technology to monetize their content across all screens and formats including CTV / streaming, online video, display, and audio.
Our tech fuels billions of transactions per day!
Are you excited about high-performance big data implementation? Then great! Magnite is growing, and we need software developers who are thorough and agile, can break down and solve problems, and have a strong drive to get things done. On the DV+ Data Engineering team you will solve real-world problems on a big data stack where accuracy and speed are paramount, take end-to-end responsibility for your systems, and influence the direction of technology that impacts customers around the world.
About this team:
We own the data systems that process hundreds of billions of events per day for the DV+ platform. We are looking for a data or software engineer with both conceptual and hands-on experience in big data development. This is a fully integrated environment that includes upstream data ingestion processes, proprietary and open-source DBMSs, and a large-scale (full-cycle, multi-billion-row) data warehouse environment.
As a member of our data engineering team, you will be part of a service group responsible for the continued expansion of our data platform. Ideal candidates are excited about all aspects of big data development, including data transport, data processing, and data warehouse/ETL integration, and are quick learners and self-starters. This is a demanding role that requires hands-on experience with big data processing development on Linux.
You will be responsible for day-to-day operations as well as new development. We are seeking a candidate with solid software development life cycle skills and experience building data services with Java, Scala, and scripting languages such as Python. We are responsible for technological and operational excellence across our domain.
What you will be doing
- Design, develop, and support applications across our big data platforms, including Hadoop, Kafka, ETL, and data warehouse integration
- Develop applications primarily in Java and Spark on our big data stack, along with scripting languages (Python, shell, etc.) to support application execution
- Participate in the design and implementation of the full data services cycle, from data transport and processing through ETL to data delivery for reporting
- Perform data analysis and troubleshooting to support day-to-day production operations
- Proactively identify, troubleshoot, and resolve production data and performance issues
What we are looking for
- Proficiency and hands-on experience with data engineering technologies: Hadoop, Spark, Kafka, Druid, etc.
- Proficiency in Java and scripting languages (UNIX shell, Perl, Python, etc.)
- Familiarity with process, infrastructure, and application management tools: JIRA, Jenkins, GitHub, etc.
- Ability to follow standard development and engineering practices and understand basic to advanced concepts in computer architecture, data structures, and programming
- Experience with testing, debugging, data quality checks, and performance tuning of applications
- Ability to communicate effectively with end users and work in a team environment
- Bachelor’s degree in CS/EE or related science
Nice To Have
- Experience in the ad-tech industry
- Experience with Massively Parallel Processing (MPP) architectures
- Experience with MapR Hadoop
- Experience with SQL for data transformation, custom reporting and analysis, and data investigation
Perks and Benefits:
- Comprehensive Healthcare Coverage from Day One
- Generous Time Off
- Holiday Breaks and…