Data Engineer III
Listed on 2026-02-10
IT/Tech
Cloud Computing, Data Engineer
About The Role
It’s a great time to join us at ARC! ARC accelerates the growth of global air travel by delivering forward-looking travel data, flexible distribution services and other innovative industry solutions. We are a leading travel intelligence company with the world’s largest, most comprehensive global airline ticket dataset, including more than 15 billion passenger flights. By working here, you can contribute to solutions and expertise that strengthen economies and enrich lives.
We think big, embrace challenges and explore new ideas to lead the way for the travel industry.
ARC is looking for a Data Engineer III to join our team! In this role, you will participate in a flexible agile environment to provide software development and product delivery support for an array of data products utilizing the world’s most comprehensive and accurate set of airline ticketing data. You will be responsible for product delivery, data pipelines, and the product lifecycle, with direction from the Product Owner and Solution Owner and technical guidance from Solution Architects.
You'll work with the most current cloud technologies to explore and innovate new and better ways of providing first-class data to a growing customer base. Additionally, you'll ensure the highest-quality products and timely delivery, develop non-functional requirements such as SLAs/SLOs/SLIs, operational support, metrics, alerts, and notifications, and optimize for efficiency.
- Support data pipeline development by contributing to and/or leveraging existing architectural patterns to deliver optimal data products; establish and communicate design patterns, automation, recovery, and operational runbooks. Collaborate cross-functionally to provide overall technical and thought leadership to other teams as needed.
- Partner with product owners and business SMEs to analyze the business need and provide a supportable and sustainable engineered solution. Ensure that the overall technical solution is aligned with the business needs and adheres to ARC’s Architectural Guiding Principles and the AWS Well‑Architected Framework.
- Help drive the creation and modification of product portfolio components; identify and engage all technical resources necessary to contribute to the solution. Ensure the solution is consistent with ARC architecture, design, and development standards.
- Ensure databases, data pipelines and data platforms are functioning, configured appropriately, and operating efficiently for all ARC products, systems, and services. Ensure appropriate logging, monitoring, metrics gathering, and alerts are configured and easily consumable by operations support and product owners and managers.
- Stay current with the latest cloud technologies, patterns, and methodologies; share knowledge by clearly articulating results and ideas to key stakeholders. May be required to present ideas to a larger audience for review and buy-in.
- Bachelor’s degree in computer science or related field; or equivalent experience
- 3+ years of cloud database development and/or administration, including SQL
- 3+ years of experience with full-cycle application development (full SDLC experience: design, development, delivery, etc.)
- 3+ years with Agile, Scrum, DevOps, and continuous integration/continuous delivery (CI/CD)
- 3+ years of experience implementing modern applications using:
- Data warehouse platforms (such as Snowflake and Redshift). Expertise with SQL, database design/structures, ETL/ELT design patterns, and data mart structures (star, snowflake schemas, etc.).
- Cloud-based solutions/technologies (AWS, Google, Azure). AWS tech stack including, but not limited to, Lambda, API Gateway, DynamoDB, S3, CloudWatch, SNS/SQS, Step Functions, Fargate
- Implementation of modern application and infrastructure design patterns, including microservices and containers, and disposable, reactive, stateless, and distributed patterns
- Open-source technologies including, but not limited to, Python, NodeJS, OpenJDK, React, and NoSQL/DynamoDB database(s)
- Familiarity with DevOps tools including, but not limited to, Terraform/CloudFormation and CI/CD pipeline tools like GitLab,…