
Integration Engineer - Platform Engineer, Real Time Event Streaming

Job in Chicago, Cook County, Illinois, 60290, USA
Listing for: McDonald's Corporation
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range or Industry Benchmark: 100,000 - 125,000 USD per year
Job Description & How to Apply Below
Position: Integration Engineer - Platform Engineer, Real Time Event Streaming

Company Description

McDonald’s growth strategy, Accelerating the Arches, encompasses all aspects of our business as the leading global omni-channel restaurant brand. As the consumer landscape shifts, we are using our competitive advantages to further strengthen our brand. One of our core growth strategies is to Double Down on the 3 Ds (Delivery, Digital and Drive Thru). McDonald’s will accelerate technology innovation so that 65M+ customers a day have a fast, easy experience, whether at one of our 25,000 and growing Drive Thrus, through McDelivery, dine-in or takeaway.

McDonald’s Global Technology is here to power tomorrow’s feel-good moments.

That’s why you’ll find us at the forefront of transformative technology, exploring new and innovative ways to serve our millions of customers and spread happiness one delicious Hot Fudge Sundae-dipped fry at a time. Using AI, robotics and emerging tech, we’re digitizing the Golden Arches. Combine that with our unparalleled global scale, and we’re reshaping all areas of the business, industry and every community that is home to a McDonald’s restaurant.

We face complex tech challenges every day. But that’s where our diverse and talented teams come in. They’re made up of the best and brightest from all over the globe, and they thrive in the space where feel-good meets fast-paced.

Check out the McDonald’s Global Technology Technical Blog to learn how technology and our global team are directly enabling the Accelerating the Arches strategy.

Department Overview

We are seeking a highly skilled Integration Platform Engineer to join the real-time event streaming team within our growing Data & Integration Platform Engineering team. This role will be pivotal in designing, building, and supporting our next-generation real-time event streaming and processing platform. You will work closely with data engineers, DevOps, and product teams to ensure scalable, reliable, and secure integration patterns for streaming data pipelines that power critical business applications.

Responsibilities
  • Design, implement, and operate real-time data streaming pipelines using Apache Kafka (Confluent Cloud, AWS MSK), Google Cloud Pub/Sub, or similar technologies.
  • Implement and optimize deployment processes for real-time streaming platforms, ensuring efficient, reliable data flow
  • Develop and maintain Kafka topics, schemas, producers, consumers, and stream processing applications (e.g., ksqlDB, Kafka Streams, Flink); see the illustrative sketch after this list.
  • Build event-driven integration patterns that serve both internal systems and customer-facing APIs.
  • Collaborate with application teams to define event schemas and implement schema registry best practices.
  • Implement robust monitoring, alerting, and observability for Kafka infrastructure and data flows to ensure platform stability and promptly address performance issues and incidents
  • Partner with cloud, security, and data governance teams to enforce RBAC, data retention, and encryption, and to ensure compliance with regulatory standards.
  • Automate platform provisioning and configuration using Terraform, CI/CD pipelines, and Infrastructure as Code (IaC) best practices.
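
For illustration only (not part of the posting): a minimal sketch of the kind of producer/consumer work described in the bullets above, using the confluent-kafka Python client. The broker address, topic name, and consumer group are hypothetical placeholders, not details from this listing.

  # Illustrative sketch only: publish and consume a hypothetical "order.events" topic.
  import json

  from confluent_kafka import Consumer, Producer

  BOOTSTRAP = "localhost:9092"  # placeholder; in practice Confluent Cloud / MSK brokers
  TOPIC = "order.events"        # hypothetical topic name

  def delivery_report(err, msg):
      # Log whether the broker acknowledged the event.
      if err is not None:
          print(f"delivery failed: {err}")
      else:
          print(f"delivered to {msg.topic()}[{msg.partition()}] @ offset {msg.offset()}")

  def produce_event(order_id: str, payload: dict) -> None:
      producer = Producer({"bootstrap.servers": BOOTSTRAP})
      producer.produce(TOPIC, key=order_id, value=json.dumps(payload), callback=delivery_report)
      producer.flush()  # block until the broker acknowledges the event

  def consume_events() -> None:
      consumer = Consumer({
          "bootstrap.servers": BOOTSTRAP,
          "group.id": "order-events-demo",  # hypothetical consumer group
          "auto.offset.reset": "earliest",
      })
      consumer.subscribe([TOPIC])
      try:
          while True:
              msg = consumer.poll(1.0)
              if msg is None or msg.error():
                  continue
              print(f"consumed {msg.key()}: {json.loads(msg.value())}")
      finally:
          consumer.close()
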
Additional Responsibilities
  • Identify and address performance bottlenecks to improve system reliability, scalability, and responsiveness
  • Work closely with data engineering, DevOps, and analytics teams to align streaming solutions with business needs
  • Document infrastructure setups, configurations, and troubleshooting guides to improve team knowledge and consistency
  • Design, build, and maintain SDKs for the event streaming platform; a minimal wrapper sketch follows this list.
  • Establish and build an application onboarding process and framework for event streaming
  • Support MFT (managed file transfer) and API management platforms
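
For illustration only: a sketch of the thin publishing "SDK" the bullets above describe, wrapping the confluent-kafka producer so onboarding application teams do not handle client configuration directly. All names are hypothetical.

  # Illustrative sketch only: a hypothetical thin SDK wrapper around the Kafka producer.
  import json

  from confluent_kafka import Producer

  class EventPublisher:
      """Minimal publishing facade: application teams pass a topic, key, and dict payload."""

      def __init__(self, bootstrap_servers: str, client_id: str):
          self._producer = Producer({
              "bootstrap.servers": bootstrap_servers,
              "client.id": client_id,      # lets the platform team attribute traffic
              "enable.idempotence": True,  # avoid duplicate events on retry
          })

      def publish(self, topic: str, key: str, event: dict) -> None:
          self._producer.produce(topic, key=key, value=json.dumps(event))
          self._producer.poll(0)  # serve delivery callbacks without blocking

      def close(self) -> None:
          self._producer.flush()

  # Hypothetical usage by an onboarding application team:
  # publisher = EventPublisher("broker:9092", client_id="loyalty-service")
  # publisher.publish("loyalty.points.awarded", key="cust-42", event={"points": 10})
  # publisher.close()
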
Qualifications
  • 5+ years of experience in data engineering, platform engineering, or integration engineering roles.
  • Strong expertise in the Apache Kafka ecosystem, with production experience using Confluent Cloud or Confluent Platform; experience with real-time streaming platforms (e.g., Apache Kafka, AWS Kinesis, Google Pub/Sub) and event-driven architectures
  • Experience designing scalable event-driven architectures and real-time data ingestion solutions.
  • Proficiency in Terraform, CI/CD pipelines (e.g., GitLab,…
To View & Apply for jobs on this site that accept applications from your location or country, tap the button below to make a Search.
(If this job is in fact in your jurisdiction, then you may be using a Proxy or VPN to access this site, and to progress further, you should change your connectivity to another mobile device or PC).
 
 
 
Search for further Jobs Here:
(Try combinations for better Results! Or enter less keywords for broader Results)
Location
Increase/decrease your Search Radius (miles)

Job Posting Language
Employment Category
Education (minimum level)
Filters
Education Level
Experience Level (years)
Posted in last:
Salary