Data Engineer II – QuantumBlack, AI by McKinsey
Listed on 2025-12-19
IT/Tech
Data Engineer, Cloud Computing
The McKinsey recruiting team will be celebrating the holidays, and no online application support will be available from December 19th at 5:00pm New York time. Regular support will resume on January 5th. For commonly asked questions, please refer to our FAQ page. Thank you, and happy holidays!
Data Engineer II – QuantumBlack, AI by McKinsey
Your Impact

As a Data Engineer II, you will design, build, and optimize modern data platforms that power advanced analytics and AI solutions. You will collaborate with clients and interdisciplinary teams to architect scalable pipelines, manage secure and compliant data environments, and unlock the value of complex datasets across industries. You will sharpen your expertise by working on innovative projects, contributing to R&D, and learning from top-tier talent in a dynamic, global environment.
Your work will drive lasting impact. By ensuring data is accurate, accessible, and production‑ready, you will enable clients to accelerate digital transformations, adopt AI responsibly, and achieve measurable business outcomes. Example projects include:
- Develop a streaming data platform to integrate telemetry for predictive maintenance in aerospace systems
- Implement secure data pipelines that reduce time‑to‑insight for a Fortune 500 utility company
- Optimize large‑scale batch and streaming workflows for a global financial services client, cutting infrastructure costs while improving performance
- Develop pipelines for embeddings and vector databases to enable retrieval‑augmented generation (RAG) for a global defense client
You will work in cross‑functional Agile teams with Data Scientists, Machine Learning Engineers, Designers, and domain experts to deliver high‑quality analytics solutions. Partnering closely with clients—from data owners to C‑level executives—you will shape data ecosystems that drive innovation and long‑term resilience.
Your Growth

Driving lasting impact and building long‑term capabilities with our clients is not easy work. You are the kind of person who thrives in a high performance/high reward culture—doing hard things, picking yourself up when you stumble, and having the resilience to try another way forward.
- Continuous learning: Our learning and apprenticeship culture, backed by structured programs, is all about helping you grow while creating an environment where feedback is clear, actionable, and focused on your development.
- A voice that matters: From day one, we value your ideas and contributions. You will make a tangible impact by offering innovative ideas and practical solutions, all while upholding our unwavering commitment to ethics and integrity.
- Global community: With colleagues across 65+ countries and over 100 different nationalities, our firm’s diversity fuels creativity and helps us come up with the best solutions for our clients. Plus, you will have the opportunity to learn from exceptional colleagues with diverse backgrounds and experiences.
- World‑class benefits: On top of a competitive salary (based on your location, experience, and skills), we provide a comprehensive benefits package to enable holistic well‑being for you and your family.
Your Qualifications and Skills

- Degree in Computer Science, Business Analytics, Engineering, Mathematics, or a related field
- 2+ years of professional experience in data engineering, software engineering, or adjacent technical roles
- Proficiency in Python, Scala, or Java for production‑grade pipelines, with strong skills in SQL and PySpark
- Hands‑on experience with cloud platforms (AWS, GCP, Azure, Oracle) and modern data storage/warehouse solutions such as Snowflake, BigQuery, Redshift, and Delta Lake
- Practical experience with Databricks, AWS Glue, and transformation frameworks like dbt, Dataform, or Databricks Asset Bundles
- Knowledge of distributed systems (Spark, Dask, Flink) and streaming platforms (Kafka, Kinesis, Pulsar) for real‑time and batch processing
- Familiarity with workflow orchestration tools (Airflow, Dagster, Prefect), CI/CD for data workflows, and infrastructure‑as‑code (Terraform, CloudFormation)
- Understanding of DataOps principles including pipeline monitoring,…