
Senior Data Engineer

Job in Lisle, DuPage County, Illinois, 60532, USA
Listing for: Adtalem Global Education
Full Time position
Listed on 2025-12-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Job Description & How to Apply Below

Company Description

Adtalem Global Education is a national leader in post‑secondary education and leading provider of professional talent to the healthcare industry. Adtalem educates and empowers students with the knowledge and skills to become leaders in their communities and make a lasting impact on public health, well‑being and beyond. Through equitable access to education, environments that nurture student success, and a focus on expanding and diversifying the talent pipeline in healthcare, Adtalem is building a brighter future for communities and the world.

Adtalem is the parent organization of American University of the Caribbean School of Medicine, Chamberlain University, Ross University School of Medicine, Ross University School of Veterinary Medicine and Walden University.

We operate on a hybrid schedule with four in‑office days per week (Monday–Thursday). This approach enhances creativity, innovation, communication, and relationship‑building, fostering a dynamic and collaborative work environment.

Visit our website for more information and follow us on LinkedIn and Instagram.

Job Description

Adtalem is a data-driven organization. The Data Engineering team builds data solutions that power strategic and tactical business decisions and support Analytics and Artificial Intelligence operations. By implementing the data platform, data pipelines, and data governance policies, this team provides the basis for decision-making at Adtalem. Adtalem is looking for a Senior Data Engineer to design, build, and maintain robust data engineering solutions that support our company's innovation initiatives and growth objectives.

  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets.
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth.
  • Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound.
  • Model the data platform by applying business logic and building objects in the platform's semantic layer.
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics.
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
  • Ensure the quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root.
  • Document the design and support strategy of the data pipelines.
  • Capture, store, and socialize data lineage and operational metadata.
  • Troubleshoot and resolve data engineering issues as they arise.
  • Develop REST APIs to expose data to other teams within the company.
  • Stay current with emerging technologies and industry trends related to big data, streaming data, and synthetic data generation.
  • Mentor and guide junior data engineers.
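To illustrate the kind of data-quality work described above, here is a minimal sketch of a pipeline-stage validator that separates valid records from rejects with reasons, so quality issues can be traced back to their source. The field names and rules are hypothetical; a real pipeline would load them from a governance catalog.

```python
from dataclasses import dataclass, field

# Hypothetical critical data elements; real rules would come from
# the organization's data governance policies.
REQUIRED_FIELDS = ("student_id", "program", "enrollment_date")

@dataclass
class ValidationResult:
    valid: list = field(default_factory=list)
    rejected: list = field(default_factory=list)  # (record, reasons) pairs

def validate_batch(records):
    """Split a batch into valid records and rejects with reasons."""
    result = ValidationResult()
    for rec in records:
        # Flag each required field that is absent or empty.
        reasons = [f"missing {k}" for k in REQUIRED_FIELDS if not rec.get(k)]
        if reasons:
            result.rejected.append((rec, reasons))
        else:
            result.valid.append(rec)
    return result
```

Keeping the rejection reasons alongside the records is what makes root-cause remediation with business and system owners possible, rather than silently dropping bad rows.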
Qualifications
  • Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field.
  • Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field.
  • Two (2)+ years of experience in Google Cloud with services such as BigQuery, Composer, GCS, Datastream, Dataflow, BQML, and Vertex AI.
  • Six (6)+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics.
  • Hands-on experience working with real-time, unstructured, and synthetic data; this role will be instrumental in advancing our data platform capabilities.
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar.
  • Expert knowledge of Python programming and SQL.
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services.
  • Experience working with Airflow as a workflow management tool and building operators to connect to, extract, and ingest data as needed.
  • Familiarity with synthetic data generation and unstructured data processing.
  • Experience in AI/ML data pipelines and frameworks.
  • Excellent organizational, prioritization, and analytical abilities.
  • Proven experience…
Position Requirements
10+ Years work experience