
Data Engineer

Job in Southfield, Oakland County, Michigan, 48033, USA
Listing for: thyssenkrupp Materials NA, Inc.
Full Time, Part Time position
Listed on 2026-02-16
Job specializations:
  • Software Development
    Data Engineer

Company

With around 480 locations in over 40 countries, thyssenkrupp Materials Services is the biggest materials distributor and service provider in the western world. The broad service spectrum offered by the materials experts enables customers to focus on their individual core business. The area of Materials Services spans two strategic areas: global materials distribution as one-stop-shop - from steel and stainless steel, tubes and pipes, nonferrous metals and specialty materials to plastics and raw materials - and tailored services in the areas of materials management and supply chain management.

An extensive omnichannel architecture offers 250,000 customers worldwide cross-channel, round-the-clock access to more than 150,000 products and services. A highly efficient logistics system ensures that all requested services are smoothly integrated into customer production processes "just-in-time" or "just-in-sequence."

thyssenkrupp Aerospace is a subsidiary of the internationally operating thyssenkrupp Group. We focus on supplying aerospace raw materials and finished part logistics to the world's leading aerospace companies and their supply chains.

Operating from 35 service centers in 15 countries throughout the Americas, Europe and Asia Pacific enables us to aggregate demand across a single supply chain on behalf of the world's leading OEMs, while at the same time providing a truly responsive local service to meet the needs of individual subcontractors anywhere in the world.

Your responsibilities

Job Summary

The Data Engineer will be at the forefront of accelerating the modernization of the business unit's data ecosystem by building and supporting the Data Platform. Responsibilities include building and maintaining the data platform and the data products that ensure the seamless flow, availability, and reliability of data.

Job Description

Key Accountabilities:

  • Develop, maintain, and monitor data ingestion and enrichment ETL/ELT pipelines within the platform that load raw data and convert it into data products.

  • Partner with regional and/or global IT infrastructure teams to support and configure data platform storage and compute layers.

  • Maintain and build CI/CD pipeline code and automated test plans to ensure automated deployment between development and production environments.

  • Manage data platform-related ITSM ticketing processes (incident and change requests).

  • Collaborate with data team members, architects, data stewards, data owners, and business SMEs to develop data product business requirements for data cleansing and enrichment.

  • Implement data security and data governance policies within the data platform to protect sensitive information and maintain data quality.

  • Partner with cybersecurity and compliance teams to ensure compliance with applicable data security, data protection, and regulatory requirements.

  • Partner with global data teams to ensure that the local data platform & products maintain interoperability.

  • Continuously identify and drive opportunities to improve platform performance, reduce complexity and technical debt, and lower cloud compute and storage costs.

  • Maintain support documentation within the team repository.

  • Escalate data platform issues to the attention of management / appropriate partners.

  • Leverage agile frameworks and Azure DevOps to execute the team backlog.

  • Implement data platform metadata management standards and policies.
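For illustration only (this sketch is not part of the posting): the extract-transform-load work described in the responsibilities above can be outlined in plain Python. All names and data-quality rules here are hypothetical; in practice this role would use orchestration tools such as Airflow or dbt rather than hand-rolled functions.

```python
# Hypothetical ETL sketch: raw records -> cleansed, enriched data product.
# Field names ("material", "weight_kg") are illustrative only.

def extract(raw_rows):
    """Simulate ingesting raw records from a landing zone."""
    return list(raw_rows)

def transform(rows):
    """Cleanse and enrich: drop incomplete rows, normalize values."""
    cleaned = []
    for row in rows:
        if row.get("weight_kg") is None:
            continue  # example data-quality rule: skip incomplete records
        cleaned.append({
            "material": row["material"].strip().lower(),
            "weight_kg": float(row["weight_kg"]),
        })
    return cleaned

def load(rows, target):
    """Append the data product to its target store (here, a list)."""
    target.extend(rows)
    return len(rows)

raw = [
    {"material": " Steel ", "weight_kg": "120.5"},
    {"material": "Aluminum", "weight_kg": None},  # dropped by transform
]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

A real pipeline would add the monitoring and automated tests the accountabilities call for; this only shows the extract/transform/load separation itself.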

Qualifications:

  • Bachelor's degree, or equivalent work experience.

  • Minimum of 1 year of experience in the IT industry, including 1 year as a Data Engineer, or equivalent work experience.

  • At least 1 year of prior data engineering experience building ETL/ELT data pipelines using Airflow, Fivetran, Qlik, dbt, ADF, Snowpipe, Matillion, and/or similar tools to load and transform structured and semi-structured data.

  • At least 1 year of experience building and supporting cloud-native databases, warehouses, data lakes, lakehouses, and/or data repositories.

  • At least 1 year of experience writing Python and/or SQL to manipulate, transform, and load large, disparate data sets.

  • Advanced working SQL knowledge and prior experience loading data…
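Again purely as an illustration (not from the posting): the Python-plus-SQL skill the qualifications describe, flattening semi-structured JSON records and loading them into a relational table. The table and column names are hypothetical, and an in-memory SQLite database stands in for a real warehouse.

```python
import json
import sqlite3

# Hypothetical semi-structured source records (JSON with nesting).
records = [
    '{"order_id": 1, "customer": {"name": "Acme"}, "qty": 3}',
    '{"order_id": 2, "customer": {"name": "Globex"}, "qty": 5}',
]

# In-memory SQLite stands in for a cloud warehouse here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer TEXT, qty INTEGER)")

# Transform: parse JSON and pull nested fields into flat columns.
rows = [
    (r["order_id"], r["customer"]["name"], r["qty"])
    for r in map(json.loads, records)
]
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

# SQL manipulation: aggregate the loaded data.
total_qty = conn.execute("SELECT SUM(qty) FROM orders").fetchone()[0]
```

Parameterized `executemany` inserts are used instead of string formatting; that choice matters in real pipelines for both safety and bulk-load performance.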
