Senior Data Integration Engineer
Listed on 2026-02-14
IT/Tech
Data Engineer, Cloud Computing
An exciting career awaits you
At MPC, we’re committed to being a great place to work – one that welcomes new ideas, encourages diverse perspectives, develops our people, and fosters a collaborative team environment.
Position Summary: The Senior Data Integration Engineer is responsible for integration development using the latest S/4HANA, BTP, and cloud-related tools and technologies, and provides SAP integration architecture across the organization, championing innovation, continuous improvement, and automation. This role focuses on leading integration development and modernization, and on ensuring that customized solutions are resilient and follow MPC standards and best practices. The ideal candidate possesses a strong background in application security best practices.
Additionally, excellent leadership and strong communication skills are essential.
- Leads integration projects across different departments or systems.
- Collaborates closely with cloud engineers for cloud-based integrations.
- Collaborates with IT security to ensure data protection during integrations.
- Handles complex API integrations with third-party systems.
- Undertakes the automation of routine and repetitive data tasks.
- Advocates for and upholds data quality standards in integrations.
- Develops advanced data transformation and cleansing strategies and mentors less experienced team members in best practices.
- Participates in vendor selection and contributes to strategic decisions regarding integration tools and platforms.
- Ensures high availability and fault tolerance in integration processes.
- Develop and maintain integration solutions using SAP PO, SAP Integration Suite, and Business Objects Data Services.
- Design and implement data integration workflows and ETL processes to ensure seamless data transformation and movement.
- Optimize data warehousing, data lakes, and data pipelines to support robust data storage and retrieval.
- Leverage cloud-based services (AWS, Azure, GCP) and integration technologies (e.g., REST APIs, SOAP) for efficient data integration.
- Utilize strong knowledge of relational and non-relational databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra) to support diverse data needs.
- Work with various data formats such as JSON, XML, Avro, and Parquet to ensure compatibility and efficiency in data handling.
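As a concrete illustration of the ETL workflow and data-format handling the responsibilities above describe, the sketch below parses hypothetical JSON source records, transforms them, and loads them as CSV. All field names and data here are illustrative assumptions, not actual MPC systems or schemas.

```python
import io
import json

# Hypothetical records as they might arrive from a JSON source system.
RAW = '[{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "3.25"}]'

def extract(source: str) -> list[dict]:
    """Extract: parse JSON records from the source payload."""
    return json.loads(source)

def transform(records: list[dict]) -> list[dict]:
    """Transform: cast string amounts to numeric values."""
    return [{"id": r["id"], "amount": float(r["amount"])} for r in records]

def load(records: list[dict]) -> str:
    """Load: serialize records to CSV for a hypothetical target system."""
    buf = io.StringIO()
    buf.write("id,amount\n")
    for r in records:
        buf.write(f'{r["id"]},{r["amount"]}\n')
    return buf.getvalue()

result = load(transform(extract(RAW)))
```

In a production pipeline each stage would typically be a separate, monitored step (for example, an SAP Integration Suite iFlow or a scheduled job), but the extract/transform/load structure is the same.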
- Bachelor’s degree in information technology or a related field is required.
- 5+ years of relevant experience required.
- Experience with SAP Process Orchestration (PO) and SAP Integration Suite is required.
- Experience with API or DevSecOps CI/CD pipeline integration is preferred.
- Experience migrating interfaces from SAP PO to Integration Suite and defining/executing an SAP Clean Core Strategy for Integrations is preferred.
- Experience with ETL processes and data transformation with SAP Datasphere is preferred.
- API Development - Proficiency in integrating machine learning models into data pipelines and data platforms, including feature engineering, model deployment, and monitoring.
- Automations - Automations refer to the systematic use of software tools, scripts, and processes to streamline and optimize the management, processing, and analysis of data. These automations aim to reduce manual intervention, minimize errors, and increase efficiency in handling various data-related tasks such as data ingestion, transformation, cleansing, integration, storage, and reporting. Automations in data engineering empower organizations to handle large volumes of data efficiently, reduce operational overhead, and accelerate the delivery of insights and analytics to stakeholders.
- Containerization - Containerization is a form of operating system virtualization, through which applications are run in isolated user spaces called containers, all using the same shared operating system (OS). Container orchestration automatically provisions, deploys, scales, and manages containerized applications without requiring attention to the underlying infrastructure.
- Data Integration - Proficiency in integrating data from various sources, including structured and unstructured data, using technologies such as ETL (Extract, Transform, Load) processes, data pipelines, and data ingestion frameworks.
- Data Pipelines - Data pipelines are a set of processes that enable the flow of data from one or multiple sources to a destination, often involving tasks such as extraction, transformation, and loading (ETL). These pipelines are designed to efficiently and reliably move and process data, ensuring its quality and accessibility for various analytical and operational purposes.
- Data Privacy - Ability to understand and implement practices that ensure the protection and confidential handling of personal and sensitive information. This includes knowledge of relevant laws and regulations (such as GDPR or HIPAA), the ability to design and enforce policies that safeguard data, and the skills to manage data access rights and consent protocols.
- Data Security - Knowledge of data privacy…
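The "Automations" skill above describes scripting routine cleansing tasks so they need no manual intervention. A minimal sketch of that idea follows; the field names (such as "email") and the cleansing rules are illustrative assumptions, not actual MPC data standards.

```python
def cleanse(records: list[dict]) -> list[dict]:
    """Automate a routine cleansing pass: drop records missing a
    required field and strip stray whitespace from string values."""
    cleaned = []
    for r in records:
        if not r.get("email"):  # enforce a hypothetical required field
            continue
        cleaned.append({k: v.strip() if isinstance(v, str) else v
                        for k, v in r.items()})
    return cleaned

# Hypothetical input: one valid record, one missing its required field.
rows = [{"email": " a@example.com ", "name": "Ann "},
        {"email": "", "name": "Bob"}]
clean = cleanse(rows)
```

Wrapping checks like this in a scheduled script or pipeline stage is what turns a repetitive manual task into the kind of automation the role calls for.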