Data Systems Engineer & Data Architect
Listed on 2025-12-01
IT/Tech
Data Engineer, Cloud Computing, Data Warehousing
Discover a career at Depot Connect International (DCI), a global leader in the Tank/ISO Tank Container Services and Tank Trailer Parts industry. We're more than just a service provider; we're a unified team combining the expertise of industry leaders Quala, Boasso Global, and Polar Service Centers (PSC). Headquartered in Tampa, Florida, with over 160 locations worldwide, our team of over 3,500 employees excels in offering a multitude of mission-critical services.
This role combines the strategic oversight of a Data Architect with the practical, hands-on implementation skills of a Data Engineer, focusing heavily on Master Data Management (MDM) and Application-to-Application (A2A) integrations. The ideal candidate will be responsible for designing, building, and optimizing the data infrastructure and pipelines that ensure data quality, consistency, and efficient exchange across the enterprise.
As the Data Architect, you'll be the visionary for the data landscape, particularly in MDM and integration domains.
- Define and govern the Master Data Management (MDM) strategy, roadmap, and architecture across key domains (e.g., Customer, Product, Vendor).
- Design the conceptual, logical, and physical data models for MDM solutions, ensuring data lineage and governance are clearly defined.
- Select and evaluate appropriate MDM technologies and tools.
- Design scalable and robust Application-to-Application (A2A) integration architectures using various patterns (e.g., API, messaging, ETL/ELT).
- Establish technical standards and best practices for data integration, security, and performance.
- Data Governance & Quality:
- Collaborate with Data Governance teams to define data standards, policies, and quality rules.
- Design systems for continuous data quality monitoring and remediation.
- Technology Leadership:
- Provide technical leadership and mentorship to data engineering teams.
- Drive adoption of modern data architecture principles (e.g., Data Mesh, Data Fabric).
As the Data Engineer, you'll be responsible for the hands‑on building and operationalization of data solutions.
- Design, build, and maintain robust, scalable, and efficient ETL/ELT pipelines to ingest, transform, and load data from various source systems into the MDM hub and analytical platforms.
- Develop and optimize data flows for real‑time and batch A2A integrations.
- MDM Implementation:
- Hands‑on configuration, development, and testing of MDM platform components (e.g., matching rules, survivorship logic, data quality checks).
- Develop custom connectors and services to synchronize master data across enterprise systems.
- Implement data solutions using cloud-native services on AWS, Azure, or GCP, including Kafka/Kinesis, Snowflake/Redshift/BigQuery, Spark/Databricks, and managed database services.
- Write high-quality, maintainable code in languages like Python or Scala.
- Automation & Monitoring:
- Implement CI/CD practices for data pipelines and infrastructure‑as‑code (IaC).
- Set up comprehensive monitoring and alerting for data quality, pipeline performance, and integration health.
Required Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related quantitative field.
- Experience: 5+ years of combined experience in Data Architecture and hands‑on Data Engineering roles.
- MDM Expertise: Deep, hands‑on experience with at least one major MDM platform (e.g., Informatica MDM, Boomi, Reltio, Semarchy, Talend MDM).
- Integration Proficiency: Strong background in designing and implementing A2A integrations using technologies like REST APIs, SOAP, Message Queues (e.g., Kafka, RabbitMQ), and Enterprise Service Buses (ESB).
- Technical Stack:
- Expert in Python or Scala.
- Proficiency with SQL and relational/NoSQL databases.
- Hands-on experience with cloud data platforms (AWS, Azure, or GCP).
- Experience with ETL/ELT tools (e.g., Informatica PowerCenter, Talend, Fivetran, dbt).
- Soft Skills: Excellent analytical, problem‑solving, and communication skills, with the ability to articulate complex technical concepts to non‑technical stakeholders.
Benefits
- Medical, Dental and Vision Insurance
- 401(k) with match
- Paid Time Off
- 10 Paid Holidays
- Short- and Long-Term Disability Insurance
- Tuition Reimbursement
Applicants must be authorized to work for ANY employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time.