
OM Bank - Data Engineer & Stream Lead

Job in Cape Town, 7100, South Africa
Listing for: Old Mutual
Full Time position
Listed on 2026-02-22
Job specializations:
  • IT/Tech
    Data Engineer
  • Engineering
    Data Engineer
Job Description & How to Apply Below

Description

At OM Bank, we strive to attract great people who are passionate about coming together for a higher purpose: building something unique and aspirational, always aiming to be the best they can be. We are rooted in our purpose of inspiring and enabling our customers to grow and sustain their prosperity.

Develop data products and data warehouse solutions in on-premises and cloud environments using cloud-based services, platforms and technologies. We are looking for a dynamic and results-driven Lead Stream & Data Engineer with extensive experience in designing, developing and deploying high-performance, reusable streaming applications using Apache Kafka, Apache Flink and Java, and batch processing pipelines using Python, dbt and Apache Airflow.

Proven expertise in building scalable data pipelines and real-time processing systems that enhance operational efficiency and drive business insights. Strong background in microservices architecture, cloud technologies, and agile methodologies.

KEY RESULT AREAS

Operational Delivery

  • Assist with clarification of technical requirements and the implementation process with POs
  • Assist the data engineering team with product and architectural knowledge
  • Drive good technical architecture
  • Support and guide the tech team in resolving PO tickets
  • Support the implementation of the technical strategy
  • Ensure data is available to the business in a secure, actionable and reliable way
  • Raise the bar on data quality, data governance, reliability and engineering excellence
  • Prepare the data sets that will ultimately deliver valuable insights to the business
  • Enable discovery and exploration by analysts, BI, data science and business users
  • Identify opportunities for improvement and put energy behind turning them into action
Technical Skills

  • Programming Languages: Proficient in Java (Java SE 8/11), with a solid understanding of object-oriented programming principles, and Python 3
  • Streaming Technologies: Extensive experience with Apache Kafka, including the Kafka Streams API for real-time data processing, producer/consumer development and stream management, plus expertise in Apache Flink.
  • Desirable skills: Decodable, Kubernetes (K8s) and Confluent.
  • Frameworks: Deep knowledge of Spring Boot for building RESTful services and microservices architectures; adept at using Spring Cloud for distributed systems.
  • Database Management: Skilled in integrating various databases (e.g., NoSQL) with streaming applications to ensure efficient data storage and retrieval.
  • Cloud Platforms: Hands-on experience deploying applications on AWS, utilizing services such as EC2, S3, RDS, and Lambda for serverless architectures.
Technical Leadership

  • Participate in the engineering and other disciplines' communities of practice
  • Provide technical leadership and mentorship
  • Develop and monitor data engineering standards and principles
  • Lead technical delivery within teams and provide oversight of solutions
  • Share AWS knowledge and practical experience with the community
  • Challenge and contribute to the development of architectural principles and patterns
Delivery Management

  • Follow and participate in defined ways of work including, but not limited to, sprint planning, backlog grooming, retrospectives, demos and PI planning

ROLE REQUIREMENTS

  • Bachelor's Degree in Computer Science or a similar field such as Information Systems or Big Data
  • AWS Data Engineer Certification would be advantageous
  • Related technical certifications
  • At least 5–8 years' experience designing and developing data pipelines for data ingestion or transformation using AWS technologies
  • Experience developing solutions in the cloud
  • Experience developing data warehouses and data marts
  • Experience in Data Vault and dimensional modelling techniques
  • Experience working in a high-availability DataOps environment
  • Proficiency in AWS services related to data engineering, such as AWS Glue, Athena and EMR; strong programming skills in languages like Python and Java
  • Overseeing the implementation of CI/CD pipelines
  • GitHub
  • Design and implementation of scalable streaming architectures using technologies such as Apache Kafka, Apache Flink or AWS Kinesis to handle real-time data ingestion and processing
  • Orchestration with Apache Airflow
  • dbt Core
Skills

Action Planning, Cloud Computing, Cloud Infrastructure Management, Computer Network Security, Current State Assessment, Database Queries, Data Classification, Data Compilation, Data Compression, Data Encoding, Data Modeling, IT Architecture, IT Network Security, Test Case Management, Wireless Network Management

Competencies

Balances Stakeholders, Business Insight, Courage, Cultivates Innovation, Drives Results, Ensures Accountability, Manages Complexity, Optimizes Work Processes

Education

Bachelor's Degree (B)

Closing Date

23 February 2026, 23:59