
Data Engineer

Job in Johannesburg, 2000, South Africa
Listing for: Nedbank
Full Time position
Listed on 2026-03-01
Job specializations:
  • IT/Tech
    Data Engineer, Big Data, Data Analyst, Data Warehousing
Job Description & How to Apply Below

Job Family

Information Technology

Career Stream

Data

Leadership Pipeline

Manager of Self Professional

Job Purpose

The purpose of the Data Engineer role is to leverage data expertise and data-related technologies, in line with the Nedbank Data Architecture Roadmap, to advance technical thought leadership for the enterprise, deliver fit-for-purpose data products and support data initiatives. In addition, Data Engineers enhance the bank's data infrastructure to enable advanced analytics, machine learning and artificial intelligence by providing clean, usable data to stakeholders.

They also create data pipelines (ingestion, provisioning, streaming, self-service and API) and big data solutions that support the Bank's strategy to become a data-driven organisation.

Job Responsibilities

  • Responsible for SAS system administration, maintenance, improvement and application support on SAS platforms.
  • Responsible for the maintenance, improvement, cleaning, and manipulation of data in the bank's operational and analytics databases.
  • Data Infrastructure:
    Build and manage scalable, optimised, supported, tested, secure and reliable data infrastructure, e.g. databases (DB2, PostgreSQL, MSSQL, HBase, NoSQL, etc.), data lake storage (Azure Data Lake Gen2), cloud-based solutions (SAS, Azure Databricks, Azure Data Factory, HDInsight) and data platforms (SAS, Azure Cloud). Ensure data security and privacy in collaboration with Information Security, the CISO and Data Governance.
  • Data Pipeline Build (Ingestion, Provisioning, Streaming and API):
    Build and maintain data pipelines to:
    • Create data pipelines for data integration (Data Ingestion, Data Provisioning and Data Streaming), utilising both on-premise and cloud data engineering toolsets
    • Efficiently extract data (Data Acquisition) from Golden Sources, Trusted Sources and Write-backs, integrating data from multiple sources, formats and structures
    • Provide data to the respective Lines of Business Marts, Regulatory Marts and Compliance Marts through self-service data virtualisation
    • Provide data to applications and other Nedbank data consumers
    • Handle big data technologies and streaming (Kafka)
    • Drive utilisation of data integration tools and cloud data integration tools (Azure Data Factory and Azure Databricks)
  • Data Modelling and Schema Build:
    In collaboration with Data Modellers, create data models and database schemas on the Data Reservoir, Data Lake, Atomic Data Warehouse and Enterprise Data Marts.
  • Nedbank Data Warehouse Automation:
    Automate, monitor and improve the performance of data pipelines.
  • Collaboration:
    Collaborate with Data Analysts, Software Engineers, Data Modellers, Data Scientists, Scrum Masters and Data Warehouse teams as part of a squad to contribute to detailed data architecture designs, take ownership of Epics end-to-end and ensure that data solutions deliver business value.
  • Data Quality and Data Governance:
    Ensure that reasonable data quality checks are implemented in the data pipelines to maintain a high level of data accuracy, consistency and security.
  • Performance and Optimisation:
    Ensure the performance of the Nedbank data warehouse, integration patterns, batch and real-time jobs, streaming and APIs.
  • API Development:
    Build APIs that enable the data-driven organisation, collaborating with Software Engineers to ensure the data warehouse is optimised for API access.
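The pipeline responsibilities above (ingest, apply data quality checks, provision clean data downstream) can be sketched in miniature. This is purely illustrative and not Nedbank code: the record fields, check rules and `provision` target are hypothetical, standing in for the bank's actual sources and marts.

```python
# Minimal ingest -> validate -> provision sketch (hypothetical example).
from dataclasses import dataclass


@dataclass
class Record:
    account_id: str
    balance: float


def ingest(raw_rows):
    """Parse raw source rows into typed records (Data Ingestion)."""
    return [Record(account_id=r["account_id"], balance=float(r["balance"]))
            for r in raw_rows]


def validate(records):
    """Apply basic data quality checks: non-empty key, non-negative value."""
    clean, rejected = [], []
    for rec in records:
        if rec.account_id and rec.balance >= 0:
            clean.append(rec)
        else:
            rejected.append(rec)
    return clean, rejected


def provision(records):
    """Provision clean data to a downstream consumer (here: a simple dict)."""
    return {rec.account_id: rec.balance for rec in records}


raw = [
    {"account_id": "A1", "balance": "100.50"},
    {"account_id": "", "balance": "20.00"},    # fails quality check: no key
    {"account_id": "A2", "balance": "-5.00"},  # fails quality check: negative
]
clean, rejected = validate(ingest(raw))
mart = provision(clean)  # only the clean record reaches the "mart"
```

In a production pipeline the same shape would apply, with the in-memory list replaced by a streaming source (e.g. a Kafka topic) or an orchestrated batch (e.g. an Azure Data Factory activity), and rejected records routed to a quarantine store for remediation.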

Essential Qualifications - NQF Level

  • Matric / Grade 12 / National Senior Certificate
  • Advanced Diplomas/National 1st Degrees

Preferred Qualification

  • Field of Study: BSc, BEng, BCom

Preferred Certifications

  • SAS certification
  • Exposure to Agile Methodologies
  • Exposure to Cloud technologies and DevOps

Minimum Experience Level

  • Total number of years of experience: 5+ years
  • Experienced at working independently within a squad, with the demonstrated knowledge and skills to deliver data outcomes without supervision.
  • Experience designing, building, and maintaining data warehouses and data lakes.
  • Experience with big data technologies such as Hadoop, Spark, and Hive.
  • Experience with programming languages such as Python, Java, and SQL.
  • Experience with relational databases and NoSQL databases.
  • Experience with cloud computing platforms such as AWS, Azure, and GCP.
  • Experience with data visualization tools.
  • Results-driven, analytical and creative thinker, with a demonstrated ability for innovative problem solving.

Technical / Professional Knowledge

  • SAS
  • Cloud Data Engineering (Azure, AWS, Google)
  • Data Warehousing
  • Databases (PostgreSQL, MS SQL, IBM DB2, HBase, MongoDB)
  • Programming (Python, Java, SQL)
  • Data Analysis and Data Modelling
  • Data Pipelines and ETL tools (SAS ETL)
  • Agile Delivery
  • Problem solving skills

Behavioural Competencies

  • Decision Making
  • Influencing
  • Communication
  • Innovation
  • Technical/Professional Knowledge and Skills
  • Building Partnerships
  • Continuous Learning
