
Data Engineer (Fintech)

Job in Cape Town, 7100, South Africa
Listing for: Black Pen Recruitment
Full Time position
Listed on 2026-01-05
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description & How to Apply Below
Position: Data Engineer (Fintech)

Our client is a technology company solving payments problems for businesses. Their mission is to help businesses in Africa become profitable, envied, and loved. They provide a suite of products to help businesses accept payments online and offline, manage their operations, and grow their business. Our client is driven by a commitment to excellence, innovation, and customer satisfaction.

Data engineering with our client focuses on building and extending platforms for managing data. This involves data ingestion, processing, storage and egress. Data engineers are also responsible for creating and maintaining the infrastructure these data platforms run on.

Data engineers operate across a diverse tech stack. They are expected to be adaptable and drawn to learning new skills and technologies.

The role requires a proactive individual who can work both independently and collaboratively in a remote-first environment, with a strong software engineering background, solid experience building and maintaining data pipelines, expertise in Python, and experience with streaming technologies.

Job Type

Permanent / Full-time

Location

Cape Town, South Africa

Workplace

Hybrid

Department

Data Engineering

Reports To

Data Engineering Lead

Requirements
  • Educational Background: Bachelor's degree in Computer Science, Engineering or a related field.
  • Programming Skills: Proficiency in Python is essential. Knowledge of JavaScript and Scala is advantageous.
  • Data Engineering Experience: Minimum of 3 years of experience in data engineering roles, with a focus on building and managing data pipelines.
  • Software Engineering Experience: Minimum of 2 years of experience in software and/or application development roles (can be concurrent with data engineering experience).
  • Streaming Technologies: Hands-on experience with Kafka, Debezium, and Kafka Connect.
  • Data Pipeline Orchestration Tools: Proficiency in a data pipeline or workflow orchestration tool such as Apache Airflow (preferred), Databricks, Dagster or Airbyte.
  • Database Expertise: Strong understanding and hands-on experience working with various database technologies, including MySQL, PostgreSQL, MongoDB and Redshift (BigQuery and SingleStore advantageous).
  • Infrastructure Tools: Experience with Terraform, Kubernetes, and Helm for infrastructure management.
  • Cloud Computing: Solid knowledge of cloud computing concepts, with experience in AWS services being advantageous.
  • SQL Proficiency: Ability to write complex SQL queries across different dialects.
  • Testing Practices: Familiarity with unit and integration testing methodologies.
  • CI/CD Pipelines: Experience in setting up and maintaining CI/CD pipelines.
  • Data Science Tools: Exposure to analytical systems and basic data science tooling.
  • Familiarity with basic machine learning and analytical modelling concepts is advantageous.
  • BI Reporting Platforms: Exposure to self-service reporting tools like Tableau, Looker and DOMO.
  • Communication: Good verbal and written communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
  • Team Collaboration: Demonstrated ability to work collaboratively within a team and across departments.
  • Adaptability: Comfortable working in a fast-paced environment with changing priorities, technologies and tooling. Life-long learners will do well here.
  • Problem-Solving: Strong analytical and problem-solving skills.
  • Start-up Experience: A minimum of 1 year of start-up experience is required.
Responsibilities
  • Data Pipeline Development: Design, develop, and maintain robust data pipelines using ETL and ELT methodologies to process and integrate data from various sources into a data lake, a central data warehouse, operational data stores, analytical data marts and various application interfaces.
  • Streaming Data Processing: Implement and manage real-time data streaming solutions utilising Kafka, Debezium and Kafka Connect.
  • Workflow Orchestration: Build, schedule and maintain custom workflows using Apache Airflow to ensure timely and accurate data processing and delivery.
  • Database Management: Work with a variety of database technologies, including relational databases (MySQL, PostgreSQL), NoSQL databases (MongoDB) and analytical/big data…