
Architect, Data Engineer

Job in Cape Town, 7100, South Africa
Listing for: RELX
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description & How to Apply Below

Would you like to shape the future of data platforms and drive impactful software innovations?
Do you thrive in collaborative, customer-focused environments where your ideas help guide strategic decisions?

About The Business

At LexisNexis Intellectual Property (LNIP), we believe that whenever a person works on a patent and understands the future trajectory of a specific technology, that person has the potential to fundamentally change how society operates. We are proud to directly support and serve these innovators in their endeavours to better humankind. We enable innovators to accomplish more by helping them make informed decisions, be more productive, comply with regulations, and ultimately achieve superior results.

By harnessing the latest advances in machine learning combined with expert analysis, LexisNexis Intellectual Property is disrupting how actionable insight is extracted from patent data. Information can now be accessed with an efficiency, accuracy, and speed not possible with traditional methods. Our overall success is measured by how well we deliver these results.

About Our Team

LexisNexis Intellectual Property, which serves customers in more than 150 countries with 11,300 employees worldwide, is part of RELX, a global provider of information-based analytics and decision tools for professional and business customers.

About The Role

At LexisNexis Intellectual Property (LNIP), our mission is to bring clarity to innovation by delivering better outcomes to the innovation community. We help innovators make more informed decisions, be more productive, and ultimately achieve superior results. By helping our customers achieve their goals, we support the development of new technologies and processes that ultimately advance humanity.

We are looking for a Data Architect with proven experience designing and implementing data platforms using Databricks. In this mid-level role, you will play a critical part in architecting scalable data solutions that drive analytics, data science, and business intelligence efforts. You will work cross-functionally with engineering, analytics, and infrastructure teams to transform raw data into valuable enterprise assets.

Key Responsibilities:

  • Designing and implementing cloud-native data architectures using Databricks and technologies such as Delta Lake, Spark, and MLflow.

  • Developing and maintaining robust data pipelines, including batch and streaming workloads, to support data ingestion, processing, and consumption.

  • Collaborating with business stakeholders and analytics teams to define data requirements, data models, and data integration strategies.

  • Ensuring data architecture solutions are secure, scalable, and high performing, adhering to enterprise standards and best practices.

  • Leading technical efforts in data quality, metadata management, data cataloguing, and governance (including Unity Catalog where applicable).

  • Providing technical guidance to junior engineers and analysts in the adoption of modern data architecture patterns.

  • Evaluating and recommending emerging tools and frameworks within the Databricks ecosystem and broader data engineering space.

  • Applying a solid understanding of analytics engines and columnar databases to deliver performance-optimised data solutions.

  • Familiarity with full-text search platforms such as Elasticsearch or Solr is highly desirable and a strong advantage.

Requirements:

  • Bachelor’s or master’s degree in Computer Science, Information Systems, Engineering, or a related field.

  • Hands-on experience in data architecture, data engineering, or a similar role.

  • Deep expertise in Databricks, including Spark (PySpark/Scala), Delta Lake, and orchestration within Databricks workflows.

  • Strong understanding of cloud infrastructure and data services on at least one major cloud platform (Azure preferred, but AWS or GCP also accepted).

  • Proficiency in data modelling, SQL, data warehousing, and ETL frameworks.

  • Hands-on experience with CI/CD pipelines, version control (Git), and DevOps practices.

  • Solid understanding of data governance, privacy, and security best practices.

Nice to Have:

  • Databricks certifications (e.g., Data…
