Infrastructure and Pipeline Developer
Listed on 2026-02-14
IT/Tech
Data Engineer, Database Administrator, Data Analyst
Responsibilities
As an Infrastructure and Pipeline Developer, you bring experience and/or capability in data integrity and data security management development.
Pipeline Architecture Development:
Design, construct and maintain scalable data pipelines to extract usable data from disparate sources
ETL/ELT Data Development:
Carry out complex preprocessing of both structured and unstructured data, ensuring it is ready for ingestion and further analysis
Data Management:
Optimise and maintain PostgreSQL databases, utilising SQLAlchemy and SQL for efficient data manipulation
Data Integrity & Validation:
Develop automated scripts to validate the integrity of data and maintain high accuracy in data-related tasks (see the illustrative sketch after this list)
Infrastructure Enhancement:
Improve data collection procedures and storage systems to include all relevant metadata for analytical systems
API Integration:
Build and manage API integrations to streamline data flow between third-party tools (e.g. OpenAI SDK, Gemini) and internal databases
Performance Tuning:
Apply logical and methodical approaches to identify and resolve bottlenecks in data processing
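Purely for illustration (not part of the job description): a minimal Python sketch of the kind of automated data-integrity check described under Data Integrity & Validation, using pandas and SQLAlchemy against PostgreSQL. The table name `measurements`, its columns, and the `DATABASE_URL` environment variable are hypothetical placeholders, not details from this posting.

```python
# Illustrative sketch only: an automated integrity check for a PostgreSQL table
# using pandas and SQLAlchemy. The table/column names and DATABASE_URL are
# hypothetical placeholders.
import os

import pandas as pd
from sqlalchemy import create_engine


def run_integrity_checks(table: str = "measurements") -> dict:
    # Connection string taken from the environment (an assumption for this sketch).
    engine = create_engine(os.environ["DATABASE_URL"])

    # Load the table into a DataFrame for validation.
    df = pd.read_sql_table(table, engine)

    report = {
        "rows": len(df),
        # Duplicate primary keys usually indicate a broken ingestion step.
        "duplicate_ids": int(df["id"].duplicated().sum()),
        # Missing values in columns required downstream.
        "null_values": int(df[["value", "recorded_at"]].isna().sum().sum()),
        # Timestamps in the future point to a clock or parsing error.
        "future_timestamps": int(
            (pd.to_datetime(df["recorded_at"], utc=True) > pd.Timestamp.now(tz="UTC")).sum()
        ),
    }
    report["passed"] = all(v == 0 for key, v in report.items() if key != "rows")
    return report


if __name__ == "__main__":
    print(run_integrity_checks())
```

In practice such a script would be scheduled (for example via cron or an orchestrator) and its report logged or alerted on; the checks shown are only examples of the kind of validation the role involves.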
Qualifications
- Degree in Mathematics, Physics, and/or Computer Science
- 0.5 to 2 years' experience in Data Engineering, Backend Development, or research-heavy data environments
- Microsoft PowerBI
- Expert SQL
- Python (pandas, NumPy)
- Microsoft Excel
- Django/Flask for data services
- PostgreSQL
- SQLAlchemy
- SQL
- OpenAI SDK
- Gemini
- Cloud-based environments such as Google Colab and/or Jupyter
- High attention to detail
- Ability to plan activities to meet strict deadlines
- Fluent English