
Senior Data Engineer

Job in Glasgow, Glasgow City Area, G1, Scotland, UK
Listing for: Data Freelance Hub
Full Time position
Listed on 2025-12-29
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description

This is a Senior Data Engineer role on a 24-month fixed-term contract; the pay rate is not disclosed. The position is remote and requires 5+ years of data engineering experience, advanced expertise in Databricks, and the Databricks Certified Data Engineer Professional certification.

Location:

Glasgow, Scotland, United Kingdom (Remote)

Overview

Do you want to work to make Power for Good? We're the world's largest independent renewable energy company. We're driven by a simple yet powerful vision: to create a future where everyone has access to affordable, zero carbon energy. We know that achieving our ambitions would be impossible without our people. Because we're tackling some of the world's toughest problems, we need the very best people to help us.

Our people are our most important asset, which is why we continually invest in them. RES is a family with a diverse workforce, and we are dedicated to the personal and professional growth of our people, no matter what stage of their career they're at. We can promise you rewarding work that makes a real impact, the chance to learn from inspiring colleagues across a growing global network, and opportunities to grow personally and professionally.

Our competitive package offers rewards and benefits including pension schemes, flexible working, and top‑down emphasis on better work‑life balance. We also offer private healthcare, discounted green travel, 25 days holiday with options to buy/sell days, enhanced family leave and four volunteering days per year so you can make a difference somewhere else.

Position

We are looking for a Senior Data Engineer with advanced expertise in Databricks to lead the development of scalable data solutions within our asset performance management software, part of our Digital Solutions business.

This role involves architecting complex data pipelines, mentoring junior engineers, and driving best practices in data engineering and cloud analytics. You will play a key role in shaping our data strategy, which is the backbone of our software, and in enabling high-impact analytics and machine-learning initiatives.

Accountabilities
  • Design and implement scalable, high-performance data pipelines (a short illustrative sketch follows this list).
  • Work with the lead cloud architect on the design of data lakehouse solutions leveraging Delta Lake and Unity Catalog.
  • Collaborate with cross‑functional teams to define data requirements, governance standards, and integration strategies.
  • Champion data quality, lineage, and observability through automated testing, monitoring, and documentation.
  • Mentor and guide junior data engineers, fostering a culture of technical excellence and continuous learning.
  • Drive the adoption of CI/CD and DevOps practices for data engineering workflows.
  • Stay ahead of emerging technologies and Databricks platform updates, evaluating their relevance and impact.
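
To make the pipeline and lakehouse bullets above concrete, here is a minimal sketch (Python/PySpark) of the kind of step this work involves. It is illustrative only: the catalog, schema, and table names (main.asset_perf.*) and the column names are hypothetical placeholders, not RES's actual namespace.

    # Illustrative bronze-to-silver Delta Lake step on Databricks (PySpark).
    # All table and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

    # Read raw asset telemetry from a hypothetical Unity Catalog bronze table
    # (three-level namespace: catalog.schema.table).
    bronze = spark.read.table("main.asset_perf.turbine_readings_bronze")

    # Clean and conform: drop malformed rows and derive a partition column.
    silver = (
        bronze
        .where(F.col("power_kw").isNotNull() & (F.col("power_kw") >= 0))
        .withColumn("reading_date", F.to_date("reading_ts"))
    )

    # Persist as a governed Delta table; date partitioning keeps large scans pruned.
    (silver.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("reading_date")
        .saveAsTable("main.asset_perf.turbine_readings_silver"))
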
Knowledge
  • Deep understanding of distributed data processing, data lakehouse architecture, and cloud‑native data platforms.
  • Optimization of data workflows for performance, reliability, and cost-efficiency on cloud platforms (particularly Azure; experience with AWS and/or GCP would be beneficial).
  • Strong knowledge of data modelling, warehousing, and governance principles.
  • Knowledge of data privacy and compliance standards (e.g., GDPR, HIPAA).
  • Understanding of OLTP and OLAP workloads and the scenarios in which to deploy each.
  • Understanding of incremental processing patterns (illustrated in the sketch after this list).
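
As one concrete illustration of the incremental-processing bullet, a common pattern on Databricks is a Delta Lake MERGE upsert, where only changed records are applied to the target table instead of reloading it in full. The table names below are hypothetical.

    # Incremental upsert with Delta Lake MERGE: apply only today's changed
    # records to the target table. Table names are hypothetical.
    from pyspark.sql import SparkSession
    from delta.tables import DeltaTable

    spark = SparkSession.builder.getOrCreate()

    updates = spark.read.table("main.asset_perf.readings_changed_today")
    target = DeltaTable.forName(spark, "main.asset_perf.turbine_readings_silver")

    (target.alias("t")
        .merge(
            updates.alias("s"),
            "t.turbine_id = s.turbine_id AND t.reading_ts = s.reading_ts",
        )
        .whenMatchedUpdateAll()     # update rows that already exist
        .whenNotMatchedInsertAll()  # insert genuinely new rows
        .execute())

Because only the delta is touched, this pattern stays cheap as the target table grows, which is what makes it preferable to full reloads at scale.
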
Skills
  • Strong proficiency in Python and SQL. Experience working with Scala would be beneficial.
  • Proven ability to design and optimize large‑scale ETL/ELT pipelines.
  • Building and managing pipeline orchestrations (see the toy sketch after this list).
  • Excellent oral and written communication, both within the team and with our stakeholders.
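
For the orchestration bullet, the sketch below is a deliberately tiny, tool-agnostic illustration of the concept: tasks plus declared dependencies, executed in order. In practice this role would use Databricks Workflows, DLT, or dbt rather than hand-rolled code.

    # Toy orchestration: run tasks in dependency order using the standard library.
    from graphlib import TopologicalSorter

    def ingest():    print("ingest raw files")
    def transform(): print("build silver tables")
    def publish():   print("refresh gold marts")

    # Each task maps to the set of tasks it depends on.
    dag = {
        ingest: set(),
        transform: {ingest},
        publish: {transform},
    }

    # static_order() yields tasks with all dependencies satisfied first.
    for task in TopologicalSorter(dag).static_order():
        task()
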
Experience
  • 5+ years of experience in data engineering, with at least 2 years working extensively with Databricks and orchestrated pipelines such as dbt, Delta Live Tables (DLT), or Databricks Workflows jobs.
  • Experience with Delta Lake and Unity Catalog in production environments.
  • Experience with CI/CD tools and version control systems (e.g., Git, GitHub Actions, Azure DevOps, Databricks Asset Bundles).
  • Experience with both batch and streaming (real-time) data processing (see the streaming sketch after this list).
  • Experience working on machine learning workflows and…
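
To ground the batch-versus-streaming bullet, here is a minimal Structured Streaming sketch. The table names and checkpoint path are hypothetical; the availableNow trigger shows how the same streaming code can also be run in a batch-like, incremental fashion.

    # Minimal Structured Streaming read/write of Delta tables (hypothetical names).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    stream = spark.readStream.table("main.asset_perf.turbine_readings_bronze")

    (stream.writeStream
        .option("checkpointLocation", "/tmp/checkpoints/readings")  # hypothetical path
        .trigger(availableNow=True)  # drain available data, then stop (batch-like run)
        .toTable("main.asset_perf.turbine_readings_silver_stream"))
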
Position Requirements
10+ Years work experience