
Senior Data Engineer

Remote / Online - Candidates ideally in Edinburgh, EH1, Scotland, UK
Listing for: BR-DGE
Full Time, Remote/Work from Home position
Listed on 2026-02-19
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Science Manager, Cloud Computing
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 GBP yearly
Job Description & How to Apply Below
Location: City of Edinburgh

Remote with the option to use the Edinburgh office - Permanent/Full Time

BR-DGE is an award-winning FinTech founded in Edinburgh. Our platform gives e-commerce and technology businesses the freedom and flexibility to redefine the way they handle payments.

Since our inception in 2018, we have been leading the way in the future of payment orchestration. Our products enable enterprise businesses to optimise their payment infrastructure and create frictionless digital payment experiences for their end users. Now with a global reach, our customer base is made up of incredible brands and household names from across the travel, retail, and gambling sectors, and it’s growing fast!

Our world-class partners include Visa and Worldpay, and we’re continuing to build a strong partner network with the biggest players in the payments industry. It’s an exciting time to be part of BR-DGE!

The journey so far has been incredible, but we’re just getting started and with ambitious growth plans, we’re now looking for more exceptional talent to join our team.

All BR-DGE Builders receive the following benefits:
  • Flexible and remote working
  • Remote working allowance
  • 33 days holiday including public holidays
  • Your birthday as a day off
  • Family healthcare
  • Life insurance
  • Employee assistance programme
  • A culture that champions rapid career progression
  • Investment in your learning and development
  • Regular team events & socials
Become a BR-DGE Builder

Why this role exists

  1. Data is becoming a critical part of BR-DGE’s next growth phase, powering internal analytics and customer-facing insights and monitoring.
  2. The data engineering space is largely greenfield. We need a production-grade data platform that can ingest, transform, validate, and monitor data from core systems and operational tooling.
  3. The robustness, scalability, and governance of our data architecture impact our ability to grow safely and meet regulatory expectations.
  4. This role owns the insights data platform, while partnering closely with Analytics, Product, and Engineering to ensure the platform delivers trusted datasets and timely signals.
What you will do
  1. Design and ship a tiered data platform that supports multiple latency needs, including low-latency pipelines for operational monitoring and customer-facing insights, plus batch pipelines for reporting and deeper analysis.
  2. Build and own end-to-end ingestion patterns across batch, micro-batch, and selected near-real-time use cases, with strong orchestration and dependency management.
  3. Implement schema evolution, data contracts, and approaches for late-arriving and corrected data so consumers can trust the outputs.
  4. Treat curated datasets as products that are well defined, documented, reliable, and safe to use for both internal and external consumers.
  5. Set platform standards for idempotent ingestion, deduplication, data quality, lineage, and observability.
  6. Ensure the platform meets regulated fintech and payments expectations for access control, security, and governance while staying cost-efficient as volumes grow.
  7. Partner with Product and Engineering on event and domain modelling, deciding what data gets emitted and what latency and granularity are needed for analytics and product goals.
  8. Support Data Science with reliable, feature-ready datasets and pragmatic collaboration, without owning reporting or business analysis.
  9. Evolve the current lightweight tooling into a more observable, structured platform, improving standards without creating unnecessary platform complexity.
  10. Automate data infrastructure and workflows using infrastructure-as-code and CI/CD practices.
What we are looking for

Must have:
  1. Proven experience designing, building, and operating production-grade data pipelines and platforms.
  2. Strong SQL, specifically PostgreSQL, plus at least one programming language such as Python or Java.
  3. Experience with data processing or orchestration tooling such as Spark, Airflow, or Kafka.
  4. Experience designing data models for analytics and reporting workloads.
  5. Practical knowledge of data quality, testing, observability, lineage, and governance patterns.
  6. Strong experience with AWS-based data platforms, with…
Position Requirements
10+ Years work experience