
Data Engineer

Job in City Of London, Central London, Greater London, England, UK
Listing for: Agio Ratings
Part Time position
Listed on 2025-12-30
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing
Salary/Wage Range or Industry Benchmark: 70,000 GBP per year
Job Description & How to Apply Below

Agio Ratings is a VC-backed risk analysis firm focused on the digital asset market. Founded in 2022, our all-star team of PhDs developed advanced models that capture the market's unique risk factors. We were early in flagging FTX’s high risk and in recognizing Bybit’s resilience following a $1.5B hack. Today, our ratings power risk teams at top trading firms, insurance companies, and banks worldwide.

With market and regulatory momentum driving demand, Agio Ratings is entering a new phase of growth. We’re seeking an energetic, creative, and experienced Data Engineer to scale mission‑critical capabilities and help us win the market.

Responsibilities
  • Design and implement scalable ETL pipelines using Apache Spark or Apache Flink.
  • Build real‑time streaming data pipelines to ingest blockchain transaction data.
  • Create data validation and QA frameworks to ensure pipeline reliability.
  • Design and optimise data schemas for high‑volume analytical databases.
  • Integrate with node APIs (Bitcoin Core, Geth, etc.) and 3rd‑party data vendors.
  • Implement horizontal scaling strategies for compute‑intensive data processing algorithms.
  • Design fault‑tolerant systems with proper error handling and recovery mechanisms.
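As an illustrative sketch of the validation and QA work the responsibilities describe (field names and rules here are hypothetical, not Agio Ratings' actual schema), a minimal record-level check that splits a batch into accepted and rejected records might look like:

```python
# Hypothetical record shape for an ingested blockchain transaction;
# these field names are illustrative, not the company's real schema.
REQUIRED_FIELDS = {"tx_hash", "block_height", "value", "timestamp"}


def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors (empty list = record is valid)."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "value" in record and not isinstance(record["value"], (int, float)):
        errors.append("value must be numeric")
    if "block_height" in record and (
        not isinstance(record["block_height"], int) or record["block_height"] < 0
    ):
        errors.append("block_height must be a non-negative integer")
    return errors


def partition_batch(batch: list[dict]):
    """Split a batch into (valid records, rejected records with reasons)."""
    valid, rejected = [], []
    for rec in batch:
        errs = validate_record(rec)
        if errs:
            rejected.append((rec, errs))  # keep the reasons for QA reporting
        else:
            valid.append(rec)
    return valid, rejected
```

In a production pipeline this kind of check would typically run as a stage inside the ETL job, with rejected records routed to a dead-letter store for inspection.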
Must‑have Requirements

This role is only open to candidates based in or willing to commute to London, UK at least 3 days a week.

  • Minimum 3 years’ experience in distributed computing:
    Apache Spark (PySpark/Scala), Apache Flink or equivalent.
  • Minimum 3 years’ experience in data warehousing:
    ClickHouse, Snowflake, or similar databases.
  • Minimum 3 years’ experience in data lakes: AWS S3/Glue, Azure Data Lake, GCP BigQuery.
  • Proficiency in programming:
    Python, Scala, or Java for data pipeline development.
  • Experience with streaming platforms:
    Kafka, Pulsar, or similar.
  • Experience with cloud platforms: AWS, Azure, or GCP data services.
Nice‑to‑have
  • Knowledge of blockchain data formats and parsing techniques.
  • Experience working with blockchain node APIs and RPC interfaces.
  • Knowledge of data modelling for graph‑based analysis.
  • Understanding of data compression and storage optimisation techniques.
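To illustrate the node RPC interfaces mentioned above: Bitcoin Core exposes a JSON-RPC API with methods such as `getblockhash`. The helper below, an illustrative sketch rather than a client library, builds the request envelope that would be POSTed to the node:

```python
import json


def jsonrpc_payload(method: str, params: list, request_id: int = 1) -> str:
    # Bitcoin Core speaks a JSON-RPC 1.0-style protocol; "getblockhash" is a
    # real RPC method, but this helper is a sketch, not a full client.
    return json.dumps({
        "jsonrpc": "1.0",
        "id": request_id,
        "method": method,
        "params": params,
    })


# Request the hash of block 0 (the genesis block).
payload = jsonrpc_payload("getblockhash", [0])
```

An actual integration would send this payload over authenticated HTTP to the node's RPC port and parse the `result` field of the response.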
What we offer
  • Competitive pay starting at £70,000 per year.
  • Equity ownership that grows as the company grows.
  • Comprehensive health insurance offered by Vitality.
  • A dynamic office in Central London with unlimited coffee, snacks and gym access.