Data Engineering Tech Lead
Job in Killamarsh, Derbyshire, S21 1AA, England, UK
Listed on 2026-02-02
Listing for: Test Triangle
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Systems Engineer
Job Description & How to Apply Below
Overview
Role: Data Engineering Tech Lead
Job Category: GCB4
Location: UK
Responsibilities
- Design, implement, and maintain data pipelines to ingest and process OpenShift telemetry (metrics, logs, traces) at scale.
- Stream OpenShift telemetry via Kafka (producers, topics, schemas) and build resilient consumer services for transformation and enrichment (a minimal consumer-to-Splunk sketch follows this list).
- Engineer data models and routing for multi-tenant observability; ensure lineage, quality, and SLAs across the stream layer.
- Integrate processed telemetry into Splunk for visualisation, dashboards, alerting, and analytics to achieve Observability Level 4 (proactive insights).
- Implement schema management (Avro/Protobuf), governance, and versioning for telemetry events.
- Build automated validation, replay, and backfill mechanisms for data reliability and recovery.
- Instrument services with OpenTelemetry; standardise tracing, metrics, and structured logging across platforms (see the tracing sketch after the requirements list).
- Use LLMs to enhance observability capabilities (e.g., query assistance, anomaly summarisation, runbook generation).
- Collaborate with platform, SRE, and application teams to integrate telemetry, alerts, and SLOs.
- Ensure security, compliance, and best practices for data pipelines and observability platforms.
- Document data flows, schemas, dashboards, and operational runbooks.
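For illustration, a minimal Python sketch of the Kafka-to-Splunk enrichment flow described above; the broker, topic, HEC endpoint, token, and index values are placeholders, and JSON-encoded telemetry events are assumed:

```python
# Minimal sketch: consume JSON telemetry from Kafka, enrich, forward to Splunk HEC.
# Broker, topic, token, and index values below are placeholders, not real settings.
import json

import requests
from confluent_kafka import Consumer

SPLUNK_HEC_URL = "https://splunk.example.internal:8088/services/collector/event"
SPLUNK_HEC_TOKEN = "REPLACE-WITH-HEC-TOKEN"

consumer = Consumer({
    "bootstrap.servers": "kafka.example.internal:9092",
    "group.id": "telemetry-enricher",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["openshift.telemetry.raw"])

def enrich(event: dict) -> dict:
    """Attach tenant/routing metadata before indexing (placeholder logic)."""
    event["tenant"] = event.get("namespace", "unknown").split("-")[0]
    event["pipeline_stage"] = "enriched"
    return event

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = enrich(json.loads(msg.value()))
        # Splunk HTTP Event Collector expects an "event" payload plus metadata.
        requests.post(
            SPLUNK_HEC_URL,
            headers={"Authorization": f"Splunk {SPLUNK_HEC_TOKEN}"},
            json={"event": event, "sourcetype": "openshift:telemetry", "index": "observability"},
            timeout=5,
        )
finally:
    consumer.close()
```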
Requirements
- Hands-on experience building streaming data pipelines with Kafka (producers/consumers, schema registry, Kafka Connect/ksqlDB/Kafka Streams).
- Proficiency with OpenShift/Kubernetes telemetry (OpenTelemetry, Prometheus) and CLI tooling.
- Experience integrating telemetry into Splunk (HEC, UF, source types, CIM), building dashboards and alerting.
- Strong data engineering skills in Python (or similar) for ETL/ELT, enrichment, and validation.
- Knowledge of event schemas (Avro/Protobuf/JSON), contracts, and backward/forward compatibility.
- Familiarity with observability standards and practices; ability to drive toward Level 4 maturity (proactive monitoring, automated insights).
- Understanding of hybrid cloud and multi-cluster telemetry patterns.
- Security and compliance for data pipelines: secret management, RBAC, encryption in transit/at rest.
- Good problem-solving skills and ability to work in a collaborative team environment.
- Strong communication and documentation skills.
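For illustration, a minimal Python sketch of a standardised OpenTelemetry tracing setup of the kind referenced in the responsibilities above; the collector endpoint, service name, and process_batch helper are placeholders:

```python
# Minimal sketch: OpenTelemetry tracing setup exporting spans over OTLP/gRPC.
# Endpoint and service name are placeholders.
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

provider = TracerProvider(
    resource=Resource.create({"service.name": "telemetry-enricher"})
)
provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(endpoint="otel-collector.example.internal:4317", insecure=True)
    )
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)

def process_batch(events):
    # Each batch is processed under a span so latency and errors surface in traces.
    with tracer.start_as_current_span("process_batch") as span:
        span.set_attribute("telemetry.batch_size", len(events))
        for event in events:
            ...  # transformation / enrichment goes here
```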