Senior Data Engineer – Integrations (Italy)
Design and operate the data backbone that powers advanced blockchain analytics products used by teams and customers worldwide. Build reliable, scalable ingestion pipelines and well-modeled core datasets that others can confidently build on. Own systems that handle large volumes of streaming and batch data with a strong emphasis on correctness and stability. Collaborate closely with engineers, analysts, and product teams in a remote-first, high‑autonomy environment.
Accountabilities
Design, build, and scale high‑performance data pipelines and infrastructure using technologies such as ClickHouse, Python, and dbt
Own the full lifecycle of data pipelines, from raw ingestion through transformation to clean, well‑defined datasets
Build and operate systems that support large‑scale streaming and batch processing with strong guarantees around correctness and reliability
Improve observability, data quality checks, and failure handling to ensure predictable performance at scale
Collaborate with downstream data consumers to define clear dataset contracts, schemas, and usage patterns
Contribute to core datasets that serve as long‑lived foundations for analytics, product features, and research
Leverage AI‑powered development tools and agents to accelerate delivery, automate repetitive tasks, and improve code quality
Continuously evolve tooling and practices by staying current with modern data engineering approaches
Requirements
Proven experience building and operating production‑grade data pipelines that run continuously and reliably
Strong fundamentals in data and software engineering, with deep expertise in Python and SQL
Hands‑on production experience with ClickHouse and dbt
Solid understanding of streaming and batch ingestion patterns, including backfills, reprocessing, and schema evolution
Comfort working across the full data stack, including ingestion, transformation, storage, and exposure to downstream systems
Experience designing clean, reusable datasets intended for broad internal consumption
Familiarity with using AI‑assisted development tools in daily workflows, or strong curiosity to adopt them
Clear written and verbal communication skills suited to remote‑first and asynchronous collaboration
Pragmatic, ownership‑driven mindset with the ability to execute in complex environments
Experience with, or strong interest in, blockchain data, crypto ecosystems, and Web3 technologies
Benefits
Competitive compensation package aligned with a remote‑first setup
Flexible working model with autonomy over schedule and execution
Opportunity to work on massive‑scale data challenges in a rapidly growing industry
High‑impact role with influence on technical and product direction
Collaborative culture that values ownership, execution, and continuous improvement
Why Apply Through Jobgether?
We use an AI‑powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Our system identifies the top‑fitting candidates, and this shortlist is then shared directly with the hiring company. The final decision and next steps are managed by their internal team.
Position Requirements
10+ years of work experience