Lead Java Developer - VP
Listed on 2025-12-20
Citi
About The Team
The Fixed Income Data team is a strategic and integral part of supporting Citi's GSP/Rates businesses, delivering exceptional capabilities in advanced business analysis, strategic project management, cutting‑edge application development, robust quality assurance, and seamless integration with proprietary technology solutions. Our mission is to architect, build, and operate high‑performance, resilient data platforms that empower critical financial operations and drive real‑time decision‑making across the firm.
The Opportunity
We are seeking a highly accomplished Java engineer to join our Realtime Risk Data team. This role is at the forefront of architecting, leading, and significantly enhancing our comprehensive real‑time risk data acquisition, processing, and distribution framework. You will serve as a technical thought leader, driving the strategic adoption and implementation of cutting‑edge streaming technologies such as Apache Kafka for high‑throughput data ingestion, Apache Flink for complex event processing and real‑time analytics, and Apache Pinot for ultra‑low‑latency OLAP queries on large datasets.
You will also oversee the management of petabyte‑scale datasets on S3, enabling efficient querying via Trino. This role demands deep technical expertise, a strategic mindset, and the ability to mentor and guide engineering teams in designing, developing, and optimizing mission‑critical, high‑performance real‑time data solutions that directly impact global financial operations.
- Architectural Leadership: Serve as the principal architect for scalable, high‑performance Java‑based real‑time data solutions, ensuring robust design for high availability, fault tolerance, and resilience for both real‑time and end‑of‑day (EOD) risk processes.
- Strategic Implementation: Drive the implementation and optimization of distributed stream processing frameworks (Apache Kafka, Apache Flink) and real‑time data storage technologies (Apache Pinot) for ultra‑low‑latency analytics and complex event processing.
- Data Pipeline Mastery: Lead the end‑to‑end design, development, and operation of real‑time streaming data pipelines, integrating with large‑scale object storage solutions such as S3 and analytics engines such as Trino.
- Technical Excellence & Mentorship: Champion continuous improvement in data reliability, efficiency, and scalability. Establish and enforce best practices for code quality, performance optimization, and system resilience through hands‑on leadership and thorough peer code reviews. Mentor and technically guide senior and lead developers.
- SDLC Ownership: Drive significant contributions across all phases of the Agile software development lifecycle, from architectural vision and detailed design to implementation, deployment, monitoring, and ongoing support for critical real‑time data systems.
- Cross‑Functional Collaboration: Collaborate strategically with business analysts, product managers, quality assurance teams, and other engineering leads to deliver seamlessly integrated, high‑impact technology solutions that align with business objectives and architectural standards.
- Innovation & Research: Stay abreast of industry trends and emerging technologies in real‑time data processing, distributed systems, and cloud‑native architectures, evaluating and proposing their adoption where beneficial.
- Senior Data Engineering Expertise: 7+ years of progressive experience in data engineering and software development, with a significant focus on building high‑performance, large‑scale distributed systems.
- Java Mastery: Expert‑level command of Java (version 11 or higher) with a deep understanding of concurrent programming, multithreading, advanced OOP concepts, design patterns, and performance tuning.
- Real‑time Streaming Core: Proven, hands‑on production experience and deep architectural understanding of:
  - Apache Kafka or related technologies: for high‑throughput, fault‑tolerant message queuing and streaming.
  - Apache Flink or related technologies: for advanced real‑time stream processing, complex event processing (CEP), and…