
Confluent Kafka Developer

Job in 411001, Pune, Maharashtra, India
Listing for: Nice Software Solutions Pvt. Ltd.
Full Time position
Listed on 2026-02-14
Job specializations:
  • IT/Tech
    Data Engineer, Data Security
Job Description & How to Apply Below
Kafka/Confluent Developer (Banking Domain)

Location: Pune/Nagpur

Experience: 5+ Years

About the Role

We are seeking a highly skilled Kafka/Confluent Developer to design, build, and optimize real-time data integration and streaming solutions across our banking systems. This role requires strong expertise in Java, Confluent Kafka, and Kafka Connect APIs to ensure high-throughput, low-latency event-driven architectures that support mission-critical banking use cases.

Key Responsibilities

• Develop and maintain event-driven architectures using Confluent Kafka for real-time integration across core banking, CRM, fraud detection, and compliance systems.

• Design and implement Kafka producers and consumers to handle high-volume, low-latency banking transactions.

• Build reusable streaming components using Kafka Streams and ksqlDB for fraud detection, customer notifications, and operational alerts.

• Collaborate with the Data Governance team to ensure data lineage, quality, and metadata standards are upheld.

• Enforce schema evolution best practices with Confluent Schema Registry to manage compatibility across applications.

• Develop custom Kafka Connectors (Source/Sink) and implement robust error handling, retries, and logging.

• Integrate Kafka with external systems such as databases, REST APIs, and SOAP services.

• Work with DevOps, cybersecurity, and platform teams to ensure seamless deployment, monitoring, and security compliance.

• Partner with business units (Retail, Islamic Finance, Risk, Compliance) to gather requirements and translate them into scalable Kafka-based solutions.

• Support data platform architects and project managers with integration roadmaps and impact assessments.

• Enable real-time use cases such as customer onboarding status, transaction streaming, digital engagement analytics, and branch performance monitoring.
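The "robust error handling, retries, and logging" responsibility above can be sketched generically in Java. This is a minimal illustration only; the backoff values and the flaky downstream call are hypothetical, not from the posting:

```java
import java.util.concurrent.Callable;

public class RetrySketch {
    // Generic retry with exponential backoff, as a connector's send path
    // might wrap a flaky downstream call (database, REST API, SOAP service).
    static <T> T withRetries(Callable<T> op, int maxAttempts, long baseBackoffMs) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return op.call();
            } catch (Exception e) {
                last = e;
                System.out.println("attempt " + attempt + " failed: " + e.getMessage());
                Thread.sleep(baseBackoffMs * (1L << (attempt - 1))); // 10ms, 20ms, 40ms, ...
            }
        }
        throw last; // exhausted retries: surface to the dead-letter/alerting path
    }

    public static void main(String[] args) throws Exception {
        int[] calls = {0};
        // Hypothetical flaky operation: fails twice, then succeeds.
        String result = withRetries(() -> {
            if (++calls[0] < 3) throw new RuntimeException("transient");
            return "delivered";
        }, 5, 10);
        System.out.println(result);
    }
}
```

In a real connector, the final failure branch would typically route the record to a dead-letter topic rather than rethrow.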

Required Skills & Experience

• Strong Java proficiency (must-have).

• Hands-on experience with Confluent Kafka and Kafka Connect APIs (Source/Sink connectors, Task interfaces).

• Deep understanding of Kafka topics, partitions, offset management, and replication.

• Proficiency in schema handling (SchemaBuilder, Struct, schema evolution).

• Strong error handling, retries, and connector debugging using Confluent Connect logs.

• Experience integrating Kafka with databases, REST APIs, and SOAP services.

• Proficiency in SQL for query optimization and data validation.

• Knowledge of data governance, security standards, and compliance in banking/financial systems.

• Familiarity with Kafka Streams and ksqlDB for real-time analytics.
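The "topics, partitions, offset management" requirement above rests on one core idea: records with the same key land on the same partition, which is what preserves per-key ordering (e.g. per account). A simplified stand-in for Kafka's default partitioner (which actually uses murmur2 hashing, not `hashCode`):

```java
public class PartitionSketch {
    // Simplified illustration of key-based partitioning: deterministic
    // mapping from key to partition, so per-key ordering is preserved.
    // Kafka's real default partitioner uses murmur2 over the key bytes.
    static int partitionFor(String key, int numPartitions) {
        return Math.abs(key.hashCode() % numPartitions);
    }

    public static void main(String[] args) {
        int p1 = partitionFor("acct-1001", 6);
        int p2 = partitionFor("acct-1001", 6);
        System.out.println(p1 == p2); // same key always maps to the same partition
    }
}
```

This is why repartitioning a topic (changing the partition count) can break ordering guarantees for in-flight keys, a common interview topic for this kind of role.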

Preferred Qualifications

• Experience in banking, fintech, or financial services domain.

• Exposure to cloud platforms (AWS, Azure, GCP) for Kafka deployment.

• Familiarity with CI/CD pipelines and containerization (Docker/Kubernetes).

• Strong collaboration skills with cross-functional teams (DevOps, Business Analytics, Risk, Compliance).