
Sr. Kafka Engineer

Job in Newark, Essex County, New Jersey, 07175, USA
Listing for: Broadridge
Full Time position
Listed on 2025-12-01
Job specializations:
  • IT/Tech
    Cloud Computing, Data Engineer
Job Description & How to Apply Below

Overview

Broadridge is hiring a Sr. Kafka Engineer. In this role, you’ll lead the strategy, design, and operation of large-scale event streaming solutions built on Confluent Cloud and Apache Kafka. You’ll drive automation, security, and performance across hybrid and multi-cloud environments, ensuring the platform is resilient, scalable, and future-ready, and you’ll partner with cross-functional teams to power the real-time data streaming that fuels innovation and critical business insights.


Responsibilities
  • Architect, design, and implement Kafka-based solutions using Confluent Cloud and Confluent Platform, ensuring they are highly scalable, resilient, and future-proof.
  • Provide technical leadership in designing event-driven architectures that integrate with on-prem systems and multiple cloud environments (AWS, Azure, or GCP).
  • Oversee administration and operational management of Confluent Platform components: Kafka brokers, Schema Registry, Kafka Connect, ksqlDB, and REST Proxy.
  • Develop and maintain Kafka producers, consumers, and Kafka Streams applications to support real-time data streaming use cases (a minimal producer sketch follows this list).
  • Lead deployment and configuration of Kafka topics, partitions, and replication strategies in both on-prem and cloud setups.
  • Automate provisioning, deployment, and maintenance tasks with Terraform, Chef, Ansible, Jenkins, or similar automation and CI/CD tools.
  • Implement robust monitoring, alerting, and observability frameworks using Splunk, Datadog, Prometheus, or similar tools for both Confluent Cloud and on-prem clusters.
  • Proactively troubleshoot Kafka clusters, diagnose performance issues, and conduct root cause analysis for complex, distributed environments.
  • Conduct capacity planning and performance tuning to optimize Kafka clusters and ensure they can handle current and future data volumes.
  • Define and maintain SLA/SLI metrics to track latency, throughput, and downtime.
  • Ensure secure configuration of all Kafka and Confluent components, implementing best practices for authentication (Kerberos/OAuth), encryption (SSL/TLS), and access control (RBAC).
  • Collaborate with InfoSec teams to maintain compliance with internal policies and industry regulations (GDPR, SOC, PCI, etc.).
  • Work with DevOps, Cloud, Application, and Infrastructure teams to define and align on business requirements for data streaming solutions.
  • Provide guidance and support during platform upgrades, expansions, and new feature rollouts.
  • Stay current with Confluent Platform releases and Kafka community innovations.
  • Drive continuous improvement by recommending new tools, frameworks, and processes to enhance reliability and developer productivity.
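
For illustration only: a minimal sketch of the kind of Java producer the "producers, consumers, and Kafka Streams applications" bullet refers to. The bootstrap endpoint, topic name, and payload are placeholder assumptions, not details taken from this posting.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.clients.producer.RecordMetadata;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class OrderEventProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Placeholder endpoint; a Confluent Cloud cluster supplies its own bootstrap URL.
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            // Wait for all in-sync replicas and enable idempotence for safer retries.
            props.put("acks", "all");
            props.put("enable.idempotence", "true");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Hypothetical topic and payload, used only to show the producer API shape.
                ProducerRecord<String, String> record =
                        new ProducerRecord<>("orders", "order-123", "{\"status\":\"CREATED\"}");
                producer.send(record, (RecordMetadata metadata, Exception e) -> {
                    if (e != null) {
                        e.printStackTrace();
                    } else {
                        System.out.printf("Wrote to %s-%d at offset %d%n",
                                metadata.topic(), metadata.partition(), metadata.offset());
                    }
                });
                producer.flush();
            }
        }
    }

Setting acks to "all" with idempotence enabled trades a little latency for durability and avoids duplicate writes on retries, a common default for business-critical streams.
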
Qualifications
  • 5+ years of hands-on experience with Apache Kafka, including 2+ years focused on Confluent Cloud and Confluent Platform.
  • Deep knowledge of Kafka Connect, Schema Registry, Control Center, ksqlDB, and other Confluent components.
  • Experience architecting and managing hybrid Kafka solutions in on-prem and cloud (AWS, Azure, GCP).
  • Advanced understanding of event-driven architecture and the real-time data integration ecosystem.
  • Strong programming/scripting skills (Java, Python, Scala) for Kafka-based application development and automation tasks.
  • Hands-on experience with Infrastructure as Code (Terraform, CloudFormation) for Kafka resource management in both cloud and on-prem environments.
  • Familiarity with Chef, Ansible, or similar configuration management tools to automate deployments.
  • Skilled in CI/CD pipelines (e.g., Jenkins) and version control (Git) for distributed systems.
  • Proven ability to monitor and troubleshoot large-scale, distributed Kafka environments using Splunk, Datadog, Prometheus, or similar tools.
  • Experience with performance tuning and incident management to minimize downtime and data loss.
  • Expertise in securing Kafka deployments, including Kerberos and SSL/TLS configurations, and understanding of IAM, network security, encryption, and governance in hybrid environments (a minimal secure-client sketch follows this list).
  • Demonstrated experience leading platform upgrades, migrations, and architecture reviews; excellent communication skills for diverse audiences.
  • Bachelor’s or Master’s degree in…
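
For illustration only: a minimal sketch of a TLS-encrypted, SASL-authenticated Kafka consumer of the kind the security-focused qualifications describe. The bootstrap endpoint, group id, topic, and credential placeholders are assumptions, not values from this posting.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class SecureConsumerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Placeholder endpoint, group id, and topic name.
            props.put("bootstrap.servers", "broker.example.com:9093");
            props.put("group.id", "orders-audit");
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());
            // Encrypt traffic with TLS and authenticate over SASL (API key/secret on Confluent Cloud).
            props.put("security.protocol", "SASL_SSL");
            props.put("sasl.mechanism", "PLAIN");
            props.put("sasl.jaas.config",
                    "org.apache.kafka.common.security.plain.PlainLoginModule required "
                            + "username=\"<api-key>\" password=\"<api-secret>\";");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("orders"));
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s: %s%n", record.key(), record.value());
                }
            }
        }
    }

An on-prem cluster secured with Kerberos would instead use the GSSAPI SASL mechanism and point the JAAS entry at a keytab, while keeping TLS for encryption in transit.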
To View & Apply for jobs on this site that accept applications from your location or country, tap the button below to make a Search.
(If this job is in fact in your jurisdiction, then you may be using a Proxy or VPN to access this site, and to progress further, you should change your connectivity to another mobile device or PC).
 
 
 
Search for further Jobs Here:
(Try combinations for better Results! Or enter less keywords for broader Results)
Location
Increase/decrease your Search Radius (miles)

Job Posting Language
Employment Category
Education (minimum level)
Filters
Education Level
Experience Level (years)
Posted in last:
Salary