We are seeking an experienced Confluent Kafka Engineer to design, build, and operate a scalable Kafka platform that enables self-service streaming for engineering teams. This is a hands-on engineering role focused on platform reliability, automation, governance, and developer enablement.
Key Responsibilities
Kafka Platform Engineering
- Design, build, and operate large-scale Confluent Kafka clusters (on-prem, cloud, or hybrid)
- Architect Kafka clusters for high availability, fault tolerance, and performance
- Own capacity planning, scaling strategies, and Kafka lifecycle management
- Define best practices for topic design, partitioning, replication, and retention
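To make the topic-design conventions above concrete, here is a minimal sketch of programmatic topic provisioning with the confluent-kafka Python AdminClient; the broker address, topic name, partition count, and retention values are illustrative assumptions rather than prescribed settings.

```python
# Minimal sketch: creating a topic with explicit partitioning, replication,
# and retention settings via the confluent-kafka AdminClient.
# Broker address, topic name, and config values are illustrative only.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "kafka-broker:9092"})  # assumed address

topic = NewTopic(
    "orders.payments.v1",          # example naming convention: domain.entity.version
    num_partitions=12,             # sized for expected consumer parallelism
    replication_factor=3,          # tolerates a single broker failure
    config={
        "retention.ms": str(7 * 24 * 60 * 60 * 1000),  # 7-day retention
        "min.insync.replicas": "2",                     # pair with acks=all producers
        "cleanup.policy": "delete",
    },
)

# create_topics is asynchronous; each returned value is a future that raises on failure
for name, future in admin.create_topics([topic]).items():
    try:
        future.result()
        print(f"created {name}")
    except Exception as exc:
        print(f"failed to create {name}: {exc}")
```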
Self-Service Enablement & Tooling
- Design and build internal APIs and services to enable Kafka self-service (topic lifecycle, ACL provisioning, schema onboarding)
- Develop CLI tools and developer-facing utilities for standardized Kafka operations (see the CLI sketch after this list)
- Build automation using Python-based frameworks to reduce manual effort
- Integrate Kafka tooling with CI/CD pipelines and internal developer platforms
- Ensure standards and guardrails are enforced through automation
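As an illustration of the CLI and guardrail work described above, here is a minimal Typer-based sketch that validates a topic request against example naming and partition-count rules before handing off to the platform; the naming pattern, limits, and command are assumptions made for illustration, not the team's actual standards.

```python
# Minimal sketch of a developer-facing CLI that enforces guardrails before
# provisioning a topic. Naming rules and limits below are illustrative.
import re
import typer

app = typer.Typer(help="Self-service Kafka topic provisioning (illustrative)")

TOPIC_PATTERN = re.compile(r"^[a-z]+\.[a-z-]+\.v\d+$")  # e.g. orders.payments.v1
MAX_PARTITIONS = 48                                      # example guardrail

@app.command()
def create_topic(
    name: str,
    partitions: int = typer.Option(6, help="Partition count"),
    retention_days: int = typer.Option(7, help="Retention in days"),
):
    """Validate the request against platform standards, then provision."""
    if not TOPIC_PATTERN.match(name):
        typer.echo("Topic name must follow the domain.entity.vN convention")
        raise typer.Exit(code=1)
    if partitions > MAX_PARTITIONS:
        typer.echo(f"Partition count is capped at {MAX_PARTITIONS}")
        raise typer.Exit(code=1)
    # Hand off to the platform API / AdminClient here (omitted in this sketch).
    typer.echo(f"Provisioning {name}: {partitions} partitions, {retention_days}d retention")

if __name__ == "__main__":
    app()
```

In a real platform, commands like this would call the internal provisioning API rather than the brokers directly, so the same guardrails apply whether requests come from the CLI, CI/CD pipelines, or the developer portal.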
Standards, Guardrails & Governance
- Define and enforce Kafka standards, conventions, and policies
- Implement guardrails around security, quotas, data ownership, and cost controls
- Establish governance for schemas, topics, access control, and data lifecycle (see the ACL sketch after this list)
- Partner with security and compliance teams to meet regulatory requirements
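To ground the access-control point above, here is a minimal sketch of ACL provisioning through the confluent-kafka AdminClient; the principal, topic, and host values are placeholder assumptions.

```python
# Minimal sketch: granting a consuming team's service account read access to a
# topic via the confluent-kafka AdminClient. Principal, topic, and host values
# are illustrative placeholders.
from confluent_kafka.admin import (
    AdminClient, AclBinding, AclOperation, AclPermissionType,
    ResourceType, ResourcePatternType,
)

admin = AdminClient({"bootstrap.servers": "kafka-broker:9092"})  # assumed address

read_acl = AclBinding(
    ResourceType.TOPIC,
    "orders.payments.v1",                 # resource the ACL applies to
    ResourcePatternType.LITERAL,
    "User:svc-payments-consumer",         # example service-account principal
    "*",                                  # any host
    AclOperation.READ,
    AclPermissionType.ALLOW,
)

# create_acls is asynchronous; each returned value is a future that raises on failure
for binding, future in admin.create_acls([read_acl]).items():
    try:
        future.result()
        print(f"applied ACL for {binding.principal}")
    except Exception as exc:
        print(f"ACL provisioning failed: {exc}")
```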
Reliability & Operations
- Ensure Kafka platform stability, performance, and observability
- Implement monitoring, alerting, and incident response practices (see the sketch after this list)
- Lead root-cause analysis and continuous improvement efforts
- Plan and execute upgrades and platform evolution with minimal downtime
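As one example of the monitoring work referenced above, here is a minimal sketch that flags under-replicated partitions from a cluster metadata snapshot using the confluent-kafka AdminClient; in practice this signal would usually come from JMX/Prometheus exporters rather than a script, and the broker address is a placeholder.

```python
# Minimal sketch: flag under-replicated partitions from a metadata snapshot.
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "kafka-broker:9092"})  # assumed address

metadata = admin.list_topics(timeout=10)  # cluster-wide metadata snapshot
for topic_name, topic in metadata.topics.items():
    for pid, partition in topic.partitions.items():
        # A partition is under-replicated when its in-sync replica set is
        # smaller than its assigned replica set.
        if len(partition.isrs) < len(partition.replicas):
            print(f"under-replicated: {topic_name}[{pid}] "
                  f"isr={partition.isrs} replicas={partition.replicas}")
```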
Collaboration & Enablement
- Partner with application, data, and platform engineering teams
- Provide documentation, examples, and best practices for Kafka usage
- Act as a Kafka subject-matter expert across the organization
Required Qualifications
- 7+ years of experience in software, platform, or infrastructure engineering
- 5+ years of hands-on experience with Apache Kafka and/or Confluent Platform
- Proven experience operating Kafka at scale in production environments
- Deep understanding of Kafka internals (brokers, KRaft/ZooKeeper, replication, ISR)
- Experience building self-service platforms with automation and guardrails
- Strong Python programming skills for APIs, CLIs, and automation tooling
- Experience developing RESTful APIs and internal platform services
- Experience building and maintaining CLI tools
- Familiarity with automation frameworks and workflow orchestration
- Proficiency with infrastructure-as-code tools (Terraform, Ansible, etc.)
- Strong Linux and networking fundamentals
- Experience with monitoring and observability tools (Prometheus, Grafana)
Preferred Qualifications
- Experience with Kafka Connect, Schema Registry, and ksqlDB
- Experience building developer platforms or internal PaaS tooling
- Experience with Python frameworks (FastAPI, Flask, Click, Typer)
- Cloud experience (AWS, GCP, Azure) and/or Kubernetes-based Kafka deployments
- Experience operating multi-tenant Kafka platforms
- Knowledge of data governance, security, and compliance