US_East | Application Consultant – Big Data
Listed on 2026-02-12
IT/Tech
Cloud Computing, Data Engineer
Description: “Possible 3 Month CTH | No Fees | Do Not Re-Post | Confidential. Submit candidates under their legal name and use only the Capgemini template.” PLEASE SUBMIT CVS IN FG WITH COMPLETE HOME ADDRESS.
Role Name:
Kafka Platform Engineer
Location:
Atlanta
Start date:
03/11/2025
Background check: MANDATORY
Request (R2D2): BBT37N
“Due to additional onboarding requirements, a meet and greet is required for all new hires. Candidates must be willing to go to the closest Capgemini, Client, or onsite location, as indicated by the project team, to meet with a Capgemini team member prior to starting their assignment. If the candidate is not local, travel is covered by Capgemini. If travel is involved and, after selection, the candidate declines the offer, costs will be paid by the vendor and not Capgemini.”
Responsibilities:
- Design, deploy, and manage Kafka clusters in production and non-production environments.
- Ensure high availability, scalability, and performance of the Kafka platform.
- Develop automation for provisioning, monitoring, and maintaining Kafka infrastructure.
- Collaborate with application teams to onboard them to Kafka and support their use cases.
- Implement security best practices including authentication, authorization, and encryption.
- Monitor system health and performance using tools like Prometheus, Grafana, or equivalent.
- Troubleshoot and resolve issues related to Kafka brokers, producers, consumers, and connectors.
- Maintain and enhance Kafka Connect, Schema Registry, and Kafka Streams infrastructure.
- Participate in capacity planning and disaster recovery strategies.
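As a concrete illustration of the security work listed above (authentication, authorization, and encryption), the fragment below sketches a minimal, hypothetical Kafka broker configuration enabling TLS encryption, SASL/SCRAM authentication, and ACL-based authorization. The property names are standard Kafka broker settings; the hostname, port, paths, and passwords are placeholders, and `AclAuthorizer` applies to ZooKeeper-mode clusters (KRaft-mode clusters use `StandardAuthorizer` instead).

```properties
# Encrypt and authenticate all client and inter-broker traffic.
listeners=SASL_SSL://broker1.example.com:9093
security.inter.broker.protocol=SASL_SSL
sasl.enabled.mechanisms=SCRAM-SHA-512
sasl.mechanism.inter.broker.protocol=SCRAM-SHA-512

# TLS keystore/truststore (paths and passwords are placeholders).
ssl.keystore.location=/etc/kafka/secrets/broker1.keystore.jks
ssl.keystore.password=changeit
ssl.truststore.location=/etc/kafka/secrets/broker1.truststore.jks
ssl.truststore.password=changeit

# Authorization: deny by default, grant per-principal access via ACLs.
authorizer.class.name=kafka.security.authorizer.AclAuthorizer
allow.everyone.if.no.acl.found=false
super.users=User:admin
```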
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 5 years of experience in infrastructure or platform engineering roles.
- 2 years of hands-on experience managing Kafka in production environments.
- Strong understanding of Kafka internals, including partitions, replication, ISR, and retention policies.
- Experience with Kafka ecosystem tools (Kafka Connect, Schema Registry, Mirror Maker, etc.).
- Proficiency in scripting and automation (e.g., Bash, Python, Ansible, Terraform).
- Familiarity with containerization and orchestration (Docker, Kubernetes).
- Experience with cloud platforms (AWS, Azure, or GCP).
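As background for the partitioning knowledge required above: Kafka's default partitioner maps a keyed message to a partition by hashing the key, so all messages with the same key land on the same partition and keep their relative order. The sketch below is a simplified Python illustration of that mechanism only; it uses CRC32 as a stand-in hash, whereas real Kafka clients use murmur2, so the resulting partition numbers will not match an actual client's.

```python
import zlib


def choose_partition(key: bytes, num_partitions: int) -> int:
    """Map a message key to a partition deterministically.

    Illustrative only: real Kafka clients hash keys with murmur2,
    not CRC32, but the mechanism (hash mod partition count) is the same.
    """
    if num_partitions <= 0:
        raise ValueError("num_partitions must be positive")
    # Mask to a non-negative value, then take it modulo the partition count.
    return (zlib.crc32(key) & 0x7FFFFFFF) % num_partitions


# The same key always maps to the same partition, preserving per-key ordering.
p1 = choose_partition(b"order-123", 6)
p2 = choose_partition(b"order-123", 6)
assert p1 == p2 and 0 <= p1 < 6
```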