Kafka Engineer
Listed on 2025-12-11
IT/Tech
Data Engineer, Data Science Manager, Data Analyst
Job Category: Information Technology
Time Type: Full time
Minimum Clearance Required to Start: None
Employee Type: Regular
Percentage of Travel Required: Up to 10%
Type of Travel: Local
CACI is seeking a Kafka Engineer to join our team and support the Border Enforcement Applications for Government Leading-Edge Information Technology (IT) (BEAGLE) contract. You will have the opportunity to apply your knowledge, skills, and experience to building a truly modern application that is both new development and cloud native. If you thrive in a culture of innovation and bring creative ideas to solving complex technical and procedural problems at the team and portfolio levels, then this opportunity is for you!
Join this passionate team of industry-leading individuals supporting best practices in agile software development for the Department of Homeland Security (DHS). You will support the men and women charged with safeguarding the American people and enhancing the nation’s safety and security.
Responsibilities: Serve as an Agile Scrum team member providing software development support and maintenance for the delivery of releasable software in short sprint cycles. You will be responsible for activities associated with the delivery of software solutions for customer-defined systems and software projects, working in close collaboration with software developers/engineers, stakeholders, and end users within Agile processes.
Responsibilities include:
- Design, develop, and deploy high-performance Kafka producers, consumers, and stream processing applications (using Kafka Streams, ksqlDB, Flink, or Spark Streaming) in Java (see the producer/consumer sketch after this list).
- Collaborate with architects and other engineering teams to define and evolve our event‑driven architecture, ensuring best practices for Kafka topic design, partitioning, replication, and data retention (illustrated in the topic-provisioning sketch below).
- Implement and manage components of the Kafka ecosystem, including Kafka Connect (source and sink connectors), Schema Registry (Avro, Protobuf), and Kafka security features (a minimal Avro/Schema Registry producer is sketched after this list).
- Monitor, troubleshoot, and optimize Kafka clusters and Kafka‑dependent applications for throughput, latency, reliability, and resource utilization.
- Build and maintain robust and resilient data pipelines for real‑time ingestion, transformation, and distribution of data across various systems.
- Provide operational support for Kafka‑based systems, including incident response, root cause analysis, and proactive maintenance to ensure high availability and reliability.
- Enforce data contract definitions and schema evolution strategies using Schema Registry to maintain data quality and compatibility across services.
- Implement comprehensive testing strategies for Kafka applications, including unit, integration, and end‑to‑end tests, ensuring data integrity and system reliability (see the topology test sketch after this list).
- Create and maintain detailed technical documentation, architectural diagrams, and operational runbooks for Kafka‑related components and processes.
- Act as a subject matter expert, sharing knowledge, mentoring junior engineers, and championing Kafka best practices across the organization.
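
As one hedged illustration of the producer and consumer work named in the first bullet above, the sketch below shows a minimal Java round trip using the standard Kafka client API. The broker address, the "border-events" topic, the key, and the consumer group id are placeholders invented for this sketch, not details from the BEAGLE program.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventRoundTrip {
    public static void main(String[] args) {
        // Producer: acks=all waits for the full in-sync replica set,
        // trading a little latency for durability.
        Properties p = new Properties();
        p.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        p.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        p.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        p.put(ProducerConfig.ACKS_CONFIG, "all");
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(p)) {
            // Keying by entity id keeps all events for one entity in one
            // partition, preserving their relative order.
            producer.send(new ProducerRecord<>("border-events", "entity-42", "{\"status\":\"cleared\"}"));
        }

        // Consumer: membership in a consumer group lets partitions be
        // rebalanced automatically across application instances.
        Properties c = new Properties();
        c.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        c.put(ConsumerConfig.GROUP_ID_CONFIG, "border-events-processor"); // placeholder group id
        c.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        c.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        c.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(c)) {
            consumer.subscribe(List.of("border-events"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> r : records) {
                System.out.printf("partition=%d offset=%d key=%s value=%s%n",
                        r.partition(), r.offset(), r.key(), r.value());
            }
        }
    }
}
```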
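Topic design decisions such as partition count, replication factor, and retention can be expressed in code with Kafka's AdminClient, as in the sketch below. The specific values (12 partitions, replication factor 3, 7-day retention, min.insync.replicas=2) are illustrative assumptions, not requirements from this posting.

```java
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.config.TopicConfig;

public class TopicProvisioner {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        try (AdminClient admin = AdminClient.create(props)) {
            // 12 partitions bounds maximum consumer parallelism; replication
            // factor 3 survives the loss of one broker while keeping a quorum.
            NewTopic topic = new NewTopic("border-events", 12, (short) 3)
                    .configs(Map.of(
                            // Retain 7 days of events; log compaction would be the
                            // alternative for changelog-style topics.
                            TopicConfig.RETENTION_MS_CONFIG, String.valueOf(7L * 24 * 60 * 60 * 1000),
                            // With acks=all, writes need 2 in-sync replicas to succeed.
                            TopicConfig.MIN_IN_SYNC_REPLICAS_CONFIG, "2"));
            admin.createTopics(List.of(topic)).all().get();
        }
    }
}
```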
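For the Schema Registry and schema-evolution bullets, a minimal Avro producer might be configured as sketched below. This assumes Confluent's kafka-avro-serializer dependency and a registry at a placeholder URL; the "CrossingEvent" schema is invented purely for illustration. The default on the "status" field is what makes adding that field a backward-compatible evolution, the kind of compatibility rule Schema Registry enforces.

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class AvroProducerSketch {
    // Hypothetical schema for illustration; fields with defaults can be
    // added later without breaking existing consumers.
    private static final String SCHEMA_JSON = """
            {"type":"record","name":"CrossingEvent","fields":[
              {"name":"entityId","type":"string"},
              {"name":"status","type":"string","default":"unknown"}]}""";

    public static void main(String[] args) {
        Properties p = new Properties();
        p.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        p.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Confluent's Avro serializer registers/validates the schema against
        // Schema Registry on first send, then writes only a schema id per record.
        p.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        p.put("schema.registry.url", "http://localhost:8081"); // placeholder registry

        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);
        GenericRecord event = new GenericData.Record(schema);
        event.put("entityId", "entity-42");
        event.put("status", "cleared");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(p)) {
            producer.send(new ProducerRecord<>("border-events", "entity-42", event));
        }
    }
}
```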
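On the testing bullet: Kafka Streams topologies can be unit-tested without a running broker using TopologyTestDriver from the kafka-streams-test-utils artifact. The sketch below exercises a deliberately trivial, made-up uppercase topology; real topologies are driven the same way, record in, record out, with no network involved.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.TopologyTestDriver;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class UppercaseTopologyTest {
    // The topology under test: a stateless transform from one topic to another.
    static Topology build() {
        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("raw-events", Consumed.with(Serdes.String(), Serdes.String()))
               .mapValues(v -> v.toUpperCase())
               .to("normalized-events", Produced.with(Serdes.String(), Serdes.String()));
        return builder.build();
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "topology-test");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:9092"); // never contacted by the test driver

        // TopologyTestDriver runs the topology in-process, so unit tests stay fast.
        try (TopologyTestDriver driver = new TopologyTestDriver(build(), props)) {
            TestInputTopic<String, String> in =
                    driver.createInputTopic("raw-events", new StringSerializer(), new StringSerializer());
            TestOutputTopic<String, String> out =
                    driver.createOutputTopic("normalized-events", new StringDeserializer(), new StringDeserializer());

            in.pipeInput("entity-42", "cleared");
            String result = out.readValue();
            if (!"CLEARED".equals(result)) {
                throw new IllegalStateException("unexpected value: " + result);
            }
        }
    }
}
```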
Qualifications:
- Must be a U.S. Citizen with the ability to pass a CBP background investigation; criteria include but are not limited to: a 1‑year check for misconduct such as theft or fraud, a 1‑year check for illegal drug use, and a 3‑year check for felony convictions.
- Extensive hands‑on experience designing, developing, and deploying applications using Apache Kafka (producers, consumers, topic management, consumer groups).
- Deep understanding of Kafka's internal architecture, delivery semantics and guarantees (at‑least‑once, exactly‑once), and offset management (a transactional producer sketch follows this list).
- Experience with Kafka Streams API or other stream processing frameworks (e.g., Flink, Spark Streaming with Kafka).
- Programming proficiency: strong command of at least one modern backend programming language suitable for Kafka development (Java strongly preferred).
- Strong understanding of distributed systems principles, concurrency, fault tolerance, and resilience patterns.
- Experience with data serialization formats such as Avro, Protobuf, or JSON Schema, and their use with Kafka Schema Registry.
- Solid understanding of relational and/or…
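
As a sketch of the delivery-semantics point above, the transactional producer below combines idempotence (which removes duplicates caused by retries) with a transactional.id (which makes multi-record writes atomic), the two building blocks of Kafka's exactly-once guarantees. The broker address, topic names, and transactional id are placeholders invented for this example.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.KafkaException;
import org.apache.kafka.common.errors.AuthorizationException;
import org.apache.kafka.common.errors.OutOfOrderSequenceException;
import org.apache.kafka.common.errors.ProducerFencedException;
import org.apache.kafka.common.serialization.StringSerializer;

public class ExactlyOnceProducer {
    public static void main(String[] args) {
        Properties p = new Properties();
        p.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        p.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        p.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Idempotence deduplicates retried sends; the transactional.id lets the
        // broker fence zombie instances of this producer after a restart.
        p.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
        p.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "border-events-writer-1"); // placeholder id

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(p)) {
            producer.initTransactions();
            try {
                producer.beginTransaction();
                producer.send(new ProducerRecord<>("border-events", "entity-42", "arrived"));
                producer.send(new ProducerRecord<>("audit-log", "entity-42", "arrival recorded"));
                producer.commitTransaction(); // both records become visible atomically
            } catch (ProducerFencedException | OutOfOrderSequenceException | AuthorizationException e) {
                throw e; // fatal: this producer must be closed, not reused
            } catch (KafkaException e) {
                // Transient failure: roll back; consumers with
                // isolation.level=read_committed never see aborted writes.
                producer.abortTransaction();
                throw e;
            }
        }
    }
}
```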