
Kafka Engineer

Job in Virginia, St. Louis County, Minnesota, 55792, USA
Listing for: CACI International Inc
Full Time position
Listed on 2025-12-11
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Job Description

Job Category: Information Technology
Time Type: Full time
Minimum Clearance Required to Start: None
Employee Type: Regular
Percentage of Travel Required: Up to 10%
Type of Travel: Local

The Opportunity

CACI is seeking a Kafka Engineer to join our team and support the Border Enforcement Applications for Government Leading-Edge Information Technology (IT) (BEAGLE) contract. You will have the opportunity to apply your knowledge, skills, and experience to building a truly modern, cloud-native application developed from the ground up. If you thrive in a culture of innovation and bring creative ideas to solve complex technical and procedural problems at the team and portfolio levels, then this opportunity is for you!

Join this passionate team of industry-leading individuals supporting best practices in agile software development for the Department of Homeland Security (DHS). You will support the men and women charged with safeguarding the American people and enhancing the nation’s safety and security.

Responsibilities
  • Serve as an Agile Scrum team member providing software development support and maintenance for the delivery of releasable software in short sprint cycles. Responsible for activities associated with the delivery of software solutions for customer-defined systems and software projects, working in close collaboration with software developers/engineers, stakeholders, and end users within Agile processes.

    Responsibilities include:
  • Design, develop, and deploy high-performance Kafka producers, consumers, and stream processing applications (using Kafka Streams, ksqlDB, Flink, or Spark Streaming) in Java (an illustrative producer sketch follows this list).
  • Collaborate with architects and other engineering teams to define and evolve our event‑driven architecture, ensuring best practices for Kafka topic design, partitioning, replication, and data retention.
  • Implement and manage components of the Kafka ecosystem, including Kafka Connect (source and sink connectors), Schema Registry (Avro, Protobuf), and Kafka security features.
  • Monitor, troubleshoot, and optimize Kafka clusters and Kafka‑dependent applications for throughput, latency, reliability, and resource utilization.
  • Build and maintain robust and resilient data pipelines for real‑time ingestion, transformation, and distribution of data across various systems.
  • Provide operational support for Kafka‑based systems, including incident response, root cause analysis, and proactive maintenance to ensure high availability and reliability.
  • Enforce data contract definitions and schema evolution strategies using Schema Registry to maintain data quality and compatibility across services.
  • Implement comprehensive testing strategies for Kafka applications, including unit, integration, and end‑to‑end tests, ensuring data integrity and system reliability.
  • Create and maintain detailed technical documentation, architectural diagrams, and operational runbooks for Kafka‑related components and processes.
  • Act as a subject matter expert, sharing knowledge, mentoring junior engineers, and championing Kafka best practices across the organization.
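For illustration only: the producer work described above typically looks something like the minimal Java sketch below, written against the standard Apache Kafka clients API. The broker address, topic name ("events"), record key, and payload are placeholders for this example and are not specifics of the BEAGLE program.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; a real deployment would use the cluster's bootstrap servers.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ACKS_CONFIG, "all");                // wait for full in-sync-replica acknowledgment
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true"); // avoid duplicate writes on retry

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying the record keeps related events on the same partition, preserving their order.
            ProducerRecord<String, String> record =
                new ProducerRecord<>("events", "entity-123", "{\"status\":\"CREATED\"}");
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace(); // production code would route this to monitoring/alerting
                } else {
                    System.out.printf("Wrote to %s-%d@%d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
            producer.flush();
        }
    }
}
```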
Qualifications
  • Must be a U.S. Citizen with the ability to pass a CBP background investigation; criteria include, but are not limited to, a 1-year check for misconduct such as theft or fraud, a 1-year check for illegal drug use, and a 3-year check for felony convictions.
  • Extensive hands‑on experience designing, developing, and deploying applications using Apache Kafka (producers, consumers, topic management, consumer groups).
  • Deep understanding of Kafka's internal architecture, guarantees (at‑least‑once, exactly‑once), offset management, and delivery semantics.
  • Experience with Kafka Streams API or other stream processing frameworks (e.g., Flink, Spark Streaming with Kafka).
  • Programming proficiency: high-level fluency in at least one modern backend programming language suitable for Kafka development (Java strongly preferred).
  • Strong understanding of distributed systems principles, concurrency, fault tolerance, and resilience patterns.
  • Experience with data serialization formats such as Avro, Protobuf, or JSON Schema, and their use with Kafka Schema Registry.
  • Solid understanding of relational and/or…