Cloud Architect - Kafka

Job in Woodlawn, Prince George's County, Maryland, USA
Listing for: Sky Solutions
Full Time position
Listed on 2025-12-05
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description & How to Apply Below

Location:
Woodlawn, MD (Onsite 5 days a week)

Note:
Selected candidate must be willing to work on-site in Woodlawn, MD 5 days a week.

Key Required Skills: Confluent Kafka, Apache Flink, Kafka Connect, Python, Java and Spring Boot.

Position Description:

  • Lead and organize a team of Kafka administrators and developers, assign tasks, and facilitate weekly Kafka Technical Review meetings.
  • Work alongside the customer to determine expanded uses of Kafka within the Agency.
  • Strategize within the client organization to create opportunities to explore new technologies to use with Kafka.
  • Architect, design, code, and implement a next-generation data streaming and event-based architecture/platform on Confluent Kafka.
  • Define the strategy for streaming data to the data warehouse and for integrating the event-based architecture with microservice-based applications (a minimal publisher/consumer sketch follows this list).
  • Establish Kafka best practices and standards for implementing the Kafka platform based on identified use cases and required integration patterns.
  • Mentor existing team members by imparting expert knowledge to build a high-performing team around our event-driven architecture. Assist developers in choosing the correct patterns, modelling events, and ensuring data integrity.
  • Provide software expertise in one or more of these areas: application integration, enterprise services, service-oriented architectures (SOA), security, business process management/business rules processing, data ingestion/data modeling.
  • Triage, investigate, and advise in a hands-on capacity to resolve platform issues, regardless of component.
  • Brief management, the customer, the team, or vendors, in writing or verbally, at a technical level appropriate for the audience. Share up-to-date insights on the latest Kafka-based solutions, formulate creative approaches to address business challenges, present and host workshops with senior leaders, and translate technical jargon into plain language and vice versa.
  • All other duties as assigned or directed.

Basic Qualifications:

  • Bachelor's degree in Computer Science, Mathematics, Engineering, or a related field with 12 years of relevant experience, OR a Master's degree with 10 years of relevant experience. Additional years of experience may be accepted in lieu of a degree.
  • 12+ years of experience with modern software development including systems/application analysis and design.
  • 7+ years of combined experience with Kafka (Confluent Kafka and/or Apache Kafka).
  • 2+ years of combined experience with designing, architecting, and deploying to AWS cloud platform.
  • 1+ years of leading a technical team.
  • Must be able to obtain and maintain a Public Trust security clearance.
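
As illustration of the event-publishing and microservice-integration duties above, the sketch below shows a minimal Spring Boot service that produces and consumes Kafka events. It is not part of the posting: it assumes the spring-kafka library and a broker configured through standard spring.kafka.* properties, and the class, topic, and consumer-group names are hypothetical placeholders.

    // Minimal sketch only: assumes spring-kafka on the classpath and broker
    // settings supplied via spring.kafka.* properties. Topic and group names
    // are hypothetical, not values from this posting.
    package example.streaming;

    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.stereotype.Service;

    @Service
    public class CaseEventPublisher {

        private static final String TOPIC = "agency.case-events"; // hypothetical topic

        private final KafkaTemplate<String, String> kafkaTemplate;

        public CaseEventPublisher(KafkaTemplate<String, String> kafkaTemplate) {
            this.kafkaTemplate = kafkaTemplate;
        }

        // Key by case id so events for the same case stay ordered on one partition.
        public void publish(String caseId, String eventJson) {
            kafkaTemplate.send(TOPIC, caseId, eventJson);
        }
    }

    // A downstream microservice (for example, a warehouse loader) consumes the same topic.
    @Service
    class CaseEventListener {

        @KafkaListener(topics = "agency.case-events", groupId = "warehouse-loader") // hypothetical group
        public void onEvent(String eventJson) {
            // Hand the payload to warehouse-loading or business logic here.
        }
    }

In practice the value type would typically be an Avro or JSON Schema payload registered in Schema Registry rather than a raw String.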

Required Skills:

These skills will help you succeed in this position:

  • Expert, hands-on production experience with Confluent Kafka, including capacity planning, installation, administration/platform management, and a deep understanding of Kafka architecture and internals.
  • Expert in Kafka cluster and application security.
  • Strong knowledge of Event-Driven Architecture (EDA).
  • Expert experience in data pipelines, data replication, and/or performance optimization.
  • Kafka installation and partitioning on OpenShift or Kubernetes, topic management, and HA/SLA architecture.
  • Strong knowledge and application of microservice design principles and best practices: distributed systems, bounded contexts, service-to-service integration patterns, resiliency, security, networking, and/or load balancing in large mission critical infrastructure.
  • Expert experience with Kafka Connect, Kafka Streams (KStreams), and KSQL, and the judgment to apply each effectively for different use cases (a brief Kafka Streams sketch follows this list).
  • Hands-on experience with scaling Kafka infrastructure, including Brokers, Connect, ZooKeeper, Schema Registry, and/or Control Center.
  • Hands-on experience in designing, writing, and operationalizing new Kafka Connectors.
  • Solid experience with data serialization using Avro and JSON, and with data compression techniques.
  • Experience with AWS services such as ECS, EKS, Flink, Amazon RDS for PostgreSQL, and/or S3.
  • Basic knowledge of relational databases (PostgreSQL, DB2, or Oracle), SQL,…
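
As a companion to the Kafka Streams (KStreams) item above, the sketch below shows a small topology that filters malformed records off an input topic and routes the remainder to a curated topic, for example so a Kafka Connect sink can load it into the data warehouse. It is illustrative only: it assumes the kafka-streams library, and the application id, broker address, and topic names are placeholders.

    // Minimal sketch only: assumes the kafka-streams library. Application id,
    // broker address, and topic names are hypothetical placeholders.
    package example.streaming;

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class CaseEventRouter {

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "case-event-router");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // environment-specific
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> events = builder.stream("agency.case-events");

            // Drop empty payloads, then publish the rest for downstream consumers
            // (for example, a Kafka Connect sink feeding the data warehouse).
            events.filter((key, value) -> value != null && !value.isEmpty())
                  .to("agency.case-events.curated");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
            streams.start();
        }
    }

A production version would typically use Avro serdes backed by Schema Registry and externalized configuration rather than hard-coded strings.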