
Kafka Architect

Job in Dauphin, Brandon, Manitoba, Canada
Listing for: W3Global
Full Time position
Listed on 2026-03-13
Job specializations:
  • Software Development
    Data Engineer
Job Description & How to Apply Below
Location: Dauphin

Lead/Architect-level role to help build out a Kafka platform, with a specific focus on Flink.
The candidate should be able to contribute to design, propose and evaluate solutions, and should also be hands-on with development and lead the Kafka development team.

Description
  • Expert‑level architecture and implementation experience of Flink applications on Confluent Platform, specifically for high‑volume, low‑latency stream processing.
  • Extensive experience architecting, implementing, and administering the Confluent Cloud Kafka and Flink platform in production environments.
  • Advanced proficiency in core Flink concepts including state management (Keyed/Operator State, RocksDB), Exactly-Once semantics, and configuring checkpointing and savepoints for fault tolerance.
  • Deep knowledge of Event Time processing, Watermarks (Bounded Out‑of‑Orderness), and complex Windowing (Tumbling, Sliding, Session) for accurate stream analytics.
  • Advanced knowledge of ksqlDB and Kafka Streams for rapid development of real-time stream processing/analytics alongside Flink.
  • Proven proficiency in Kafka Connectors (including Change Data Capture/CDC) from configuration to end‑to‑end integration in cloud environments.
  • Demonstrated experience applying Flink and Kafka in the Retail Industry for use cases such as real‑time inventory management, dynamic pricing, fraud detection, and personalized customer experience (e.g., clickstream analysis).
  • Strong background in platform governance: schema registry, RBAC, audit logging, retention, and compliance.
  • Deep expertise with Terraform and the Confluent Terraform provider; adherence to Infrastructure‑as‑Code (IaC) methodology and automation.
  • Practical experience designing and managing Harness CI/CD pipelines (or other similar tools) for automated deployment and configuration management of Flink jobs.
  • Advanced knowledge of GCP networking, including Private Service Connect (PSC), DNS, Firewalls, and enterprise security.
  • Track record in implementing cloud‑native monitoring and observability solutions; troubleshooting, Flink performance tuning, and incident response.
  • Thorough experience with Disaster Recovery (DR), High Availability (HA) strategies, backup/restore, and multi‑region design.
  • Practical experience with cost optimization, resource monitoring, and right‑sizing specifically for Flink and Kafka resources in Confluent Cloud.
  • Strong abilities in schema management, version compatibility, and data governance.
  • Demonstrated capability in capacity planning, partitioning, and scaling high‑throughput streaming architectures.
  • Experienced in Agile/DevOps methodologies.
  • Experience providing hands‑on production support for mission‑critical streaming platforms.
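To illustrate the event-time requirements above (watermarks with bounded out-of-orderness, tumbling windows), here is a minimal sketch in plain Python. This is an illustrative simulation of the semantics, not PyFlink or Flink API code; the window size, lateness bound, and function names are hypothetical.

```python
# Hypothetical simulation of event-time tumbling windows with a
# bounded-out-of-orderness watermark (the concepts Flink provides
# natively). Not Flink code; for illustration only.
from collections import defaultdict

WINDOW_SIZE = 10      # tumbling window length, in event-time units (assumed)
MAX_OUT_OF_ORDER = 3  # bounded out-of-orderness allowance (assumed)

def window_start(ts):
    """Assign an event timestamp to the start of its tumbling window."""
    return (ts // WINDOW_SIZE) * WINDOW_SIZE

def process(events):
    """Aggregate (timestamp, value) events into tumbling-window sums.

    The watermark trails the maximum seen timestamp by MAX_OUT_OF_ORDER.
    A window is emitted once the watermark passes its end; events that
    arrive at or behind the watermark are dropped as late.
    """
    windows = defaultdict(int)   # window start -> running sum
    results = {}
    watermark = float("-inf")
    for ts, value in events:
        if ts <= watermark:
            continue  # late event: the watermark already passed it
        windows[window_start(ts)] += value
        watermark = max(watermark, ts - MAX_OUT_OF_ORDER)
        # fire every window whose end is at or before the watermark
        for start in [s for s in windows if s + WINDOW_SIZE <= watermark]:
            results[start] = windows.pop(start)
    # end of stream: flush any windows still open
    for start, total in windows.items():
        results[start] = total
    return results

# The event at ts=3 arrives after the watermark reached 9, so it is dropped:
print(process([(1, 1), (4, 2), (12, 5), (3, 3), (15, 1)]))
# -> {0: 3, 10: 6}
```

In real Flink this corresponds to a `WatermarkStrategy.forBoundedOutOfOrderness` assigner combined with a tumbling event-time window; the sketch only mimics the firing and lateness behaviour.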