Kafka Schema Management with GitOps

1-day Hands-on Workshop

Upcoming Session: 18.02.2026 · Price: $30

About This Workshop

Schema Management is a critical aspect of Data Streaming solutions. It brings efficiency and predictability, and enables sophisticated integrations built on top of your data streams.

Many teams give up on Schemas after initial bad experiences, perceiving them as complex and as introducing significant overhead. This workshop is designed to challenge that perception.

It will give you a deep understanding of the topic, ensuring you're able to build a robust Schema Management setup that results in a smooth and intuitive development workflow - one that new engineers can grasp quickly and start using without friction.

Target Audience

  • Application Developers
  • Platform Engineers
  • Architects

What You'll Learn

Module 1: Introduction to Schema Management

You'll understand how Schema Management brings value in Kafka - and when it's too early to introduce it. You'll learn about the Confluent Schema Registry API, which most available solutions implement, and its core concepts: Subjects, Schema Evolution, Subject Compatibility Types, and more.
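To give a flavour of Schema Evolution: adding an optional field with a default is the classic backward-compatible change - consumers using the new schema version can still read messages written with the previous one. The record and field names below are hypothetical; only the shape matters:

```json
{
  "type": "record",
  "name": "UserCreated",
  "namespace": "com.example",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "email", "type": ["null", "string"], "default": null}
  ]
}
```

If `email` had no default, a consumer on the new schema could not decode older messages, and the Registry would reject the change under BACKWARD compatibility.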

Module 2: Serialization

In this module you'll learn about object serialization. We'll focus on Avro, but briefly introduce JSON Schema and Protobuf as well.

Hands-on exercises

You'll generate Java classes from Avro Schemas using Gradle and avro-tools, and serialize/deserialize Java objects using different Schemas.
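The core serialization round trip looks roughly like this. The sketch below uses Avro's `GenericRecord` API so it needs no generated classes (generated `SpecificRecord` classes follow the same pattern); the schema itself is hypothetical:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class AvroRoundTrip {

    // Hypothetical schema, for illustration only.
    public static final String SCHEMA_JSON = """
        {"type": "record", "name": "UserCreated", "fields": [
          {"name": "id", "type": "string"},
          {"name": "age", "type": "int"}]}""";

    // Avro binary encoding is compact because no schema is embedded --
    // the reader must obtain the writer's schema separately
    // (in Kafka, that's exactly the Schema Registry's job).
    public static byte[] serialize(GenericRecord record, Schema schema) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(record, encoder);
        encoder.flush();
        return out.toByteArray();
    }

    public static GenericRecord deserialize(byte[] bytes, Schema schema) throws IOException {
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(bytes, null);
        // Here the same schema serves as writer and reader schema;
        // during evolution the two may differ.
        return new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
    }

    public static void main(String[] args) throws IOException {
        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);
        GenericRecord record = new GenericData.Record(schema);
        record.put("id", "u-1");
        record.put("age", 30);
        GenericRecord back = deserialize(serialize(record, schema), schema);
        System.out.println(back.get("id") + " / " + back.get("age")); // prints: u-1 / 30
    }
}
```

In the exercises you'll do the same thing with generated classes, which trade the stringly-typed `put`/`get` calls for compile-time-checked fields.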

Module 3: Using Schemas in a Kafka Client

We'll cover configuration, error handling, and integration testing of a Spring Boot application that produces and consumes messages.

Hands-on exercises

You'll configure and run a Spring Boot application that produces and consumes Schema-backed Kafka messages against a dockerized Kafka environment.
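The wiring boils down to a handful of properties. A minimal sketch of an `application.yml` using Confluent's Avro (de)serializers - the hostnames and ports assume a typical local dockerized setup and will differ in your environment:

```yaml
spring:
  kafka:
    bootstrap-servers: localhost:9092   # assumed local broker address
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
    consumer:
      group-id: demo-app                # hypothetical consumer group
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
    properties:
      schema.registry.url: http://localhost:8081  # assumed local Registry address
      specific.avro.reader: true  # deserialize into generated classes, not GenericRecord
```

The serializers register/fetch schemas against the Registry automatically; most of the workshop's error-handling discussion revolves around what happens when that lookup or a compatibility check fails.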

Module 4: Schema Management with GitOps

In this module you'll learn how to manage Schemas with GitOps. We'll cover different GitOps tools and problems that may arise.

Hands-on exercises

We'll build a complete GitOps workflow.

You'll set up a dockerized Kafka environment and a dedicated schema repository where you declare Topics and Schemas and deploy them using Jikkou. The same repository builds a Schema library and publishes it to a Maven repository.

Your Spring Boot app will depend on this library, using its generated Java classes to publish and consume Kafka messages. You'll evolve the Schemas assigned to your Topics and migrate your app to the new Schema version.
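For orientation, declarative Topic and Schema resources in Jikkou look roughly like the sketch below. Treat it as illustrative only - the exact `apiVersion`, `kind`, and field names should be checked against the Jikkou documentation, and the topic/subject/file names are hypothetical:

```yaml
# Topic declaration (Jikkou Kafka extension)
apiVersion: "kafka.jikkou.io/v1beta2"
kind: KafkaTopic
metadata:
  name: user-events          # hypothetical topic name
spec:
  partitions: 3
  replicas: 1
  configs:
    cleanup.policy: delete
---
# Schema Registry subject declaration (Jikkou Schema Registry extension)
apiVersion: "schemaregistry.jikkou.io/v1beta2"
kind: SchemaRegistrySubject
metadata:
  name: user-events-value    # hypothetical subject name
spec:
  compatibilityLevel: BACKWARD
  schemaType: AVRO
  schema:
    $ref: ./schemas/UserCreated.avsc   # hypothetical schema file
```

Because these declarations live in Git, schema changes go through the same review, validation, and deployment pipeline as any other code change - which is the core of the GitOps workflow we'll build.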

Detailed workshop curriculum on GitHub →.

Open Workshop on 18.02.2026

Time: 9:00-16:00 CET

Price: $30 per participant

Max: 15 participants

Format: Remote

This is the pilot open edition of this workshop, intended to promote it and gather feedback.

Participants will be working locally, so a week before the course you'll get access to a repository where you can validate that your environment is ready.

Tech stack

  • Java 21
  • Docker
  • Gradle
  • Jikkou
  • Confluent Schema Registry Community
  • Avro

Prerequisites

Participants should have basic Java and Docker skills and a basic understanding of Kafka.

The coding exercises aren't complicated - they're designed to give you a high-level understanding and practical examples of the problems covered.

In the exercises, we'll focus on how the different components integrate rather than on advanced features of the individual technologies.