1-day Hands-on Workshop
Schema Management is a critical aspect of Data Streaming solutions. It provides efficiency, predictability, and enables sophisticated integrations built on top of your data streams.
Many teams give up on Schemas after an initial bad experience, perceiving them as complex and a source of significant overhead. This workshop is designed to challenge that perception.
It will give you a deep understanding of the topic, ensuring you're able to build a robust Schema Management setup that results in a smooth and intuitive development workflow - one that new engineers can grasp quickly and start using without friction.
You'll understand how Schema Management brings value to Kafka, and when it's too early to introduce it. You'll learn about the Confluent Schema Registry API, which most available solutions implement, and its core concepts: Subjects, Schema Evolution, Subject Compatibility Types, and more.
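To give a flavour of that API, here is a sketch of the core REST calls, assuming a Schema Registry running at localhost:8081; the subject name `orders-value` and the schema body are illustrative, not workshop materials. These commands need a live registry to run:

```shell
# Register a new schema version under the subject "orders-value"
curl -s -X POST \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schema": "{\"type\":\"record\",\"name\":\"Order\",\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}"}' \
  http://localhost:8081/subjects/orders-value/versions

# Set the compatibility type for that subject
curl -s -X PUT \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"compatibility": "BACKWARD"}' \
  http://localhost:8081/config/orders-value

# List all registered subjects
curl -s http://localhost:8081/subjects
```

Subjects are the unit the registry versions and checks compatibility against; the workshop covers how they map onto Topics.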
In this module you'll learn about object serialization. We'll focus on Avro, but briefly introduce JSON Schema and Protobuf as well.
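As a taste of what an Avro Schema looks like, here is a minimal record definition; the `Order` record and its fields are illustrative, not taken from the workshop materials:

```json
{
  "type": "record",
  "name": "Order",
  "namespace": "com.example.orders",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "note", "type": ["null", "string"], "default": null}
  ]
}
```

The union type with a `null` branch and a default is Avro's idiom for an optional field.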
You'll generate Java classes from Avro Schemas using Gradle and avro-tools and serialize/deserialize Java objects using different Schemas.
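Code generation can be wired into the build so `.avsc` files are compiled to Java classes automatically. A minimal `build.gradle` sketch using the community gradle-avro-plugin is shown below; the plugin id, version numbers, and layout are assumptions to verify against the plugin's documentation:

```groovy
// build.gradle — minimal sketch, versions are illustrative
plugins {
    id 'java'
    // community plugin that runs the Avro compiler as part of the build
    id 'com.github.davidmc24.gradle.plugin.avro' version '1.9.1'
}

repositories {
    mavenCentral()
}

dependencies {
    // runtime library the generated classes depend on
    implementation 'org.apache.avro:avro:1.11.3'
}

// .avsc files placed under src/main/avro are compiled into Java classes
// before compileJava runs.
```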
We'll cover configuration, error handling, and integration testing for a Spring Boot application that produces and consumes messages.
You'll configure and run a Spring Boot application connected to a dockerized Kafka environment, producing and consuming Kafka messages using Schemas.
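The wiring boils down to a handful of properties. A minimal `application.yml` sketch is below; the serializer classes come from Confluent's `kafka-avro-serializer` artifact, and the hostnames, ports, and group id are illustrative:

```yaml
# application.yml — minimal sketch, endpoints are illustrative
spring:
  kafka:
    bootstrap-servers: localhost:9092
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
    consumer:
      group-id: orders-app
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
      properties:
        # deserialize into the generated Java classes instead of GenericRecord
        specific.avro.reader: true
    properties:
      schema.registry.url: http://localhost:8081
```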
In this module you'll learn how to manage Schemas with GitOps. We'll cover different GitOps tools and problems that may arise.
We'll build a complete GitOps workflow.
You'll set up a dockerized Kafka environment and a dedicated schema repository where you declare Topics and Schemas and deploy them using Jikkou. That repository also builds a Schema library and publishes it to a Maven repository.
Your Spring Boot app will depend on this library, using the generated Java classes to publish and consume Kafka messages. You'll then evolve the Schemas assigned to your Topics and migrate your app to the new Schema version.
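To illustrate what such an evolution can look like, here is a hypothetical backward-compatible change to the earlier illustrative `Order` record: a new field is added with a default value, so consumers on the new Schema can still read records written with the old one:

```json
{
  "type": "record",
  "name": "Order",
  "namespace": "com.example.orders",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "currency", "type": "string", "default": "EUR"}
  ]
}
```

Under a `BACKWARD` compatibility setting the registry would accept this version, because Avro schema resolution fills the missing `currency` field from the default when reading old records.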
Detailed workshop curriculum on GitHub →.
This is the pilot open edition of this workshop. It's intended for promotional purposes and collecting feedback.
Participants will be working locally, so a week before the course you'll get access to a repository where you can validate that your environment is ready.
Participants should have basic Java and Docker skills and a basic understanding of Kafka.
The coding exercises aren't complicated - they are designed to give you a high-level understanding and practical examples of the covered problems.
In the exercises we'll focus on how the different components integrate rather than on advanced features of the technologies used.