
kubinio123/hands-on-kafka-streams


Read the blog

This code was created for the purpose of this blog post; I encourage you to read it first :)

Running the example

I assume that sbt is installed directly on your OS.

The provided docker-compose file starts up ZooKeeper, Kafka and Schema Registry instances, as well as a tools container with confluent-cli-tools installed. The scripts from the kafka-cli-scripts directory are mounted into the tools container.

Running kafka

Run the start-kafka.sh script. It brings up the Docker network and creates the topics defined in kafka-cli-scripts/create-topics.sh.
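
Once the script finishes, docker-compose ps should list the zookeeper, kafka, schema-registry and tools containers as running. For reference, here is a minimal sketch of the kind of command create-topics.sh runs from the tools container; driver-notifications is the only topic name confirmed by this README, and the port, partition and replication settings are conventional defaults, not taken from the script:

# illustrative only - see kafka-cli-scripts/create-topics.sh for the real topic list
kafka-topics --bootstrap-server kafka:9092 --create --if-not-exists \
  --topic driver-notifications --partitions 1 --replication-factor 1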

Running apps

The repository contains four sbt projects, each an independent app:

  • avro - generates and registers the Avro schemas for the data used as keys/values in the created topics
  • car-data-producer - produces random data to the Kafka topics
  • driver-notifier - a Kafka Streams application that aggregates data from several topics, processes it, and produces the results to driver-notifications
  • car-data-consumer - consumes data from any of the created Kafka topics (driver-notifications by default); see the console-consumer sketch after this list for an alternative
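
As an alternative to car-data-consumer, the output topic can also be inspected from the tools container with Confluent's Avro console consumer. A minimal sketch, assuming the conventional default ports for Kafka (9092) and Schema Registry (8081):

# reads Avro records from the output topic, decoding them via Schema Registry
kafka-avro-console-consumer --bootstrap-server kafka:9092 \
  --topic driver-notifications --from-beginning \
  --property schema.registry.url=http://schema-registry:8081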

Please also add these two entries to your /etc/hosts, since the apps above reach Kafka using the Docker hostnames:

# streams-app host entries
127.0.0.1 kafka
127.0.0.1 schema-registry

With sbt, each app can be started in a separate shell with the command sbt "project <name>" "run", e.g. sbt "project carDataProducer" "run".

Please run avro and carDataProducer first.
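
For example, in four separate shells (avro and carDataProducer appear above; the other two project IDs are assumed to follow the same camelCase naming):

sbt "project avro" "run"
sbt "project carDataProducer" "run"
sbt "project driverNotifier" "run"       # project ID assumed from the directory name
sbt "project carDataConsumer" "run"      # project ID assumed from the directory name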

If you are an IntelliJ user, the most convenient option is to run the apps from the IDE, since you can easily tweak, debug and experiment with the code.

Cleanup

Run the stop-kafka.sh script.
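
If anything is left running, the stack can also be torn down manually with docker-compose, which is presumably roughly what the script wraps:

docker-compose down        # stop and remove the containers and the network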

