Kafka-ui supports multiple ways to serialize/deserialize data.
Int32, Int64, UInt32, UInt64
Big-endian 4-byte (Int32/UInt32) and 8-byte (Int64/UInt64) representations of signed and unsigned integers.
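As a quick illustration of that byte layout (plain JDK code, not kafbat-ui internals), this is how an Int64 value maps to its 8-byte big-endian form:

import java.nio.ByteBuffer;

class Int64LayoutExample {
  public static void main(String[] args) {
    // ByteBuffer defaults to big-endian byte order, matching the Int64 layout described above
    byte[] payload = ByteBuffer.allocate(Long.BYTES).putLong(42L).array();
    // payload == {0, 0, 0, 0, 0, 0, 0, 42}; decoding reverses the mapping
    long decoded = ByteBuffer.wrap(payload).getLong();
    System.out.println(decoded); // 42
  }
}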
Base64
Base64 (RFC 4648) binary data representation. Useful when the actual content does not matter, but an exactly identical (byte-wise) key/value has to be sent.
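For instance, entering the value AQID (Base64 for the bytes 0x01 0x02 0x03) produces exactly those three bytes. In plain Java terms:

import java.util.Base64;

class Base64RoundTripExample {
  public static void main(String[] args) {
    // java.util.Base64's basic encoder/decoder implement the RFC 4648 alphabet
    byte[] raw = {0x01, 0x02, 0x03};
    String encoded = Base64.getEncoder().encodeToString(raw); // "AQID"
    byte[] decoded = Base64.getDecoder().decode(encoded);     // the same three bytes
    System.out.println(encoded);
  }
}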
Hex
Hexadecimal binary data representation. The byte delimiter and letter case are configurable.
ProtobufFile
Class name: io.kafbat.ui.serdes.builtin.ProtobufFileSerde
Sample configuration:
kafka:
  clusters:
    - name: Cluster1
      # Other Cluster configuration omitted ...
      serde:
        - name: ProtobufFile
          properties:
            # protobufFilesDir specifies the root location for proto files (scanned recursively)
            # NOTE: if 'protobufFilesDir' is specified, the 'protobufFile' and 'protobufFiles' settings are ignored
            protobufFilesDir: "/path/to/my-protobufs"
            # (DEPRECATED) protobufFile is the path to a single protobuf schema; please use 'protobufFilesDir' instead
            protobufFile: path/to/my.proto
            # (DEPRECATED) protobufFiles is the location of one or more protobuf schemas; please use 'protobufFilesDir' instead
            protobufFiles:
              - /path/to/my-protobufs/my.proto
              - /path/to/my-protobufs/another.proto
            # protobufMessageName is the default protobuf type used to deserialize
            # the message's VALUE if the topic is not found in protobufMessageNameByTopic
            # optional; if not set, the first type in the file is used as the default
            protobufMessageName: my.DefaultValType
            # default protobuf type used for KEY serialization/deserialization
            # optional
            protobufMessageNameForKey: my.Type1
            # mapping of topic names to protobuf types used for KEY serialization/deserialization
            # optional
            protobufMessageNameForKeyByTopic:
              topic1: my.KeyType1
              topic2: my.KeyType2
            # mapping of topic names to protobuf types used for VALUE serialization/deserialization
            # optional
            protobufMessageNameByTopic:
              topic1: my.Type1
              "topic.2": my.Type2
ProtobufRawDecoder
Deserialize-only serde. Decodes a protobuf payload without a predefined schema (similar to the protoc --decode_raw command).
SchemaRegistry
The SchemaRegistry serde is configured automatically if schema registry properties are set at the cluster level, but you can add additional SchemaRegistry-typed serdes that connect to other schema-registry instances.
Class name: io.kafbat.ui.serdes.builtin.sr.SchemaRegistrySerde
Sample configuration:
kafka:
  clusters:
    - name: Cluster1
      # this url will be used by "SchemaRegistry" by default
      schemaRegistry: http://main-schema-registry:8081
      serde:
        - name: AnotherSchemaRegistry
          className: io.kafbat.ui.serdes.builtin.sr.SchemaRegistrySerde
          properties:
            url: http://another-schema-registry:8081
            # auth properties, optional
            username: nameForAuth
            password: P@ssW0RdForAuth
        # and also add another SchemaRegistry serde
        - name: ThirdSchemaRegistry
          className: io.kafbat.ui.serdes.builtin.sr.SchemaRegistrySerde
          properties:
            url: http://another-yet-schema-registry:8081
Setting serdes for specific topics
You can specify a preferred serde for a topic's key/value. That serde will be selected by default in the UI on the topic's view/produce pages. To do so, set the topicKeysPattern/topicValuesPattern properties for the chosen serde; kafbat-ui picks the first serde whose pattern matches the topic name, as in the example below.
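For example, to make Int64 the default key serde for all *-events topics and SchemaRegistry the default value serde for two of them (topic names here are illustrative):

kafka:
  clusters:
    - name: Cluster1
      serde:
        - name: Int64
          topicKeysPattern: ".*-events"
        - name: SchemaRegistry
          topicValuesPattern: "click-events|imp-events"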
If the selected serde cannot be applied (an exception is thrown), the fallback serde (String with UTF-8 encoding) is used instead. Such messages are specially highlighted in the UI.
Custom pluggable serde registration
You can implement your own serde and register it in the kafbat-ui application. To do so:
Add the serde-api dependency (available via Maven Central)
Implement the io.kafbat.ui.serde.api.Serde interface. See the javadoc for implementation requirements; a sketch follows after this list.
Pack your serde into an uber jar, or provide a directory containing your serde's jar together with its dependency jars
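A minimal sketch of such an implementation, assuming the serde-api interfaces (Serde, PropertyResolver, SchemaDescription, DeserializeResult, the nested Serializer/Deserializer types, and the Target enum) look as in the current serde-api artifact; the method signatures here are assumptions from that API, so verify them against the javadoc before relying on this:

package com.example;

import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.Optional;

import io.kafbat.ui.serde.api.DeserializeResult;
import io.kafbat.ui.serde.api.PropertyResolver;
import io.kafbat.ui.serde.api.SchemaDescription;
import io.kafbat.ui.serde.api.Serde;

// Illustrative only: a pass-through serde that treats payloads as UTF-8 strings.
// All signatures are assumptions based on the serde-api javadoc; check your version.
public class PlainUtf8Serde implements Serde {

  @Override
  public void configure(PropertyResolver serdeProperties,
                        PropertyResolver clusterProperties,
                        PropertyResolver globalProperties) {
    // read custom properties declared under 'serde[].properties' here
  }

  @Override
  public Optional<String> getDescription() {
    return Optional.of("Plain UTF-8 pass-through serde (example)");
  }

  @Override
  public Optional<SchemaDescription> getSchema(String topic, Target type) {
    return Optional.empty(); // no schema to expose
  }

  @Override
  public boolean canDeserialize(String topic, Target type) {
    return true; // applicable to any topic, keys and values alike
  }

  @Override
  public boolean canSerialize(String topic, Target type) {
    return true;
  }

  @Override
  public Serializer serializer(String topic, Target type) {
    return input -> input.getBytes(StandardCharsets.UTF_8);
  }

  @Override
  public Deserializer deserializer(String topic, Target type) {
    return (headers, data) -> new DeserializeResult(
        new String(data, StandardCharsets.UTF_8),
        DeserializeResult.Type.STRING,
        Map.of());
  }

  // included in case the Serde interface extends Closeable in your serde-api version
  public void close() {
    // nothing to release in this example
  }
}

Once packaged, the serde is registered in the cluster configuration like the built-in ones. The filePath property below points kafbat-ui at the jar (or the jar directory); the class and paths are placeholders matching the sketch above:

kafka:
  clusters:
    - name: Cluster1
      serde:
        - name: MyCustomSerde
          className: com.example.PlainUtf8Serde
          filePath: /var/lib/kui-serde/plain-utf8-serde.jar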