7 unstable releases

0.5.2 Nov 30, 2024
0.5.1 Sep 2, 2024
0.5.0 Apr 24, 2024
0.3.0 Jul 11, 2023
0.1.0 Mar 2, 2023

#451 in Concurrency

Download history: 455–680 downloads/week (Aug–Nov 2024)

2,535 downloads per month
Used in 7 crates (via sea-streamer)

MIT/Apache

230KB
5K SLoC

sea-streamer-socket: Backend-agnostic Socket API

Akin to how SeaORM allows you to build applications for different databases, SeaStreamer allows you to build stream processors for different streaming servers.

While the sea-streamer-types crate provides a nice trait-based abstraction, this crate provides a concrete-type API, so that your program can stream from/to any SeaStreamer backend selected by the user at runtime.

This allows you to do neat things, like generating data locally and then streaming it to Redis / Kafka. Or, going the other way, sinking data from a server to work on it locally. All without recompiling the stream processor.

If you only ever work with one backend, feel free to depend on sea-streamer-redis / sea-streamer-kafka directly.
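For example, here is a minimal sketch of producing to a backend chosen purely by a URI at runtime. It uses the concrete Sea* types from sea-streamer-socket together with traits from sea-streamer-types; exact signatures may differ slightly between versions, so treat it as an illustration rather than a drop-in program:

use sea_streamer_socket::{SeaProducer, SeaStreamer};
use sea_streamer_types::{Producer, StreamKey, Streamer, StreamerUri};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // The backend is selected at runtime by the URI scheme alone,
    // e.g. "stdio://", "redis://localhost:6379" or "kafka://localhost:9092"
    let uri: StreamerUri = std::env::args().nth(1).expect("missing streamer URI").parse()?;

    let streamer = SeaStreamer::connect(uri, Default::default()).await?;
    let mut producer: SeaProducer = streamer
        .create_producer(StreamKey::new("clock")?, Default::default())
        .await?;

    producer.send(r#"{ "tick": 0 }"#)?; // send is non-blocking
    producer.flush().await?; // make sure the message is delivered before exiting

    Ok(())
}

The same binary can then target stdio, Redis or Kafka depending only on the argument passed to it.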

A small number of CLI programs are provided for demonstration. Let's set them up first:

# The `clock` program generates messages in the form of `{ "tick": N }`
alias clock='cargo run --package sea-streamer-stdio  --features=executables --bin clock'
# The `relay` program redirects messages from `input` to `output`
alias relay='cargo run --package sea-streamer-socket --features=executables,backend-kafka,backend-redis --bin relay'

Here is how to stream from Stdio ➡️ Redis / Kafka. We generate messages using clock and pipe them to relay, which then streams them to Redis / Kafka:

# Stdio -> Redis
clock -- --stream clock --interval 1s | \
relay -- --input stdio:///clock --output redis://localhost:6379/clock
# Stdio -> Kafka
clock -- --stream clock --interval 1s | \
relay -- --input stdio:///clock --output kafka://localhost:9092/clock
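Under the hood, a relay boils down to a consumer on the input backend feeding a producer on the output backend. Below is a condensed, hypothetical sketch of such a loop (argument parsing and error reporting omitted; names follow the sea-streamer-socket API, but minor details may vary by version):

use sea_streamer_socket::{SeaConsumer, SeaConsumerOptions, SeaMessage, SeaProducer, SeaStreamer};
use sea_streamer_types::{
    Buffer, Consumer, ConsumerMode, ConsumerOptions, Message, Producer, StreamKey, Streamer,
    StreamerUri,
};

// `input` and `output` stand in for the --input / --output URIs above
async fn relay(input: StreamerUri, output: StreamerUri, key: StreamKey) -> anyhow::Result<()> {
    // Consume from whichever backend the input URI points to...
    let source = SeaStreamer::connect(input, Default::default()).await?;
    let options = SeaConsumerOptions::new(ConsumerMode::RealTime);
    let consumer: SeaConsumer = source.create_consumer(&[key.clone()], options).await?;

    // ...and produce to whichever backend the output URI points to
    let sink = SeaStreamer::connect(output, Default::default()).await?;
    let producer: SeaProducer = sink.create_producer(key, Default::default()).await?;

    loop {
        let message: SeaMessage = consumer.next().await?;
        producer.send(message.message().as_str()?)?; // send is non-blocking
    }
}

The actual relay executable adds argument parsing and offset handling on top of a loop like this.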

Here is how to stream between Redis ↔️ Kafka:

# Redis -> Kafka
relay -- --input redis://localhost:6379/clock --output kafka://localhost:9092/clock
# Kafka -> Redis
relay -- --input kafka://localhost:9092/clock --output redis://localhost:6379/clock

Here is how to replay the stream from Kafka / Redis:

relay -- --input redis://localhost:6379/clock --output stdio:///clock --offset start
relay -- --input kafka://localhost:9092/clock --output stdio:///clock --offset start
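In code, replaying from the beginning corresponds to telling the consumer where to start reading before it is created. A hedged sketch, assuming the set_auto_stream_reset setter and the SeaStreamReset enum exposed by sea-streamer-socket:

use sea_streamer_socket::{SeaConsumerOptions, SeaStreamReset};
use sea_streamer_types::{ConsumerMode, ConsumerOptions};

// Rough equivalent of passing `--offset start` to the relay program
fn replay_from_start_options() -> SeaConsumerOptions {
    let mut options = SeaConsumerOptions::new(ConsumerMode::RealTime);
    options.set_auto_stream_reset(SeaStreamReset::Earliest);
    options
}

Pass these options to create_consumer as in the relay sketch above, and the consumer will read the stream from the earliest retained message instead of only new ones.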

Dependencies

~3–18MB
~247K SLoC