Lenses.io has been the leading provider of Apache 2-licensed Kafka Connectors (Stream Reactor) since 2016.
Lenses offers the leading Developer Experience solution for engineers building real-time applications on any Apache Kafka (lenses.io). Subscribed customers are entitled to full 24x7 support for selected Kafka Connectors, including prioritization of feature requests and security-incident SLAs. Email [email protected] for more information.
Speak to us on our Community Slack channel (Register at https://launchpass.com/lensesio) or ask the Community a question in our Ask Marios forum.
Under our standard support agreement, we provide assistance only for the current major version and the preceding major version. For instance, if version 8.x is the current major release, we support Lenses connector versions in the 8.x and 7.x series.
Enterprise-level support is available from version 7.0 onwards.
Lenses backports fixes rather than new features, and does so on a limited basis for select prior releases. For further clarification, please refer to our maintenance policy.
Should you reach out to our support team regarding issues encountered while using an unsupported version, we will direct you to this section of our policy page and encourage you to upgrade.
A series of next-generation Connectors is in active development. To give us feedback on which connectors we should work on, or to get the latest information, send us an email at [email protected].
A collection of components to build a real-time ingestion pipeline.
- Kafka 2.8 -> 3.5 (Confluent 6.2 -> 7.5) - Stream Reactor 4.1.0+
- Kafka 3.1 (Confluent 7.1) - Stream Reactor 4.0.0 (Kafka 3.1 Build)
- Kafka 2.8 (Confluent 6.2) - Stream Reactor 4.0.0 (Kafka 2.8 Build)
- Kafka 2.5 (Confluent 5.5) - Stream Reactor 2.0.0+
- Kafka 2.0 -> 2.4 (Confluent 5.4) - Stream Reactor 1.2.7
In the next major release, Elasticsearch 6 support will be removed, to be replaced with OpenSearch and Elasticsearch 8 support.
The following connectors have been deprecated and will no longer be included in future releases:
- Elasticsearch 6
- Kudu
- Hazelcast
- HBase
- Hive
- Pulsar
Please take a moment to read the documentation and make sure the software prerequisites are met.
Connector | Type | Description | Docs |
---|---|---|---|
AWS S3 | Sink | Copy data from Kafka to AWS S3. | Docs |
AWS S3 | Source | Copy data from AWS S3 to Kafka. | Docs |
Azure Data Lake (Beta) | Sink | Copy data from Kafka to Azure Data Lake. | Docs |
AzureDocumentDb | Sink | Copy data from Kafka to Azure DocumentDb. | Docs |
Cassandra | Source | Copy data from Cassandra to Kafka. | Docs |
*Cassandra | Sink | Certified DSE Cassandra; copy data from Kafka to Cassandra. | Docs |
Elastic 6 | Sink | Copy data from Kafka to Elasticsearch 6.x via TCP or HTTP. | Docs |
Elastic 7 | Sink | Copy data from Kafka to Elasticsearch 7.x via TCP or HTTP. | Docs |
FTP/HTTP | Source | Copy data from FTP/HTTP to Kafka. | Docs |
Google Cloud Storage (Beta) | Sink | Copy data from Kafka to Google Cloud Storage. | Docs |
Google Cloud Storage (Beta) | Source | Copy data from Google Cloud Storage to Kafka. | Docs |
HTTP (Beta) | Sink | Copy data from Kafka to HTTP. | Docs |
InfluxDb | Sink | Copy data from Kafka to InfluxDb. | Docs |
JMS | Source | Copy data from JMS topics/queues to Kafka. | Docs |
JMS | Sink | Copy data from Kafka to JMS. | Docs |
MongoDB | Sink | Copy data from Kafka to MongoDB. | Docs |
MQTT | Source | Copy data from MQTT to Kafka. | Docs |
MQTT | Sink | Copy data from Kafka to MQTT. | Docs |
Redis | Sink | Copy data from Kafka to Redis. | Docs |
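Each connector is configured like any other Kafka Connect plugin, with connector-specific properties and a KCQL statement describing the source-to-target mapping. The sketch below shows the general shape for an AWS S3 sink; the connector class, property names, bucket and topic are illustrative assumptions, so always check the Docs link for the connector you are deploying.

```properties
# Hypothetical sketch of an AWS S3 sink configuration (class, property names, bucket
# and topic are illustrative; confirm the exact settings in the connector documentation).
name=s3-sink-example
connector.class=io.lenses.streamreactor.connect.aws.s3.sink.S3SinkConnector
tasks.max=1
topics=orders
# KCQL mapping: write the 'orders' topic into the given bucket/prefix as JSON.
connect.s3.kcql=INSERT INTO my-bucket:orders SELECT * FROM orders STORED AS `JSON`
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```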
Please see the Stream Reactor Release Notes at Lenses Documentation.
To build:
sbt clean compile
To test:
sbt test
To create assemblies:
sbt assembly
To build a particular project:
sbt "project cassandra" compile
To test a particular project:
sbt "project cassandra" test
To create a jar of a particular project:
sbt "project cassandra" assembly
Before running the end-to-end tests below, build the connector archives if you have not already done so:
sbt "project cassandra" assembly
sbt "project elastic6" assembly
sbt "project mongodb" assembly
sbt "project redis" assembly
To run the tests:
sbt e2e:test
For a detailed explanation of the GitHub workflow, please see our GitHub Actions Workflow Guide.
We'd love to accept your contributions! Please use GitHub pull requests: fork the repo, develop and test your code, write semantic commit messages, and submit a pull request. Thanks!