🚁🚀 A real-time product recommendation system built on Flink. Flink computes product popularity and caches it in Redis, analyzes log data, and writes user-profile tags and real-time activity records to HBase. When a user requests recommendations, the popularity ranking is re-ordered according to the user's profile, the collaborative-filtering and tag-based recommendation modules attach related products to every item on the new list, and the resulting list is returned to the user.
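As a rough illustration of the popularity-counting step described above (a sketch, not code from that repository; the topic name, Redis key, and `ProductHeatJob` class are placeholders), a Flink DataStream job counting product views over a sliding window and writing the scores to a Redis sorted set could look like this:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
import org.apache.flink.streaming.api.windowing.assigners.SlidingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;
import redis.clients.jedis.Jedis;

public class ProductHeatJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Assumption: each Kafka record is a plain product id; the real log format differs.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("product-views")              // hypothetical topic name
                .setGroupId("heat-job")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "views")
                .map(id -> Tuple2.of(id, 1L))
                .returns(Types.TUPLE(Types.STRING, Types.LONG))
                .keyBy(t -> t.f0)
                // heat over the last 10 minutes, refreshed every minute
                .window(SlidingProcessingTimeWindows.of(Time.minutes(10), Time.minutes(1)))
                .sum(1)
                .addSink(new RichSinkFunction<Tuple2<String, Long>>() {
                    private transient Jedis jedis;
                    @Override public void open(Configuration parameters) {
                        jedis = new Jedis("localhost", 6379);
                    }
                    @Override public void invoke(Tuple2<String, Long> v, Context ctx) {
                        // a sorted set serves as the hot-items ranking
                        jedis.zadd("product:heat", v.f1, v.f0);
                    }
                    @Override public void close() {
                        if (jedis != null) jedis.close();
                    }
                });

        env.execute("product heat -> redis (sketch)");
    }
}
```

A downstream recommendation service would then read the top entries of the sorted set and re-rank them against the user's profile, as the description outlines.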
This repository focuses on helping readers understand Flink components at a glance. It contains hands-on Flink code and documentation covering 200 Flink tutorial topics, including Flink DataStream, Flink Table, Flink Window, Flink State, Flink Checkpoint, Flink Metrics, Flink Memory, Flink on standalone/YARN/Kubernetes, Flink SQL, Flink CEP, Flink CDC, Flink UDF, PyFlink, new Flink features, Flink Partition, and more. For details see: https://mp.weixin.qq.com/mp/appmsgalbum?__biz=Mzg5NDY3NzIwMA==&action=g…
Low-code tool for automating actions on real-time data | Stream processing for users.
Companion code for the book 《Kafka技术内幕》 (Kafka Internals).
💥 🚀 Wraps Spark Streaming to adjust the batch time dynamically (computation runs only when data arrives); 🚀 supports adding and removing topics while the job is running; 🚀 wraps Spark Streaming 1.6 with Kafka 0.10 to support SSL.
Apache Flink Guide
A Flink application that demonstrates reading from and writing to Apache Kafka with Apache Flink
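For orientation, a minimal sketch of such an application using Flink's KafkaSource and KafkaSink connectors (bootstrap servers, topic names, and the transformation are placeholders, not taken from that repository):

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KafkaRoundTripJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Read plain strings from an input topic.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("input-topic")
                .setGroupId("round-trip")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        // Write the transformed records back to another topic.
        KafkaSink<String> sink = KafkaSink.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setRecordSerializer(KafkaRecordSerializationSchema.builder()
                        .setTopic("output-topic")
                        .setValueSerializationSchema(new SimpleStringSchema())
                        .build())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-in")
                .map(String::toUpperCase)   // stand-in transformation
                .sinkTo(sink);

        env.execute("kafka round trip (sketch)");
    }
}
```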
A simple request-response cycle using WebSockets, an Eclipse Vert.x server, Apache Kafka, and Apache Flink.
A dynamic, real-time user-profiling system for an e-commerce platform with hundreds of millions of users across all client endpoints, built on Flink stream processing.
DeserializationSchema compatible with Confluent's KafkaAvroDecoder
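A hedged sketch of what such a schema can look like. Note that it wraps the newer KafkaAvroDeserializer rather than the legacy KafkaAvroDecoder mentioned above, and the class name and configuration are assumptions rather than that repository's code:

```java
import java.util.HashMap;
import java.util.Map;

import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;

/** Hypothetical sketch: expose Confluent's Avro deserializer as a Flink DeserializationSchema. */
public class ConfluentAvroDeserializationSchema implements DeserializationSchema<GenericRecord> {

    private final String schemaRegistryUrl;
    private transient KafkaAvroDeserializer inner;   // not serializable, so created lazily per task

    public ConfluentAvroDeserializationSchema(String schemaRegistryUrl) {
        this.schemaRegistryUrl = schemaRegistryUrl;
    }

    @Override
    public GenericRecord deserialize(byte[] message) {
        if (inner == null) {
            Map<String, Object> config = new HashMap<>();
            config.put("schema.registry.url", schemaRegistryUrl);
            inner = new KafkaAvroDeserializer();
            inner.configure(config, false);          // false = value (not key) deserializer
        }
        // The schema id is embedded in the payload, so the topic argument is only informational.
        return (GenericRecord) inner.deserialize("ignored-topic", message);
    }

    @Override
    public boolean isEndOfStream(GenericRecord nextElement) {
        return false;                                // unbounded stream
    }

    @Override
    public TypeInformation<GenericRecord> getProducedType() {
        // For production use, an Avro-aware TypeInformation bound to a concrete
        // schema is preferable to this generic fallback.
        return TypeInformation.of(GenericRecord.class);
    }
}
```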
Building a POC of a streaming pipeline with Flink, Kafka, and Pinot
This repository accompanies the article "Build a data ingestion pipeline using Kafka, Flink, and CrateDB" and the "CrateDB Community Day #2".
Process streaming data from Kafka with Flink and store it in Elasticsearch
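A minimal sketch of that kind of pipeline, assuming the Flink Elasticsearch 7 connector; the topic name, index name, and document shape are placeholders, not code from that repository:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.elasticsearch.sink.Elasticsearch7SinkBuilder;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.Requests;

public class KafkaToElasticsearchJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("events")                      // placeholder topic
                .setGroupId("es-loader")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-events")
                .sinkTo(new Elasticsearch7SinkBuilder<String>()
                        .setHosts(new HttpHost("localhost", 9200, "http"))
                        .setBulkFlushMaxActions(100)      // flush every 100 documents
                        .setEmitter((element, context, indexer) ->
                                indexer.add(toIndexRequest(element)))
                        .build());

        env.execute("kafka -> elasticsearch (sketch)");
    }

    private static IndexRequest toIndexRequest(String element) {
        Map<String, Object> doc = new HashMap<>();
        doc.put("raw", element);   // store the raw record; a real job would parse fields
        return Requests.indexRequest().index("events").source(doc);
    }
}
```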
Project comparing Apache Spark Streaming and Apache Flink.
Examples of Apache Flink® applications showcasing the DataStream API, Table API in Java and Python, and Flink SQL, featuring AWS, GitHub, Terraform, Streamlit, and Apache Iceberg.
This code helps build a real-time alert platform for IoT and similar use cases.
A basic example showing CEP in Flink with Kafka
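As a hypothetical minimal version of such an example, the job below uses Flink CEP over a Kafka stream to flag two consecutive failed logins for the same user; the `LoginEvent` type, record format, and topic name are assumptions:

```java
import java.util.List;
import java.util.Map;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternSelectFunction;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.time.Time;

public class LoginFailCepJob {

    /** Hypothetical event, parsed from "userId,SUCCESS|FAIL" lines on Kafka. */
    public static class LoginEvent {
        public String userId;
        public String status;
        public LoginEvent() {}
        public LoginEvent(String userId, String status) { this.userId = userId; this.status = status; }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("logins")                       // placeholder topic
                .setGroupId("cep-demo")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<LoginEvent> logins = env
                .fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-logins")
                .map(line -> {
                    String[] parts = line.split(",");
                    return new LoginEvent(parts[0], parts[1]);
                })
                .returns(LoginEvent.class);

        // Pattern: two consecutive FAIL events for the same user within 10 seconds.
        Pattern<LoginEvent, ?> twoFails = Pattern.<LoginEvent>begin("first")
                .where(new SimpleCondition<LoginEvent>() {
                    @Override public boolean filter(LoginEvent e) { return "FAIL".equals(e.status); }
                })
                .next("second")
                .where(new SimpleCondition<LoginEvent>() {
                    @Override public boolean filter(LoginEvent e) { return "FAIL".equals(e.status); }
                })
                .within(Time.seconds(10));

        CEP.pattern(logins.keyBy(e -> e.userId), twoFails)
                .select(new PatternSelectFunction<LoginEvent, String>() {
                    @Override public String select(Map<String, List<LoginEvent>> match) {
                        return "ALERT: repeated login failures for user " + match.get("first").get(0).userId;
                    }
                })
                .print();

        env.execute("flink cep over kafka (sketch)");
    }
}
```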
Stream processing of website click data using Kafka, monitored and visualised with Prometheus and Grafana