A simple and clear example of implementing a chatbot with Bedrock + LangChain + Streamlit. Learn the know-how and build whatever you want. Just install and run the code~ 🚀
pip install -r requirements.txt
streamlit run bedrock/bedrock_chatbot.py
Note: if you're going to use the web search function, add your SerpAPI key and AWS region to the bedrock/.env file~
Tip: you can modify the retry mode in your AWS profile config for model access, in case you hit rate limits:
[default]
max_attempts = 10
retry_mode = adaptive
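Alternatively, if you'd rather set the same retry behaviour in code than in the AWS profile, a minimal boto3 sketch (hypothetical client setup, not taken from this repo) could look like:

import boto3
from botocore.config import Config

# Same retry behaviour as the profile config above, set programmatically.
retry_config = Config(retries={"max_attempts": 10, "mode": "adaptive"})

# Bedrock runtime client with adaptive retries (the region here is an assumption).
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1", config=retry_config)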
We have significantly enhanced the RAG feature in our AI model. Now, it allows users to upload their own documents and index them either locally or on a server.
The uploaded documents are indexed and stored persistently on the chosen location. This means that the indexed documents can be reused anytime without needing to be re-uploaded or re-indexed.
Moreover, you can continually add more documents to the existing index, making the system increasingly robust and knowledgeable over time.
To use the enhanced RAG feature, select 'RAG' from the 'Options' dropdown in the chatbot interface, and follow the prompts to upload and index your documents.
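For illustration, here is a minimal sketch of reusing a persisted index (the index path and embedding model id below are assumptions, not necessarily what the app uses):

# Minimal sketch: reuse a previously persisted FAISS index without re-indexing.
from langchain_community.embeddings import BedrockEmbeddings
from langchain_community.vectorstores import FAISS

embeddings = BedrockEmbeddings(model_id="amazon.titan-embed-text-v1")

# Load the index saved earlier; newer langchain-community versions require
# allow_dangerous_deserialization=True for pickle-backed local indexes.
vectorstore = FAISS.load_local(
    "faiss_index", embeddings, allow_dangerous_deserialization=True
)

# Query the index...
docs = vectorstore.similarity_search("What do my documents say about pricing?", k=4)

# ...and keep growing it over time by adding new document chunks, then re-saving:
# vectorstore.add_documents(new_chunks)
# vectorstore.save_local("faiss_index")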
We have added a new feature that allows the AI model to pull in information from a large corpus of documents, providing more detailed and accurate responses. This feature uses the RAG technique, which retrieves relevant passages from your indexed documents and feeds them to the model as context for its generated answers.
To use the RAG feature, select 'RAG' from the 'Options' dropdown in the chatbot interface.
To use the RAG (Retrieval-Augmented Generation) feature, you need to index your documents using the bedrock_indexer.py script. This script creates a FAISS index from the documents in a directory.
Here's how to use it:
- Add your documents to the "documents" directory. These can be text files or other types of documents that you want the RAG model to use for information retrieval.
- Run the bedrock_indexer.py script:
python bedrock_indexer.py
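For reference, the indexing step boils down to something like the following sketch (the loader, chunk sizes and embedding model id are assumptions, not the script's exact implementation):

# Rough sketch of what an indexer like bedrock_indexer.py does.
from langchain_community.document_loaders import DirectoryLoader
from langchain_community.embeddings import BedrockEmbeddings
from langchain_community.vectorstores import FAISS
from langchain.text_splitter import RecursiveCharacterTextSplitter

# Load everything under the "documents" directory.
docs = DirectoryLoader("documents").load()

# Split into overlapping chunks so retrieval returns focused passages.
chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

# Embed the chunks with a Bedrock embedding model and build a FAISS index.
embeddings = BedrockEmbeddings(model_id="amazon.titan-embed-text-v1")
index = FAISS.from_documents(chunks, embeddings)

# Persist the index locally so it can be reused without re-indexing.
index.save_local("faiss_index")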
Add Web Search (via SerpAPI) and role prompt option! 🎉🎉🎉
Thanks @madtank for adding PDF/CSV/PY file upload feature! 🎉🎉🎉
1. Add Dockerfile for container environment and remove the package installation steps! 🎉🎉🎉
You can build your own image, and I've also uploaded a public container image at public.ecr.aws for you~
docker run -d -v $HOME/.aws/config:/root/.aws/config:ro -p 8502:8501 public.ecr.aws/shtian/bedrock-claude-3-langchain-streamlit:latest
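The command above mounts your local AWS config into the container read-only and maps the container's port 8501 to 8502 on the host, so the chatbot is available at http://localhost:8502.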
2. NEW! Mistral Large on Bedrock Supported! 🎉🎉🎉
NEW! Claude 3 Haiku on Bedrock Supported! Let's Go Faster! 🎉🎉🎉
Install via the command:
pip install -r requirements.txt
NEW! Claude 3 Sonnet on Bedrock Supported! New Message API Plus Vision Multimodal Chat! 🎉🎉🎉
Add system prompt option.
NEW! Claude 3 Sonnet on Bedrock Supported! New Message API Plus Vision Multimodal Chat! 🎉🎉🎉
Install langchain from source, for new Bedrock API support.
Note: No need to hack the Bedrock code! Just change the langchain_messages state of Streamlit in the app code. This code was completed with the help of Claude itself :)
git clone https://github.com/langchain-ai/langchain.git
pip install -e langchain/libs/langchain
Then run the command:
streamlit run bedrock_chatbot_claude_3_sonnet_vision.py
Note: Some of the details covered - smooth history catch-up with the new Message API, support for multiple images in one chat, images kept as thumbnails on one line, mixed multimodal and text-only chat, no bump after rerun and re-initialization, and fixes for lots of format mismatches...
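To illustrate the note above, here is a hedged sketch (not the repo's exact code) of appending a multimodal turn to the langchain_messages session state key that StreamlitChatMessageHistory uses by default:

# Sketch: keep multimodal turns in Streamlit's session state instead of
# patching the Bedrock code, so the chat model sees a well-formed history.
import streamlit as st
from langchain_core.messages import HumanMessage

image_b64 = "..."  # base64-encoded image from st.file_uploader (placeholder)

multimodal_turn = HumanMessage(
    content=[
        {"type": "text", "text": "What is in this image?"},
        {"type": "image_url", "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
    ]
)

# Append directly to the chat history state that StreamlitChatMessageHistory reads.
st.session_state.setdefault("langchain_messages", [])
st.session_state["langchain_messages"].append(multimodal_turn)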
NEW! Claude 3 Sonnet on Bedrock Supported~ Message API Plus Vision Multimodal! 🎉🎉🎉
Extra action needed (for now) - install langchain from source.
Note: A little hack is needed for the Streamlit conversation history format mismatch, plus a modification to the langchain community Bedrock source code - no impact on BedrockChat invocation ~
git clone https://github.com/davidshtian/langchain.git
pip install -e langchain/libs/langchain
Then run the command:
streamlit run bedrock_chatbot_claude_3_sonnet_vision.py
NEW! Claude 3 Sonnet on Bedrock Supported~ Message API
Extra action needed (for now) - install langchain from source:
git clone https://github.com/langchain-ai/langchain.git
pip install -e langchain/libs/langchain
Note: Only text supported now, vision later!
Then run the command:
streamlit run bedrock_chatbot_claude_3_sonnet.py
The bot is equipped with chat history using ConversationBufferWindowMemory and StreamlitChatMessageHistory, and provides both simple (batch) and streaming modes (see the wiring sketch at the end of this section). Demo shown as below:
Bedrock_Chat_Fast.mp4
Streaming mode demo shown as below:
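For reference, here is a minimal sketch of the chat history wiring described above (the model id, window size and other parameters are assumptions, not the repo's exact values):

# Sketch: windowed chat memory backed by Streamlit session state.
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferWindowMemory
from langchain_community.chat_message_histories import StreamlitChatMessageHistory
from langchain_community.chat_models import BedrockChat

# Chat history kept in Streamlit session state, so it survives reruns.
history = StreamlitChatMessageHistory(key="langchain_messages")

# Sliding window over the last k exchanges.
memory = ConversationBufferWindowMemory(k=10, chat_memory=history)

# Claude 3 Sonnet via Bedrock; set streaming=False for simple (batch) mode.
llm = BedrockChat(model_id="anthropic.claude-3-sonnet-20240229-v1:0", streaming=True)

chain = ConversationChain(llm=llm, memory=memory)
print(chain.predict(input="Hello!"))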