A self-hosted web UI for 30+ generative AI models
Set up and run a local LLM and chatbot using consumer-grade hardware.
Gradio-based tool to run open-source LLM models directly from Hugging Face
An open-source, Gradio-based chatbot app that combines the best of retrieval-augmented generation and prompt engineering into an intelligent assistant for modern professionals.
Local character AI chatbot with a Chroma vector store for memory, plus scripts to process documents for Chroma
Tool to test different large language models without writing code.
AgentX is an open-source library that helps people run LLMs on their own computers, or serve LLMs as easily as possible, with support for multiple backends such as PyTorch, llama.cpp, Ollama, and EasyDeL
A quick, optimized solution for managing Llama-based GGUF quantized models: download GGUF files, retrieve message formatting, add more models from Hugging Face repos, and more. It's easy to use and comes prepacked with well-regarded, preconfigured open-source models: Dolphin Phi-2 2.7B, Mistral 7B v0.2, Mixtral 8x7B v0.1, SOLAR 10.7B, and Zephyr 3B
A financial chatbot powered by an LLM and retrieval-augmented generation.
Simple chat interface for local AI using llama-cpp-python and llama-cpp-agent
TAO71 I4.0 is an AI created by TAO71 in C# and Python.
Unofficial Gradio repo for the ICML 2024 paper "Executable Code Actions Elicit Better LLM Agents" by Xingyao Wang, Yangyi Chen, Lifan Yuan, Yizhe Zhang, Yunzhu Li, Hao Peng, and Heng Ji.
YouTube API implementation with Meta's Llama 2 to analyze comments and sentiment
Experimental interface environment for open-source LLMs, designed to democratize the use of AI. Powered by llama.cpp, llama-cpp-python, and Gradio.
A Genshin Impact question-answering project powered by Qwen1.5-14B-Chat
This is the UI repository for the SeKernel_for_LLM module
Genshin Impact character chat models, LoRA-tuned on an LLM
A simple web interface for GGUF-format LLMs run with llama-cpp-python (llama.cpp).
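
Several of the projects above follow the same basic pattern: load a GGUF model with llama-cpp-python and expose it through a Gradio chat UI. The snippet below is a minimal, illustrative sketch of that pattern, not code from any listed repo. The Hugging Face repo_id, filename, and generation parameters are placeholders; it assumes a recent llama-cpp-python release that provides Llama.from_pretrained and a Gradio version whose ChatInterface passes history as (user, assistant) pairs.

    # Minimal sketch: GGUF model from Hugging Face + Gradio chat UI (placeholders throughout)
    import gradio as gr
    from llama_cpp import Llama

    # Download a GGUF file from a Hugging Face repo and load it locally.
    llm = Llama.from_pretrained(
        repo_id="TheBloke/Mistral-7B-Instruct-v0.2-GGUF",  # placeholder model repo
        filename="*Q4_K_M.gguf",                           # pick one quantization
        n_ctx=4096,
    )

    def respond(message, history):
        # Rebuild the chat history in the OpenAI-style message format
        # expected by create_chat_completion.
        messages = []
        for user_msg, bot_msg in history:
            messages.append({"role": "user", "content": user_msg})
            messages.append({"role": "assistant", "content": bot_msg})
        messages.append({"role": "user", "content": message})
        out = llm.create_chat_completion(messages=messages, max_tokens=512)
        return out["choices"][0]["message"]["content"]

    # Serve a simple chat web interface on localhost.
    gr.ChatInterface(respond).launch()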