- Run LLMs on your laptop, entirely offline
- Chat with your local documents (new in 0.3)
- Use models through the in-app Chat UI or an OpenAI-compatible local server (see the sketch below)
- Download any compatible model files from Hugging Face 🤗 repositories
- Discover new & noteworthy LLMs right inside the app's Discover page

LM Studio supports any GGUF-format model (Llama, Mistral, Phi, Gemma, StarCoder, etc.)
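
Since the local server speaks the OpenAI API, any OpenAI client library can talk to it. Below is a minimal sketch using the official `openai` Python package, assuming the server is running on its default address of `http://localhost:1234/v1`; the model identifier shown is a placeholder and should be replaced with whichever model you have loaded in LM Studio.

```python
# Minimal sketch: chat with a locally loaded model via LM Studio's
# OpenAI-compatible server. Assumes the server is started in the app
# and listening on the default address below.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # assumed default LM Studio server address
    api_key="lm-studio",                  # the local server does not check this value
)

response = client.chat.completions.create(
    model="your-local-model",  # placeholder: use the identifier of the model you loaded
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what GGUF is in one sentence."},
    ],
)

print(response.choices[0].message.content)
```

Because the request shape matches the OpenAI Chat Completions API, existing tooling built against that API can typically be pointed at the local server by changing only the base URL.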
