# AI Providers Setup
This section provides detailed configuration guides to help you set up various AI providers and their respective APIs and credentials in LibreChat.
## Endpoints Configuration
The term “Endpoints” refers to the AI provider, configuration, or API that you need to set up and integrate with LibreChat. Each endpoint has its own configuration process, which may involve obtaining API keys, credentials, or following specific setup instructions.
The following guides are available to help you configure different endpoints:
- AWS Bedrock: Set up the AWS Bedrock integration
- Anthropic: Integrate Anthropic AI models
- OpenAI: Set up the OpenAI API integration
- Google: Configure Google AI services
- Assistants: Enable and configure OpenAI's Assistants
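As a concrete illustration, most of these endpoints are enabled by supplying credentials through environment variables in LibreChat's `.env` file. The variable names below are a sketch based on LibreChat's common conventions; consult each provider's guide for the exact names and any additional settings required.

```sh
# .env — illustrative credential entries (names may vary per guide)
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
GOOGLE_KEY=your-google-ai-key
```

After editing `.env`, restart LibreChat so the new credentials are picked up.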
## Custom Endpoint Configuration
The librechat.yaml Configuration Guides provide detailed instructions on how to configure custom endpoints within LibreChat.
In addition to the pre-configured endpoints, the `librechat.yaml` config file allows you to add and configure custom endpoints. This includes integrating with AI providers such as Ollama, Mistral AI, OpenRouter, and many other third-party services that expose an OpenAI-compatible API.
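A custom endpoint entry in `librechat.yaml` might look like the sketch below. The shape (`endpoints.custom` with `name`, `apiKey`, `baseURL`, and `models`) follows LibreChat's custom-endpoint schema, but the URL and model name here are illustrative placeholders; check the configuration guide for the authoritative schema and the values your provider requires.

```yaml
# librechat.yaml — illustrative custom endpoint (values are placeholders)
version: 1.0.0
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"                      # some providers ignore this, but a value is expected
      baseURL: "http://localhost:11434/v1"  # OpenAI-compatible API base URL
      models:
        default: ["llama3"]                 # models offered in the UI
        fetch: true                         # try to fetch the model list from the provider
```

The same pattern applies to other OpenAI-compatible providers: change `name`, `baseURL`, and the credentials, and list the models you want exposed.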
By following these configuration guides, you can seamlessly integrate various AI providers, unlock their capabilities, and enhance your LibreChat experience with the power of multiple AI models and services.