LLM Integration
Integrate any LLM with mcp_use through LangChain
LLM Integration Guide
mcp_use supports integration with any Large Language Model (LLM) that is compatible with LangChain. This guide covers how to use different LLM providers with mcp_use and highlights that any LangChain-supported model can be used.
Universal LLM Support
mcp_use leverages LangChain’s architecture to support any LLM that implements the LangChain interface. This means you can use virtually any model from any provider, including:
- OpenAI models (GPT-4, GPT-3.5, etc.)
- Anthropic models (Claude)
- Google models (Gemini)
- Mistral models
- Groq models
- Llama models
- Cohere models
- Open source models (via LlamaCpp, HuggingFace, etc.)
- Custom or self-hosted models
- Any other model with a LangChain integration
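Because the agent only depends on the LangChain chat-model interface, switching providers is a one-line change. The sketch below assumes mcp_use exposes `MCPAgent` and `MCPClient` as in its README, and uses the `langchain-openai` and `langchain-anthropic` packages; verify names and parameters against the current project docs before relying on them.

```python
# Sketch: swapping LLM providers behind the same MCPAgent.
# Assumes mcp_use's MCPAgent / MCPClient API and the langchain-openai /
# langchain-anthropic integration packages; an mcp_config.json with your
# MCP server definitions is assumed to exist alongside this script.
import asyncio

from langchain_openai import ChatOpenAI
from langchain_anthropic import ChatAnthropic
from mcp_use import MCPAgent, MCPClient


async def main() -> None:
    # Load MCP server definitions from a config file (hypothetical path).
    client = MCPClient.from_config_file("mcp_config.json")

    # Any LangChain chat model can be dropped in here unchanged:
    llm = ChatOpenAI(model="gpt-4o")
    # llm = ChatAnthropic(model="claude-3-5-sonnet-latest")

    # The agent wires the chosen LLM to the MCP tools exposed by the client.
    agent = MCPAgent(llm=llm, client=client)
    result = await agent.run("List the files in the current directory")
    print(result)


if __name__ == "__main__":
    asyncio.run(main())
```

Running this requires the relevant provider API key in your environment (e.g. `OPENAI_API_KEY` or `ANTHROPIC_API_KEY`); only the `llm = ...` line changes when you swap providers.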
For the full list of chat-model integrations, see https://python.langchain.com/docs/integrations/chat/