# LangChain
This documentation describes the integration of MindsDB with LangChain, a framework for developing applications powered by language models. The integration allows for the deployment of LangChain models within MindsDB, providing the models with access to data from various data sources.
## Prerequisites
Before proceeding, ensure the following prerequisites are met:
- Install MindsDB locally via Docker or Docker Desktop.
- To use LangChain within MindsDB, install the required dependencies by following these instructions.
- Obtain the API key for a selected model provider that you want to use through LangChain.
Available models include the following:
- OpenAI (how to get the API key)
- Anthropic (how to get the API key)
- Anyscale (how to get the API key)
- Google (how to get the API key)
- Ollama (how to download Ollama)
- LiteLLM (use the API key of the model used via LiteLLM)
- MindsDB (use any model created within MindsDB)
## Setup
Create an AI engine from the LangChain handler.
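A minimal sketch of creating the engine, assuming the OpenAI provider; the `openai_api_key` parameter name is an assumption based on MindsDB's convention of provider-prefixed key parameters:

```sql
-- Create an AI engine from the LangChain handler.
-- The key parameter depends on your chosen provider (OpenAI assumed here).
CREATE ML_ENGINE langchain_engine
FROM langchain
USING
      openai_api_key = 'your-openai-api-key';
```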
Create a model using `langchain_engine` as an engine and a selected model provider.
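For example, a model backed by an OpenAI chat model might look like the following sketch; the `provider`, `model_name`, and `prompt_template` parameters are assumptions based on common MindsDB handler conventions:

```sql
CREATE MODEL langchain_model
PREDICT answer
USING
      engine = 'langchain_engine',   -- the engine created in the previous step
      provider = 'openai',           -- selected model provider (assumption)
      model_name = 'gpt-4',          -- provider-specific model name (assumption)
      prompt_template = 'Answer the user question: {{question}}';
```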
This handler supports tracing features for LangChain via LangFuse. To use it, provide the following parameters in the `USING` clause: `langfuse_host`, `langfuse_public_key`, `langfuse_secret_key`.
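A hedged sketch of passing the LangFuse tracing parameters alongside the other model options; the host and key values below are placeholders:

```sql
CREATE MODEL langchain_traced_model
PREDICT answer
USING
      engine = 'langchain_engine',
      prompt_template = 'Answer the user question: {{question}}',
      -- LangFuse tracing parameters (placeholder values):
      langfuse_host = 'https://cloud.langfuse.com',
      langfuse_public_key = 'your-langfuse-public-key',
      langfuse_secret_key = 'your-langfuse-secret-key';
```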
There are three different tools utilized by this agent:
- MindsDB is the internal MindsDB executor.
- Metadata fetches the metadata information for the available tables.
- Write is able to write agent responses into a MindsDB data source.
Each tool exposes the internal MindsDB executor in a different way to perform its tasks, effectively enabling the agent model to read from (and potentially write to) data sources or models available in the active MindsDB project.
Create a conversational model using `langchain_engine` as an engine and a selected model provider.
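A sketch of a conversational model, assuming the handler follows MindsDB's conversational-mode conventions; the `mode`, `user_column`, and `assistant_column` parameters are assumptions, not confirmed handler options:

```sql
CREATE MODEL conversational_langchain_model
PREDICT answer
USING
      engine = 'langchain_engine',
      provider = 'openai',          -- selected model provider (assumption)
      model_name = 'gpt-4',         -- provider-specific model name (assumption)
      mode = 'conversational',      -- assumption: enables chat-history handling
      user_column = 'question',     -- assumption: column holding user input
      assistant_column = 'answer',  -- assumption: column holding model output
      prompt_template = 'Answer the user question in a helpful way';
```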
## Usage
The following usage examples utilize `langchain_engine` to create a model with the `CREATE MODEL` statement.
Create a model that will be used to ask questions.
Ask questions.
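The two steps above might look like the following sketch; the model name, column names, and sample question are illustrative:

```sql
-- Step 1: create a model to be used for asking questions.
CREATE MODEL question_model
PREDICT answer
USING
      engine = 'langchain_engine',
      provider = 'openai',          -- selected model provider (assumption)
      model_name = 'gpt-4',
      prompt_template = 'Answer the user question: {{question}}';

-- Step 2: ask a question by selecting from the model.
SELECT question, answer
FROM question_model
WHERE question = 'How many tables are available in this project?';
```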
## Next Steps
Go to the Use Cases section to see more examples.