This package contains the official LangChain integration with Couchbase.
The documentation and API Reference can be found on GitHub Pages.
## Installation

```bash
pip install -U langchain-couchbase
```

## Vector Stores

### CouchbaseQueryVectorStore

The `CouchbaseQueryVectorStore` class enables the usage of Couchbase for Vector Search using the Query and Indexing Service. It implements two different types of vector indices: Composite Vector Indexes and Hyperscale (BHIVE) Vector Indexes.
Note: CouchbaseQueryVectorStore requires Couchbase Server version 8.0 and above.
To use this in an application:
```python
import getpass

# Constants for the connection
COUCHBASE_CONNECTION_STRING = getpass.getpass(
    "Enter the connection string for the Couchbase cluster: "
)
DB_USERNAME = getpass.getpass("Enter the username for the Couchbase cluster: ")
DB_PASSWORD = getpass.getpass("Enter the password for the Couchbase cluster: ")

# Create Couchbase connection object
from datetime import timedelta

from couchbase.auth import PasswordAuthenticator
from couchbase.cluster import Cluster
from couchbase.options import ClusterOptions

auth = PasswordAuthenticator(DB_USERNAME, DB_PASSWORD)
options = ClusterOptions(auth)
cluster = Cluster(COUCHBASE_CONNECTION_STRING, options)

# Wait until the cluster is ready for use.
cluster.wait_until_ready(timedelta(seconds=5))

from langchain_couchbase import CouchbaseQueryVectorStore
from langchain_couchbase.vectorstores import DistanceStrategy

# BUCKET_NAME, SCOPE_NAME and COLLECTION_NAME identify an existing collection,
# and my_embeddings is a LangChain embeddings object defined elsewhere
vector_store = CouchbaseQueryVectorStore(
    cluster=cluster,
    bucket_name=BUCKET_NAME,
    scope_name=SCOPE_NAME,
    collection_name=COLLECTION_NAME,
    embedding=my_embeddings,
    distance_metric=DistanceStrategy.DOT,
)
```

See a usage example.
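For instance, a minimal sketch of adding documents and querying them, using the standard LangChain vector store methods (the sample texts and query are illustrative):

```python
# Add a few documents to the vector store
vector_store.add_texts(
    [
        "Couchbase is a distributed NoSQL database",
        "LangChain is a framework for building LLM applications",
    ]
)

# Retrieve the documents most similar to a query
results = vector_store.similarity_search("What is Couchbase?", k=2)
for doc in results:
    print(doc.page_content)
```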
### CouchbaseSearchVectorStore

The `CouchbaseSearchVectorStore` class enables the usage of Couchbase for Vector Search using the Search Service.
Note: CouchbaseSearchVectorStore requires Couchbase Server version 7.6 and above.
```python
from langchain_couchbase import CouchbaseSearchVectorStore
```

To use this in an application:
```python
import getpass

# Constants for the connection
COUCHBASE_CONNECTION_STRING = getpass.getpass(
    "Enter the connection string for the Couchbase cluster: "
)
DB_USERNAME = getpass.getpass("Enter the username for the Couchbase cluster: ")
DB_PASSWORD = getpass.getpass("Enter the password for the Couchbase cluster: ")

# Create Couchbase connection object
from datetime import timedelta

from couchbase.auth import PasswordAuthenticator
from couchbase.cluster import Cluster
from couchbase.options import ClusterOptions

auth = PasswordAuthenticator(DB_USERNAME, DB_PASSWORD)
options = ClusterOptions(auth)
cluster = Cluster(COUCHBASE_CONNECTION_STRING, options)

# Wait until the cluster is ready for use.
cluster.wait_until_ready(timedelta(seconds=5))

from langchain_couchbase import CouchbaseSearchVectorStore

# BUCKET_NAME, SCOPE_NAME and COLLECTION_NAME identify an existing collection;
# my_embeddings and the SEARCH_INDEX_NAME Search Index are defined elsewhere
vector_store = CouchbaseSearchVectorStore(
    cluster=cluster,
    bucket_name=BUCKET_NAME,
    scope_name=SCOPE_NAME,
    collection_name=COLLECTION_NAME,
    embedding=my_embeddings,
    index_name=SEARCH_INDEX_NAME,
)
```

See a usage example.
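As with the query-based store, documents can then be added and searched with the standard LangChain vector store methods. A minimal sketch (the text and query are illustrative):

```python
# Add a document, then search against the configured Search index
vector_store.add_texts(["Couchbase Search supports vector and hybrid queries"])

# similarity_search_with_score also returns the relevance score for each hit
results = vector_store.similarity_search_with_score("vector search", k=1)
for doc, score in results:
    print(score, doc.page_content)
```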
## LLM Caches

### CouchbaseCache

Use Couchbase as a cache for prompts and responses.
See a usage example.
To import this cache:
```python
from langchain_couchbase.cache import CouchbaseCache
```

To use this cache with your LLMs:
```python
from langchain_core.globals import set_llm_cache

# couchbase_cluster_connection_object is the cluster connection created earlier
cluster = couchbase_cluster_connection_object

set_llm_cache(
    CouchbaseCache(
        cluster=cluster,
        bucket_name=BUCKET_NAME,
        scope_name=SCOPE_NAME,
        collection_name=COLLECTION_NAME,
    )
)
```
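Once the cache is set, repeated identical prompts are served from Couchbase instead of calling the model again. A minimal sketch, assuming an OpenAI chat model (any LangChain chat model works; the model name is illustrative):

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")

llm.invoke("Tell me a joke")  # first call goes to the model and the response is cached
llm.invoke("Tell me a joke")  # the identical prompt is answered from the cache
```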
### CouchbaseSemanticCache

Semantic caching allows users to retrieve cached prompts based on the semantic similarity between the user input and previously cached inputs. Under the hood, it uses Couchbase as both a cache and a vectorstore. The `CouchbaseSemanticCache` needs a Search Index defined to work. Please look at the usage example on how to set up the index.
See a usage example.
To import this cache:
```python
from langchain_couchbase.cache import CouchbaseSemanticCache
```

To use this cache with your LLMs:
```python
from langchain_core.globals import set_llm_cache

# use any embedding provider...
from langchain_openai import OpenAIEmbeddings

embeddings = OpenAIEmbeddings()

# couchbase_cluster_connection_object is the cluster connection created earlier
cluster = couchbase_cluster_connection_object

set_llm_cache(
    CouchbaseSemanticCache(
        cluster=cluster,
        embedding=embeddings,
        bucket_name=BUCKET_NAME,
        scope_name=SCOPE_NAME,
        collection_name=COLLECTION_NAME,
        index_name=INDEX_NAME,
    )
)
```
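With the semantic cache in place, a prompt that is semantically close to a previously cached one can be answered from the cache. A minimal sketch, assuming an OpenAI chat model and the Search Index described above; whether the second call is a cache hit depends on the configured similarity threshold:

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")

llm.invoke("What is the capital of France?")      # cache miss: calls the model
llm.invoke("Tell me the capital city of France")  # semantically similar: may be served from the cache
```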
## Chat Message History

Use Couchbase as the storage for your chat messages.
See a usage example.
To use the chat message history in your applications:
```python
from langchain_couchbase.chat_message_histories import CouchbaseChatMessageHistory

# couchbase_cluster_connection_object is the cluster connection created earlier
cluster = couchbase_cluster_connection_object

message_history = CouchbaseChatMessageHistory(
    cluster=cluster,
    bucket_name=BUCKET_NAME,
    scope_name=SCOPE_NAME,
    collection_name=COLLECTION_NAME,
    session_id="test-session",
)

message_history.add_user_message("hi!")
```
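Messages stored for the session can be read back later as LangChain message objects, for example:

```python
message_history.add_ai_message("Hello! How can I help you?")

# Iterate over the stored messages for this session
for message in message_history.messages:
    print(type(message).__name__, ":", message.content)
```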
## Documentation

To generate the documentation locally, follow these steps:
- Ensure you have the project installed in your environment:

  ```bash
  pip install -e .  # Install in development mode
  ```

- Install the required documentation dependencies:

  ```bash
  pip install sphinx sphinx-rtd-theme tomli
  ```

- Navigate to the docs directory:

  ```bash
  cd docs
  ```

- Ensure the build directory exists:

  ```bash
  mkdir -p source/build
  ```

- Build the HTML documentation:

  ```bash
  make html
  ```

- The generated documentation will be available in the `docs/build/html` directory. You can open `index.html` in your browser to view it:

  ```bash
  # On macOS
  open build/html/index.html

  # On Linux
  xdg-open build/html/index.html

  # On Windows
  start build/html/index.html
  ```

- To clean the build directory before rebuilding:

  ```bash
  make clean html
  ```

- To check for broken links in the documentation:

  ```bash
  make linkcheck
  ```

- To generate a PDF version of the documentation (requires LaTeX):

  ```bash
  make latexpdf
  ```

- For help on available make commands:

  ```bash
  make help
  ```

- If you encounter errors about missing modules, ensure you have installed the project in your environment.
- If Sphinx can't find your package modules, verify that your `conf.py` has the correct path configuration.
- For Sphinx-specific errors, refer to the Sphinx documentation.
- If you see an error about a missing `tomli` module, make sure you've installed it with `pip install tomli`.
We truly appreciate your interest in this project!
This project is community-maintained, which means it's not officially supported by our support team.
If you need help, have found a bug, or want to contribute improvements, the best place to do that is right here, by opening a GitHub issue.
Our support portal is unable to assist with requests related to this project, so we kindly ask that all inquiries stay within GitHub.
Your collaboration helps us all move forward together. Thank you!