Follow this guide to integrate CogCache into your setup using LangChain (Python).
1️⃣ Create a CogCache account
If you don't already have a CogCache account, you can create one through the Microsoft Azure Marketplace. You can find the CogCache listing here.
2️⃣ Generate a CogCache API key
To authenticate with the CogCache Proxy API, you need a CogCache API key. You can generate one during onboarding, or at any time from the Keys page.
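Rather than hard-coding the key in your source, you may prefer to export it as an environment variable and read it at runtime. Below is a minimal sketch; the variable name COGCACHE_API_KEY is our own choice for illustration, not something CogCache requires.

import os

# Read the CogCache API key from an environment variable
# (set it first, e.g. `export COGCACHE_API_KEY=...` in your shell).
COGCACHE_API_KEY = os.environ["COGCACHE_API_KEY"]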
3️⃣ Integrate CogCache with your setup
Integration steps:
- Set the `base_url` to `https://proxy-api.cogcache.com/v1/`.
- Add the CogCache authorization header and set the CogCache API key as its value.
- Since you're not using your own LLM, also set `openai_api_key` to the CogCache API key.
- Choose the right model for you from this table by setting the `COGCACHE_LLM_MODEL` value.
from langchain_openai import ChatOpenAI
from langchain.schema import HumanMessage, SystemMessage

COGCACHE_LLM_MODEL = ""  # the model of choice
COGCACHE_API_KEY = ""  # the generated CogCache API key

model = ChatOpenAI(
    base_url="https://proxy-api.cogcache.com/v1/",
    model=COGCACHE_LLM_MODEL,
    openai_api_key=COGCACHE_API_KEY,
    default_headers={
        "Authorization": f"Bearer {COGCACHE_API_KEY}",
    },
)

response = model.stream(
    [
        SystemMessage(content="Assistant is a large language model trained by OpenAI."),
        HumanMessage(content="Write a blog post about Generative AI"),
    ],
)

for chunk in response:
    print(chunk.content)
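If you don't need token-by-token streaming, the same model object can be called with invoke, which returns the full completion at once. This is a minimal variation on the snippet above; the prompt text is only an example.

# Non-streaming call: returns a single AIMessage with the full response
result = model.invoke(
    [
        SystemMessage(content="Assistant is a large language model trained by OpenAI."),
        HumanMessage(content="Summarize the benefits of response caching in two sentences."),
    ],
)
print(result.content)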