API Reference

Follow this guide to integrate CogCache into your application using Python.

1️⃣ Create a CogCache account

If you don't already have a CogCache account, you can create one through the Microsoft Azure Marketplace. You can find the CogCache listing here.


2️⃣ Generate a CogCache API key

To authenticate with the CogCache Proxy API, you need a CogCache API key. You can generate one during onboarding, or later from the Keys page.
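Once generated, the key is best kept out of source code. A minimal sketch, assuming the key is stored in an environment variable named COGCACHE_API_KEY (the variable name is an assumption for this sketch):

```python
import os

# Read the CogCache API key from the environment rather than hard-coding it.
# COGCACHE_API_KEY is an assumed environment variable name for this sketch.
COGCACHE_API_KEY = os.environ.get("COGCACHE_API_KEY", "")

if not COGCACHE_API_KEY:
    print("Warning: COGCACHE_API_KEY is not set")
```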


3️⃣ Integrate CogCache with your setup

Integration steps:

  1. Set the base_url to https://proxy-api.cogcache.com/v1/.
  2. Add the CogCache Authorization header with your CogCache API key as its value.
  3. Since you're not using your own LLM key, also set api_key to the CogCache API key.
  4. Choose the model that fits your needs from this table and set it as the COGCACHE_LLM_MODEL value.

from openai import OpenAI

COGCACHE_LLM_MODEL = ""  # the model of choice
COGCACHE_API_KEY = ""  # the generated CogCache API key

client = OpenAI(
    base_url = "https://proxy-api.cogcache.com/v1/",
    api_key = COGCACHE_API_KEY,  # can be omitted if already set via the OPENAI_API_KEY environment variable
    default_headers = {
        "Authorization": f"Bearer {COGCACHE_API_KEY}",
    },
)

response = client.chat.completions.create(
    model = COGCACHE_LLM_MODEL,
    stream = True,
    messages = [
        {
            "role": "system",
            "content": "Assistant is a large language model trained by OpenAI.",
        },
        {
            "role": "user",
            "content": "Write a blog post about Generative AI",
        },
    ],
)

for chunk in response:
    # Each streamed chunk carries an incremental piece of the reply.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
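The streamed chunks each carry an incremental delta of the reply, so the full text can be assembled by joining those deltas. A small sketch with stubbed chunks (collect_stream and fake_chunk are hypothetical helper names; the stubs mimic the streaming response shape so no API call is made):

```python
from types import SimpleNamespace


def collect_stream(chunks):
    """Join the incremental delta content of streamed chat chunks into one string."""
    parts = []
    for chunk in chunks:
        if chunk.choices and chunk.choices[0].delta.content:
            parts.append(chunk.choices[0].delta.content)
    return "".join(parts)


def fake_chunk(text):
    # Stub object mimicking the shape of a streaming chat-completion chunk.
    delta = SimpleNamespace(content=text)
    return SimpleNamespace(choices=[SimpleNamespace(delta=delta)])


print(collect_stream([fake_chunk("Hello"), fake_chunk(", world")]))  # Hello, world
```

In real use you would pass the streamed response object from client.chat.completions.create(..., stream=True) directly to collect_stream instead of the stubbed chunks.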