API Reference

Overview

The Proxy API lets you route your Azure OpenAI deployment's requests through the CogCache proxy.

To get started, you only have to replace the Azure OpenAI LLM call with a call to the proxy. If no answer is found in the cache, the proxy forwards the request to the LLM and returns the response to the user.
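The cache-then-forward behavior can be sketched as follows. This is a minimal illustration of the lookup logic, not the proxy's actual implementation: `call_llm` stands in for the forwarded Azure OpenAI request, and the in-memory `cache` dict is a stand-in for the real cache.

```python
# Minimal sketch of the proxy's cache-then-forward behavior.
# All names here are illustrative, not part of the CogCache API.

cache = {}

def call_llm(prompt):
    # Placeholder for the request forwarded to the Azure OpenAI deployment.
    return f"LLM answer for: {prompt}"

def proxy(prompt):
    if prompt in cache:           # cache hit: answer returned without an LLM call
        return cache[prompt]
    answer = call_llm(prompt)     # cache miss: forward to the LLM
    cache[prompt] = answer        # store for subsequent identical requests
    return answer

first = proxy("What is CogCache?")   # miss: forwarded to the LLM
second = proxy("What is CogCache?")  # hit: served from the cache
```

On the second call the prompt is already cached, so the LLM is never contacted and both calls return the same answer.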

Integration guides

CogCache is designed to work seamlessly with your existing setup. Choose an integration guide below and follow the steps:

Python
Node.js
LangChain
LangChain JS
cURL
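Whichever guide you follow, the change amounts to pointing your existing client at the proxy instead of Azure OpenAI directly. A minimal sketch, assuming a hypothetical proxy URL and the standard Azure OpenAI request shape; the endpoint, key, and deployment name are placeholders:

```python
# Sketch of the drop-in change the guides above describe: only the base
# endpoint changes; the request shape stays the same. The proxy URL below
# is hypothetical -- use the one from your CogCache account.
AZURE_ENDPOINT = "https://my-resource.openai.azure.com"
PROXY_ENDPOINT = "https://proxy.cogcache.example"  # hypothetical

def build_request(endpoint, api_key, deployment):
    # Builds (but does not send) a chat-completions request against `endpoint`.
    return {
        "url": f"{endpoint}/openai/deployments/{deployment}/chat/completions",
        "headers": {"api-key": api_key},
    }

direct = build_request(AZURE_ENDPOINT, "YOUR_KEY", "gpt-4o")
proxied = build_request(PROXY_ENDPOINT, "YOUR_KEY", "gpt-4o")
```

Swapping `AZURE_ENDPOINT` for `PROXY_ENDPOINT` is the only difference between the two requests.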

Headers

See this guide on Proxy API request and response headers.

Error messages

Check out the full list of error messages