Overview
The Proxy API lets you route your Azure OpenAI deployment's traffic through the CogCache proxy.
To get started, you only have to replace the Azure OpenAI endpoint in your LLM call with the proxy endpoint. If no answer is found in the cache, the proxy forwards the request to the LLM and returns the response to the user.
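As a minimal sketch of the swap, the snippet below builds the same chat-completions request you would normally send to Azure OpenAI, but pointed at the proxy instead. The proxy base URL, authorization header, and deployment name here are placeholders (assumptions for illustration, not documented values) — use the endpoint and credentials from your own CogCache account.

```python
import json
import urllib.request

# Placeholder values -- substitute your actual proxy endpoint, key, and deployment.
COGCACHE_PROXY_URL = "https://proxy.example.com/v1"  # hypothetical proxy base URL
COGCACHE_API_KEY = "YOUR_COGCACHE_API_KEY"           # hypothetical credential
DEPLOYMENT = "my-gpt-deployment"                     # hypothetical deployment name

def build_proxied_request(messages):
    """Build a chat-completions request aimed at the proxy instead of
    the Azure OpenAI endpoint; the payload shape is unchanged."""
    body = json.dumps({"model": DEPLOYMENT, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=f"{COGCACHE_PROXY_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {COGCACHE_API_KEY}",  # assumed auth scheme
        },
        method="POST",
    )

# The request is then sent as usual; on a cache hit the proxy answers
# directly, on a miss it forwards to the LLM:
#
# with urllib.request.urlopen(build_proxied_request([...])) as resp:
#     print(resp.read())
```

Because only the endpoint and credentials change, the rest of your application code — message construction, response parsing, error handling — stays exactly as it was.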
Integration guides
CogCache is designed from the ground up to work seamlessly with your existing setup. Choose an integration guide below and follow the steps:
Headers
See this guide on Proxy API request and response headers.