A cache that uses Upstash as the backing store. See https://docs.upstash.com/redis.

// Assumes the @langchain/community and @langchain/openai packages are installed.
import { UpstashRedisCache } from "@langchain/community/caches/upstash_redis";
import { ChatOpenAI } from "@langchain/openai";

// Replace the placeholder strings with your Upstash Redis REST URL and token.
const cache = new UpstashRedisCache({
  config: {
    url: "UPSTASH_REDIS_REST_URL",
    token: "UPSTASH_REDIS_REST_TOKEN",
  },
});

// Initialize the OpenAI model with the Upstash Redis cache for caching responses.
const model = new ChatOpenAI({
  cache,
});
await model.invoke("How are you today?");
const cachedValues = await cache.lookup("How are you today?", "llmKey");

Hierarchy

  • BaseCache
    • UpstashRedisCache

Methods

  • lookup: Look up LLM generations in the cache by prompt and the associated LLM key.

    Parameters

    • prompt: string
    • llmKey: string

    Returns Promise<null | Generation[]>

  • update: Update the cache with the given generations.

    Note that this overwrites any existing generations for the given prompt and LLM key.

    Parameters

    • prompt: string
    • llmKey: string
    • value: Generation[]

    Returns Promise<void>
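The two methods above make up the caching contract: a lookup resolves to null on a cache miss, and an update overwrites whatever is already stored for the same prompt and LLM key. A minimal in-memory sketch of that contract is below; the InMemoryCache class and the simplified Generation shape are illustrative stand-ins (the real Generation type comes from LangChain's core package), not part of the library.

```typescript
// Illustrative stand-in: the real Generation type has more fields.
interface Generation {
  text: string;
}

// Hypothetical in-memory cache with the same lookup/update signatures.
class InMemoryCache {
  private store = new Map<string, Generation[]>();

  // Resolves to the stored generations, or null on a cache miss.
  async lookup(prompt: string, llmKey: string): Promise<null | Generation[]> {
    return this.store.get(`${prompt}:${llmKey}`) ?? null;
  }

  // Stores generations, overwriting any existing entry for the pair.
  async update(prompt: string, llmKey: string, value: Generation[]): Promise<void> {
    this.store.set(`${prompt}:${llmKey}`, value);
  }
}

const cache = new InMemoryCache();
const miss = await cache.lookup("How are you today?", "llmKey"); // null: nothing cached yet

await cache.update("How are you today?", "llmKey", [{ text: "I'm well!" }]);
await cache.update("How are you today?", "llmKey", [{ text: "replaced" }]);

// The second update overwrote the first entry for this prompt/LLM-key pair.
const hit = await cache.lookup("How are you today?", "llmKey"); // [{ text: "replaced" }]
```

The prompt/LLM-key pair together identify a cache entry, which is why both appear in every method signature: the same prompt sent to differently configured models must not share cached generations.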