Class for managing and storing previous chat messages. It extends the BaseChatMemory class and implements the BufferWindowMemoryInput interface. This class is stateful and stores messages in a buffer. When called in a chain, it returns only the most recent k interactions (the last k human/AI exchanges) that it has stored.

// Import paths may vary slightly between LangChain.js versions.
import { PromptTemplate } from "@langchain/core/prompts";
import { ChatOpenAI } from "@langchain/openai";
import { LLMChain } from "langchain/chains";
import { BufferWindowMemory } from "langchain/memory";

const prompt =
  PromptTemplate.fromTemplate(`The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.
Current conversation:
{chat_history}
Human: {input}
AI:`);

const chain = new LLMChain({
  llm: new ChatOpenAI({ temperature: 0.9 }),
  prompt,
  memory: new BufferWindowMemory({ memoryKey: "chat_history", k: 1 }),
});

// Example of initiating a conversation with the AI
const res1 = await chain.call({ input: "Hi! I'm Jim." });
console.log({ res1 });

// Example of following up with another question
const res2 = await chain.call({ input: "What's my name?" });
console.log({ res2 });
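For illustration, the window can also be inspected directly through the saveContext and loadMemoryVariables methods inherited from BaseChatMemory. The sketch below is not part of the generated reference and assumes the default memoryKey of "history".

import { BufferWindowMemory } from "langchain/memory";

// With k: 1, only the most recent human/AI exchange is kept in the window.
const memory = new BufferWindowMemory({ k: 1 });

await memory.saveContext({ input: "Hi! I'm Jim." }, { output: "Hello Jim!" });
await memory.saveContext({ input: "What's my name?" }, { output: "Your name is Jim." });

const { history } = await memory.loadMemoryVariables({});
// Only the second exchange survives the window, roughly:
// "Human: What's my name?\nAI: Your name is Jim."
console.log(history);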

Hierarchy

BaseChatMemory
  ↳ BufferWindowMemory

Implements

BufferWindowMemoryInput

Constructors

Properties

aiPrefix: string = "AI"
chatHistory: BaseChatMessageHistory
humanPrefix: string = "Human"
inputKey?: string
k: number = 5
memoryKey: string = "history"
outputKey?: string
returnMessages: boolean = false
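
When these properties are used with a chat model, returnMessages is typically set to true so that the window is injected into the prompt as message objects rather than as a "Human:/AI:"-prefixed string. The following is a minimal sketch of that configuration, not taken from the generated reference; import paths may vary by version.

import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate, MessagesPlaceholder } from "@langchain/core/prompts";
import { LLMChain } from "langchain/chains";
import { BufferWindowMemory } from "langchain/memory";

const chatPrompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
]);

const chatChain = new LLMChain({
  llm: new ChatOpenAI({ temperature: 0 }),
  prompt: chatPrompt,
  memory: new BufferWindowMemory({
    memoryKey: "chat_history", // must match the MessagesPlaceholder name
    k: 3, // keep only the last 3 exchanges
    returnMessages: true, // inject message objects instead of a prefixed string
  }),
});

const res = await chatChain.call({ input: "Hi! I'm Jim." });
console.log({ res });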

Accessors

Methods

  • loadMemoryVariables: Loads the memory variables. Retrieves the chat messages from the history, keeps only the most recent k interactions (the last k human/AI message pairs), and returns them under the memoryKey. If the returnMessages property is set to true, the messages are returned as message objects; otherwise, a string representation of the messages is returned.

    Parameters

    Returns Promise<MemoryVariables>

    Promise that resolves to a MemoryVariables object.
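
As a complement to the default string form shown in the examples above, the sketch below (an illustration based on the description, assuming the default memoryKey of "history") shows roughly what the promise resolves to when returnMessages is true.

import { BufferWindowMemory } from "langchain/memory";

// With returnMessages: true, the window comes back as message objects
// rather than a "Human: ...\nAI: ..." string.
const messageMemory = new BufferWindowMemory({ k: 1, returnMessages: true });
await messageMemory.saveContext({ input: "Hi! I'm Jim." }, { output: "Hello Jim!" });

const vars = await messageMemory.loadMemoryVariables({});
// vars.history should hold roughly [HumanMessage("Hi! I'm Jim."), AIMessage("Hello Jim!")]
console.log(vars);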