Class ConversationTokenBufferMemory

Class that represents a conversation chat memory with a token buffer. It extends the BaseChatMemory class and implements the ConversationTokenBufferMemoryInput interface.

Example

// Import paths may differ depending on your LangChain.js version.
import { ConversationTokenBufferMemory } from "langchain/memory";
import { ChatOpenAI } from "@langchain/openai";

const memory = new ConversationTokenBufferMemory({
  llm: new ChatOpenAI({}),
  maxTokenLimit: 10,
});

// Save conversation context
await memory.saveContext({ input: "hi" }, { output: "whats up" });
await memory.saveContext({ input: "not much you" }, { output: "not much" });

// Load memory variables
const result = await memory.loadMemoryVariables({});
console.log(result);

Hierarchy

  • BaseChatMemory
      ↳ ConversationTokenBufferMemory

Implements

  • ConversationTokenBufferMemoryInput

Constructors

  • new ConversationTokenBufferMemory(fields: ConversationTokenBufferMemoryInput): ConversationTokenBufferMemory

Properties

aiPrefix: string = "AI"
chatHistory: BaseChatMessageHistory
humanPrefix: string = "Human"
llm: BaseLanguageModel<any, BaseLanguageModelCallOptions>
maxTokenLimit: number = 2000
memoryKey: string = "history"
returnMessages: boolean = false
inputKey?: string
outputKey?: string
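
For illustration, a sketch of overriding several of these defaults at construction time. It assumes the fields object passed to the constructor accepts the same names as the properties listed above (as the example at the top does for llm and maxTokenLimit); values here are arbitrary.

const customMemory = new ConversationTokenBufferMemory({
  llm: new ChatOpenAI({}),     // model used to count tokens in the buffer
  maxTokenLimit: 50,           // prune older turns once the buffer exceeds ~50 tokens
  humanPrefix: "User",         // label for human turns in the rendered history
  aiPrefix: "Assistant",       // label for AI turns in the rendered history
  memoryKey: "chat_history",   // key under which the history is returned
  returnMessages: true,        // return message objects instead of a single string
});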

Accessors

  • get memoryKeys(): string[]
  • Returns string[]
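
A minimal usage sketch, assuming memoryKeys simply reflects the configured memoryKey property:

console.log(memory.memoryKeys); // ["history"] with the default memoryKey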

Methods

  • clear(): Promise<void>

    Method to clear the chat history.

    Returns Promise<void>

    A Promise that resolves when the chat history has been cleared.
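
A brief sketch of clearing the stored history, reusing the memory instance from the example above; the shape of the emptied result shown in the comment is an assumption based on the default memoryKey and returnMessages settings.

await memory.clear();
const empty = await memory.loadMemoryVariables({});
console.log(empty); // expected to resolve to an empty history, e.g. { history: "" }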

  • loadMemoryVariables(_values: InputValues): Promise<MemoryVariables>

    Loads the memory variables. Takes an InputValues object as a parameter and returns a Promise that resolves with a MemoryVariables object.

    Parameters

    • _values: InputValues

      InputValues object.

    Returns Promise<MemoryVariables>

    A Promise that resolves with a MemoryVariables object.
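
Continuing the example at the top of this page, a sketch of what the resolved MemoryVariables object roughly looks like; the exact shape depends on the memoryKey, returnMessages, and how much history survives pruning.

const variables = await memory.loadMemoryVariables({});
// With the defaults (memoryKey: "history", returnMessages: false), something like:
//   { history: "Human: not much you\nAI: not much" }
// With returnMessages: true, the value is an array of message objects instead of a string.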

  • saveContext(inputValues: InputValues, outputValues: OutputValues): Promise<void>

    Saves the context from this conversation to the buffer. If the number of tokens required to store the buffer exceeds maxTokenLimit, the oldest messages are pruned until it fits.

    Parameters

    • inputValues: InputValues
    • outputValues: OutputValues

    Returns Promise<void>
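
A sketch of the pruning behavior described above, using a deliberately small maxTokenLimit; the assumption is that messages are dropped from the front of the buffer (oldest first) once the limit is exceeded.

const prunedMemory = new ConversationTokenBufferMemory({
  llm: new ChatOpenAI({}),
  maxTokenLimit: 10,
});

await prunedMemory.saveContext({ input: "hi" }, { output: "whats up" });
await prunedMemory.saveContext({ input: "not much you" }, { output: "not much" });

// Only the most recent turns that fit under maxTokenLimit remain in the buffer.
console.log(await prunedMemory.loadMemoryVariables({}));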

Generated using TypeDoc