Class for managing long-term memory in Large Language Model (LLM) applications. It persists and retrieves relevant documents from a vector store database, which is useful for maintaining conversation history or other kinds of memory in an LLM application.

Example

const vectorStore = new MemoryVectorStore(new OpenAIEmbeddings());
const memory = new VectorStoreRetrieverMemory({
  vectorStoreRetriever: vectorStore.asRetriever(1),
  memoryKey: "history",
});

// Saving context to memory
await memory.saveContext(
  { input: "My favorite food is pizza" },
  { output: "that's good to know" },
);
await memory.saveContext(
  { input: "My favorite sport is soccer" },
  { output: "..." },
);
await memory.saveContext(
  { input: "I don't like the Celtics" },
  { output: "ok" },
);

// Loading memory variables
console.log(
  await memory.loadMemoryVariables({ prompt: "what sport should i watch?" }),
);

Hierarchy

Implements

Constructors

Properties

memoryKey: string
returnDocs: boolean
vectorStoreRetriever: VectorStoreRetrieverInterface
inputKey?: string
metadata?: Metadata | MetadataFunction

Metadata to be added to the document when saving context.
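The `Metadata | MetadataFunction` union above suggests metadata can be either a static object or computed at save time. A minimal sketch under that assumption — the exact `MetadataFunction` signature used here is hypothetical, not taken from the library:

```typescript
// Illustrative only: metadata can be a static object, or (assumed here)
// a function that derives per-document metadata from the saved values.
type Metadata = Record<string, unknown>;

// Static form: the same metadata is attached to every saved document.
const staticMetadata: Metadata = { source: "chat" };

// Hypothetical dynamic form: compute metadata at save time.
const dynamicMetadata = (
  inputValues?: Record<string, string>,
): Metadata => ({
  source: "chat",
  savedAt: new Date().toISOString(),
  inputLength: inputValues?.input?.length ?? 0,
});
```

Dynamic metadata is handy for stamping documents with timestamps or derived fields that later retrieval filters can use.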

Accessors

Methods

  • loadMemoryVariables(values): Method to load memory variables. It uses the vectorStoreRetriever to retrieve relevant documents based on the query derived from the input values.

    Parameters

    • values: InputValues

      An InputValues object.

    Returns Promise<MemoryVariables>

    A Promise that resolves to a MemoryVariables object.
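A sketch of the flow just described — `RetrieverLike` and `loadMemoryVariablesSketch` are illustrative stand-ins, not the library's source; the real class delegates to its configured vectorStoreRetriever:

```typescript
// Sketch: derive a query string from the input values, ask the retriever
// for relevant documents, and return their text under the memory key.
type Document = { pageContent: string };

interface RetrieverLike {
  getRelevantDocuments(query: string): Promise<Document[]>;
}

async function loadMemoryVariablesSketch(
  retriever: RetrieverLike,
  values: Record<string, string>,
  memoryKey = "history",
): Promise<Record<string, string>> {
  // The input values are joined into a single query string.
  const query = Object.values(values).join(" ");
  const docs = await retriever.getRelevantDocuments(query);
  // Relevant documents are concatenated into one memory string.
  return { [memoryKey]: docs.map((d) => d.pageContent).join("\n") };
}
```

The returned object is keyed by memoryKey, so a prompt template can splice the retrieved history in by that name.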

  • saveContext(inputValues, outputValues): Method to save context. It constructs a document from the input and output values (excluding the memory key) and adds it to the vector store database using the vectorStoreRetriever.

    Parameters

    • inputValues: InputValues

      An InputValues object.

    • outputValues: OutputValues

      An OutputValues object.

    Returns Promise<void>

    A Promise that resolves to void.
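A sketch of the document-construction step described above — `formatContext` is a hypothetical helper, not part of the class's public API:

```typescript
// Sketch: format the input/output pairs (skipping the memory key itself)
// into the text of a single document before it is added to the store.
function formatContext(
  inputValues: Record<string, string>,
  outputValues: Record<string, string>,
  memoryKey = "history",
): string {
  const pairs = { ...inputValues, ...outputValues };
  return Object.entries(pairs)
    .filter(([key]) => key !== memoryKey)
    .map(([key, value]) => `${key}: ${value}`)
    .join("\n");
}
```

Excluding the memory key prevents previously retrieved history from being re-saved as a new document on every turn.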

Generated using TypeDoc