Class ConversationalRetrievalQAChain

Class for conducting conversational question-answering tasks with a retrieval component. Extends the BaseChain class and implements the ConversationalRetrievalQAChainInput interface.

Example

import * as fs from "node:fs";
import { ChatAnthropic } from "langchain/chat_models/anthropic";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { ConversationalRetrievalQAChain } from "langchain/chains";

const model = new ChatAnthropic({});

// Load and split the source document into chunks for indexing.
const text = fs.readFileSync("state_of_the_union.txt", "utf8");
const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });
const docs = await textSplitter.createDocuments([text]);

// Embed the chunks and store them in an in-memory HNSW vector store.
const vectorStore = await HNSWLib.fromDocuments(docs, new OpenAIEmbeddings());

const chain = ConversationalRetrievalQAChain.fromLLM(
  model,
  vectorStore.asRetriever(),
);

const question = "What did the president say about Justice Breyer?";
const res = await chain.call({ question, chat_history: "" });
console.log(res);

// Carry the previous turn forward so follow-up questions have context.
const chatHistory = `${question}\n${res.text}`;
const followUpRes = await chain.call({
  question: "Was that nice?",
  chat_history: chatHistory,
});
console.log(followUpRes);
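In the example above, chat_history is maintained by hand as a plain string, one message per line. A minimal, dependency-free sketch of that bookkeeping (the appendTurn helper is hypothetical, not part of LangChain):

```typescript
// Hypothetical helper illustrating the manual chat-history bookkeeping used
// in the example: each completed turn is appended as "question\nanswer".
function appendTurn(history: string, question: string, answer: string): string {
  const turn = `${question}\n${answer}`;
  return history === "" ? turn : `${history}\n${turn}`;
}

let history = "";
history = appendTurn(
  history,
  "What did the president say about Justice Breyer?",
  "He thanked him for his service.",
);
history = appendTurn(history, "Was that nice?", "Yes, it was respectful.");
// history now holds both turns, one message per line.
```

In practice, attaching a BaseMemory instance to the chain (the optional memory property above) automates this bookkeeping.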

Hierarchy

  • BaseChain → ConversationalRetrievalQAChain

Implements

  • ConversationalRetrievalQAChainInput

Constructors

Properties

chatHistoryKey: string = "chat_history"
combineDocumentsChain: BaseChain<ChainValues, ChainValues>
inputKey: string = "question"
questionGeneratorChain: LLMChain<string, LLMType>
retriever: BaseRetriever
returnGeneratedQuestion: boolean = false
returnSourceDocuments: boolean = false
memory?: BaseMemory

Accessors

  • get inputKeys(): string[]
  • Returns string[]

  • get outputKeys(): string[]
  • Returns string[]

Methods

  • apply — Calls the chain on every input in the list.

    Parameters

    • inputs: ChainValues[]
    • Optional config: (BaseCallbackConfig | Callbacks)[]

    Returns Promise<ChainValues[]>

  • call — Runs the core logic of this chain and adds to the output if desired.

    Wraps _call and handles memory.

    Parameters

    • values: ChainValues & {
          signal?: AbortSignal;
          timeout?: number;
      }
    • Optional config: BaseCallbackConfig | Callbacks
    • Optional tags: string[]

      Deprecated — use .invoke() instead.

    Returns Promise<ChainValues>

  • invoke — Invokes the chain with the provided input and returns the output.

    Parameters

    • input: ChainValues

      Input values for the chain run.

    • Optional config: BaseCallbackConfig

      Optional configuration for the Runnable.

    Returns Promise<ChainValues>

    Promise that resolves with the output of the chain run.

  • prepOutputs — Prepares the chain's outputs after a run, optionally merging them with the inputs.

    Parameters

    • inputs: Record<string, unknown>
    • outputs: Record<string, unknown>
    • returnOnlyOutputs: boolean = false

    Returns Promise<Record<string, unknown>>

  • run — Runs the chain with a single input value and resolves with a single string output.

    Parameters

    • input: any
    • Optional config: BaseCallbackConfig | Callbacks

    Returns Promise<string>

  • fromLLM — Static method to create a new ConversationalRetrievalQAChain from a BaseLanguageModel and a BaseRetriever.

    Parameters

    • llm: BaseLanguageModel<any, BaseLanguageModelCallOptions>

      BaseLanguageModel instance used to generate a new question.

    • retriever: BaseRetriever

      BaseRetriever instance used to retrieve relevant documents.

    • options: {
          outputKey?: string;
          qaChainOptions?: QAChainParams;
          qaTemplate?: string;
          questionGeneratorChainOptions?: {
              llm?: BaseLanguageModel<any, BaseLanguageModelCallOptions>;
              template?: string;
          };
          questionGeneratorTemplate?: string;
          returnSourceDocuments?: boolean;
      } & Omit<ConversationalRetrievalQAChainInput, "combineDocumentsChain" | "retriever" | "questionGeneratorChain"> = {}

    Returns ConversationalRetrievalQAChain

    A new instance of ConversationalRetrievalQAChain.
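    The options bag above can be sketched as a plain object literal. Field names come from the signature; the concrete values (the "stuff" chain type, the template wording) are illustrative assumptions, not defaults:

    ```typescript
    // Illustrative options for fromLLM; names match the signature above,
    // values are placeholders chosen for this sketch.
    const fromLLMOptions = {
      returnSourceDocuments: true, // also expose the retrieved docs on the result
      outputKey: "answer",         // key under which the answer is returned
      qaChainOptions: { type: "stuff" as const }, // how retrieved docs are combined
      questionGeneratorChainOptions: {
        // A separate (possibly cheaper) model can be set here via `llm`;
        // the template rewrites a follow-up into a standalone question.
        template:
          "Given the conversation below, rephrase the follow-up question " +
          "as a standalone question.\n" +
          "Chat history:\n{chat_history}\n" +
          "Follow-up: {question}\n" +
          "Standalone question:",
      },
    };
    ```

    Passing a distinct question-generator model is a common cost optimization, since rephrasing is a simpler task than answering.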

  • getChatHistoryString — Static method to convert the chat history input into a formatted string.

    Parameters

    • chatHistory: string | BaseMessage[] | string[][]

      Chat history input which can be a string, an array of BaseMessage instances, or an array of string arrays.

    Returns string

    A formatted string representing the chat history.
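    The exact formatting the library applies is not shown here; the following is a plausible dependency-free sketch for two of the accepted shapes, assuming Human:/Assistant: labels for string[][] input:

    ```typescript
    // Sketch only (not the library source): one way such a conversion can work.
    // Each inner array in the string[][] shape is a [human, assistant] turn.
    type DialogueTurn = [string, string];

    function chatHistoryToString(history: string | DialogueTurn[]): string {
      if (typeof history === "string") return history; // already formatted
      return history
        .map(([human, ai]) => `Human: ${human}\nAssistant: ${ai}`)
        .join("\n");
    }
    ```

    The BaseMessage[] shape (omitted from the sketch) would be handled analogously, labeling each message by its role.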

Generated using TypeDoc