Optional memory
apply
Deprecated. Use .batch() instead. Will be removed in 0.2.0.
Call the chain on all inputs in the list.
Optional
config: any[]
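Since apply is being replaced by .batch(), the following is a minimal migration sketch in TypeScript. It assumes an already constructed ConversationalRetrievalQAChain (see fromLLM below) and this chain's question and chat_history input keys; the helper name answerAll is illustrative.

import type { ConversationalRetrievalQAChain } from "langchain/chains";

// Sketch: migrating from the deprecated apply() to batch().
// `chain` is assumed to be an already constructed ConversationalRetrievalQAChain.
async function answerAll(chain: ConversationalRetrievalQAChain) {
  const results = await chain.batch([
    { question: "What is the capital of France?", chat_history: "" },
    { question: "What is the capital of Italy?", chat_history: "" },
  ]);
  return results; // one output object per input, in the same order
}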
call
Deprecated. Use .invoke() instead. Will be removed in 0.2.0.
Run the core logic of this chain and add to output if desired.
Wraps _call and handles memory.
Optional
config: any
Optional
tags: string[]
invoke
Invokes the chain with the provided input and returns the output.
input: Input values for the chain run.
Optional
config: any. Optional configuration for the Runnable.
Returns: Promise that resolves with the output of the chain run.
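A minimal sketch of a single invoke call, assuming an already constructed chain and this chain's question and chat_history input keys; the helper name askOnce is illustrative.

import type { ConversationalRetrievalQAChain } from "langchain/chains";

// Sketch: one invoke() call on an already constructed chain.
async function askOnce(chain: ConversationalRetrievalQAChain) {
  const result = await chain.invoke({
    question: "What did the speaker say about the economy?",
    chat_history: "",
  });
  return result.text; // the chain's output key defaults to "text"
}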
serialize
Return a json-like object representing this chain.
Static deserialize
Load a chain from a json-like object describing it.
Static fromLLM
Static method to create a new ConversationalRetrievalQAChain from a BaseLanguageModel and a BaseRetriever.
llm: BaseLanguageModel instance used to generate a new question.
retriever: BaseRetriever instance used to retrieve relevant documents.
Returns: A new instance of ConversationalRetrievalQAChain.
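A minimal construction sketch. The OpenAI chat model, OpenAI embeddings, and in-memory vector store are illustrative choices, and the import paths follow the pre-0.2 package layout.

import { ChatOpenAI } from "langchain/chat_models/openai";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { ConversationalRetrievalQAChain } from "langchain/chains";

// Build a small retriever over an example text.
const vectorStore = await MemoryVectorStore.fromTexts(
  ["Mitochondria are the powerhouse of the cell"],
  [{ id: 1 }],
  new OpenAIEmbeddings()
);

// Create the chain from a language model and a retriever.
const chain = ConversationalRetrievalQAChain.fromLLM(
  new ChatOpenAI({ temperature: 0 }),
  vectorStore.asRetriever()
);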
Static getChatHistoryString
Static method to convert the chat history input into a formatted string.
Chat history input which can be a string, an array of BaseMessage instances, or an array of string arrays.
A formatted string representing the chat history.
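A small sketch of the conversion from an array of BaseMessage instances, assuming the pre-0.2 entry point for the message classes.

import { HumanMessage, AIMessage } from "langchain/schema";
import { ConversationalRetrievalQAChain } from "langchain/chains";

// Convert a BaseMessage[] chat history into a single formatted string.
const historyString = ConversationalRetrievalQAChain.getChatHistoryString([
  new HumanMessage("What is the powerhouse of the cell?"),
  new AIMessage("The mitochondria."),
]);
console.log(historyString);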
Class for conducting conversational question-answering tasks with a retrieval component. Extends the BaseChain class and implements the ConversationalRetrievalQAChainInput interface.
Example
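A minimal end-to-end sketch. The OpenAI chat model, OpenAI embeddings, in-memory vector store, and example texts are illustrative choices, and the import paths follow the pre-0.2 package layout.

import { ChatOpenAI } from "langchain/chat_models/openai";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { ConversationalRetrievalQAChain } from "langchain/chains";

// Index a few example texts and expose them through a retriever.
const vectorStore = await MemoryVectorStore.fromTexts(
  ["Buildings are made out of brick", "Buildings are made out of wood"],
  [{ id: 1 }, { id: 2 }],
  new OpenAIEmbeddings()
);

const chain = ConversationalRetrievalQAChain.fromLLM(
  new ChatOpenAI({ temperature: 0 }),
  vectorStore.asRetriever()
);

// First turn: no prior chat history.
const first = await chain.invoke({
  question: "What are buildings made out of?",
  chat_history: "",
});
console.log(first.text);

// Follow-up turn: pass the previous exchange as the chat history string.
const followUp = await chain.invoke({
  question: "Which of those materials is more fire resistant?",
  chat_history: `Human: What are buildings made out of?\nAssistant: ${first.text}`,
});
console.log(followUp.text);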