memory (Optional)

apply(inputs: ChainValues[], config?: any[]): Promise<ChainValues[]>
Deprecated: Use .batch() instead. Will be removed in 0.2.0.
Call the chain on all inputs in the list.
call(values: ChainValues, config?: any, tags?: string[]): Promise<ChainValues>
Deprecated: Use .invoke() instead. Will be removed in 0.2.0.
Run the core logic of this chain and add to output if desired. Wraps _call and handles memory.
invoke(input: ChainValues, config?: any): Promise<ChainValues>
Invoke the chain with the provided input and return the output.

Parameters
input: Input values for the chain run.
config (Optional): Optional configuration for the Runnable.

Returns
Promise that resolves with the output of the chain run.
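The description above says invoke wraps _call and handles memory. The following is a minimal sketch of that flow in plain TypeScript — the EchoChain and SimpleMemory classes are hypothetical illustrations, not LangChain's actual implementation:

```typescript
type ChainValues = Record<string, any>;

// Hypothetical toy memory: records each exchange as a line of history.
class SimpleMemory {
  history: string[] = [];
  loadMemoryVariables(): ChainValues {
    return { history: this.history.join("\n") };
  }
  saveContext(input: ChainValues, output: ChainValues): void {
    this.history.push(`${input.input} -> ${output.text}`);
  }
}

// Hypothetical toy chain: invoke() wraps the core _call() and handles memory.
class EchoChain {
  memory = new SimpleMemory();

  // Core logic of the chain; a real chain would run an LLM here.
  protected async _call(values: ChainValues): Promise<ChainValues> {
    return { text: `echo: ${values.input}` };
  }

  async invoke(input: ChainValues): Promise<ChainValues> {
    // Merge memory variables into the input, run the core logic,
    // then save the exchange back to memory.
    const fullInput = { ...this.memory.loadMemoryVariables(), ...input };
    const output = await this._call(fullInput);
    this.memory.saveContext(input, output);
    return output;
  }
}
```

The real BaseChain adds callbacks, config handling, and error management around the same basic wrap-and-save pattern.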
serialize
Return a json-like object representing this chain.
Static deserialize
Load a chain from a json-like object describing it.
Static fromLLMAndPrompts
A static method that creates an instance of MultiPromptChain from a BaseLanguageModel and a set of prompts. It takes in optional parameters for the default chain and additional options.

Parameters
llm: A BaseLanguageModel instance.
conversationChainOpts (Optional)
defaultChain (Optional)
llmChainOpts (Optional)
multiRouteChainOpts (Optional)

Returns
An instance of MultiPromptChain.
Static fromPrompts
Deprecated: Use fromLLMAndPrompts instead.

Parameters
defaultChain (Optional): BaseChain<ChainValues, ChainValues>
options (Optional): Omit<MultiRouteChainInput, "defaultChain">
A class that represents a multi-prompt chain in the LangChain framework. It extends the MultiRouteChain class and provides additional functionality specific to multi-prompt chains.
Example
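The example code block did not survive extraction. The sketch below shows typical usage of fromLLMAndPrompts, modeled on LangChain's documented API; the ChatOpenAI import path, prompt names, and templates are illustrative and may differ across LangChain versions:

```typescript
import { MultiPromptChain } from "langchain/chains";
import { ChatOpenAI } from "@langchain/openai";

// Illustrative routing setup: two destination prompts; the router
// picks the one whose description best matches the input.
const multiPromptChain = MultiPromptChain.fromLLMAndPrompts(new ChatOpenAI(), {
  promptNames: ["physics", "math"],
  promptDescriptions: [
    "Good for answering questions about physics",
    "Good for answering math questions",
  ],
  promptTemplates: [
    "You are a physics professor. Answer this question:\n{input}",
    "You are a mathematician. Answer this question:\n{input}",
  ],
});

// Route the input to the best-matching prompt and run it.
const result = await multiPromptChain.invoke({
  input: "What is the speed of light?",
});
console.log(result);
```

Running this requires a configured OpenAI API key; the router and destination chains each make a model call.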