Properties:
- llm: LLM wrapper to use.
- outputKey: Key to use for the output; defaults to "text". Optional.
- prompt: Prompt object to use.
- llmKwargs: Additional kwargs to pass to the LLM. Optional.
- memory: Optional.
- outputParser: OutputParser to use. Optional.
invoke: Invokes the chain with the provided input and returns the output.
Parameters:
- values: Input values for the chain run.
- config: BaseCallbackConfig. Optional configuration for the Runnable.
Returns: Promise that resolves with the output of the chain run.
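The invoke contract above can be sketched with a stand-in model. Everything here is illustrative: `MiniLLMChain`, `fakeLLM`, and the template function are hypothetical names, not part of the library's API.

```typescript
// Minimal sketch of the invoke contract, assuming a hypothetical stand-in
// LLM instead of a real model wrapper (all names here are illustrative).
type ChainValues = Record<string, string>;

interface LLM {
  call(prompt: string): Promise<string>;
}

class MiniLLMChain {
  constructor(
    private llm: LLM,
    private template: (values: ChainValues) => string,
    private outputKey: string = "text", // key the completion is stored under
  ) {}

  // Format the input values into a prompt, pass it to the LLM, and
  // resolve with the completion stored under `outputKey`.
  async invoke(values: ChainValues): Promise<ChainValues> {
    const prompt = this.template(values);
    const completion = await this.llm.call(prompt);
    return { [this.outputKey]: completion };
  }
}

// A fake LLM that echoes its prompt, so the sketch runs offline.
const fakeLLM: LLM = {
  call: async (prompt) => `completion for: ${prompt}`,
};

const chain = new MiniLLMChain(fakeLLM, (v) => `Tell me a ${v.adjective} joke`);
chain.invoke({ adjective: "funny" }).then((out) => console.log(out.text));
// logs "completion for: Tell me a funny joke"
```

Note that the output is a keyed record (`{ text: ... }`), matching the outputKey property described above.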
predict: Formats the prompt with the given values and passes it to the LLM.
Parameters:
- values: Keys to pass to the prompt template.
- callbackManager: CallbackManager to use. Optional.
Returns: Completion from the LLM.

Example: llm.predict({ adjective: "funny" })
Static deserialize: Load a chain from a JSON-like object describing it.
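The deserialize pattern can be sketched as a static factory that rebuilds a chain from a plain object. The field names below (`prompt_template`, `output_key`) are hypothetical and do not reflect the library's actual serialized format.

```typescript
// Sketch of a static deserialize factory. The serialized shape is an
// assumption for illustration, not the library's real wire format.
type SerializedChain = {
  prompt_template: string;
  output_key?: string;
};

class SketchChain {
  constructor(
    public template: string,
    public outputKey: string = "text",
  ) {}

  // Rebuild a chain instance from a JSON-like description of it.
  static deserialize(data: SerializedChain): SketchChain {
    return new SketchChain(data.prompt_template, data.output_key ?? "text");
  }
}

const chain = SketchChain.deserialize({
  prompt_template: "Tell me a {adjective} joke",
});
console.log(chain.template); // logs "Tell me a {adjective} joke"
```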
Chain to run queries against LLMs.
Example
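A self-contained sketch of end-to-end usage, tying together the prompt, the LLM, and the predict call shown above. `EchoLLM` and `TinyChain` are hypothetical stand-ins so the example runs offline; a real chain would wrap an actual model.

```typescript
// Illustrative sketch only: EchoLLM and TinyChain are hypothetical
// stand-ins for a real LLM wrapper and the chain described above.
type Values = Record<string, string>;

class EchoLLM {
  // Stands in for a real model call; just echoes the prompt back.
  async call(prompt: string): Promise<string> {
    return `joke about: ${prompt}`;
  }
}

class TinyChain {
  constructor(private llm: EchoLLM, private outputKey: string = "text") {}

  // Format the prompt with the input values and run the LLM,
  // returning the output as a keyed record.
  async invoke(values: Values): Promise<Values> {
    const prompt = `Tell me a ${values.adjective} joke`;
    return { [this.outputKey]: await this.llm.call(prompt) };
  }

  // predict returns the completion string directly rather than a record.
  async predict(values: Values): Promise<string> {
    const out = await this.invoke(values);
    return out[this.outputKey];
  }
}

const chain = new TinyChain(new EchoLLM());
chain.predict({ adjective: "funny" }).then(console.log);
// logs "joke about: Tell me a funny joke"
```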