Chain to execute tasks.

Hierarchy

Constructors

Properties

llm: any

LLM Wrapper to use

outputKey: string = "text"

Key to use for output; defaults to "text"

prompt: BasePromptTemplate

Prompt object to use

llmKwargs?: any

Kwargs to pass to LLM

memory?: any
outputParser?: any

OutputParser to use
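
A minimal sketch of constructing a chain with these properties; the import paths, model wrapper, and prompt text shown here are assumptions that may vary across versions:

    import { PromptTemplate } from "@langchain/core/prompts";
    import { OpenAI } from "@langchain/openai";
    import { LLMChain } from "langchain/chains";

    // Hypothetical prompt; any template whose input variables match your inputs works.
    const prompt = PromptTemplate.fromTemplate(
      "Tell me a {adjective} joke about {topic}."
    );

    const chain = new LLMChain({
      llm: new OpenAI({ temperature: 0 }), // LLM wrapper to use
      prompt,                              // prompt object to use
      outputKey: "text",                   // key for the chain's output (default: "text")
    });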

Accessors

Methods

  • apply

    Call the chain on all inputs in the list.

    Parameters

    • inputs: ChainValues[]
    • Optional config: any[]

    Returns Promise<ChainValues[]>

    Deprecated

    Use .batch() instead. Will be removed in 0.2.0.
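
    A hedged sketch of the recommended .batch() replacement; the "adjective" input key is illustrative:

    // Assumes `chain` is an LLMChain whose prompt expects an "adjective" input.
    const results = await chain.batch([
      { adjective: "funny" },
      { adjective: "dry" },
    ]);
    // `results` is an array of ChainValues, one per input.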

  • call

    Run the core logic of this chain and add to output if desired.

    Wraps _call and handles memory.

    Parameters

    • values: any
    • Optional config: any

    Returns Promise<ChainValues>
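
    A minimal usage sketch; the input keys are illustrative:

    // Memory (if configured) is loaded before and saved after the underlying _call.
    const output = await chain.call({ adjective: "funny", topic: "llamas" });
    console.log(output.text); // keyed by the chain's outputKey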

  • invoke

    Invokes the chain with the provided input and returns the output.

    Parameters

    • input: ChainValues

      Input values for the chain run.

    • Optional options: any

    Returns Promise<ChainValues>

    Promise that resolves with the output of the chain run.
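
    A minimal usage sketch; the input keys and options shown are illustrative:

    const result = await chain.invoke(
      { adjective: "funny", topic: "llamas" },
      { tags: ["docs-example"] } // optional run configuration
    );
    console.log(result.text); // stored under the chain's outputKey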

  • predict

    Format prompt with values and pass to LLM.

    Parameters

    • values: any

      keys to pass to prompt template

    • Optional callbackManager: any

      CallbackManager to use

    Returns Promise<string>

    Completion from LLM.

    Example

    llm.predict({ adjective: "funny" })
    
  • prepOutputs

    Parameters

    • inputs: Record<string, unknown>
    • outputs: Record<string, unknown>
    • returnOnlyOutputs: boolean = false

    Returns Promise<Record<string, unknown>>

  • run

    Parameters

    • input: any
    • Optional config: any

    Returns Promise<string>

    Deprecated

    Use .invoke() instead. Will be removed in 0.2.0.

  • fromLLM

    A static factory method that creates an instance of TaskExecutionChain. It constructs a task-execution prompt template, which instructs an AI to perform a task based on a given objective while taking previously completed tasks into account, and uses that prompt to build the chain.

    Parameters

    • fields: Omit<LLMChainInput<string, any>, "prompt">

      An object of type LLMChainInput, excluding the "prompt" field.

    Returns LLMChain<string, any>

    An instance of LLMChain.
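
    A hedged sketch of using this factory; the langchain/experimental/babyagi import path and the OpenAI wrapper shown are assumptions that may differ across versions:

    import { OpenAI } from "@langchain/openai";
    import { TaskExecutionChain } from "langchain/experimental/babyagi";

    // Only the LLM is required; the default task-execution prompt is built internally.
    const executionChain = TaskExecutionChain.fromLLM({
      llm: new OpenAI({ temperature: 0 }),
    });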
