Interface ChatBedrockConverseCallOptions

interface ChatBedrockConverseCallOptions {
    additionalModelRequestFields?: DocumentType;
    callbacks?: Callbacks;
    configurable?: Record<string, any>;
    maxConcurrency?: number;
    metadata?: Record<string, unknown>;
    recursionLimit?: number;
    runId?: string;
    runName?: string;
    signal?: AbortSignal;
    stop?: string[];
    streamUsage?: boolean;
    tags?: string[];
    timeout?: number;
    tool_choice?: BedrockConverseToolChoice;
    tools?: ChatBedrockConverseToolType[];
}
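
These options are passed per call, typically as the second argument to invoke, stream, or batch (or bound ahead of time with bind), and apply only to that call. A minimal sketch, assuming the @langchain/aws package and a model ID your account can access:

import { ChatBedrockConverse } from "@langchain/aws";

const model = new ChatBedrockConverse({
  model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
  region: "us-east-1",
});

// Call options apply only to this invocation.
const response = await model.invoke("Tell me a joke.", {
  stop: ["\n\n"],
  timeout: 10_000, // milliseconds
  runName: "bedrock-joke",
});
console.log(response.content);

The sketches under the individual properties below reuse this model instance.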

Hierarchy

  • BaseChatModelCallOptions
  • Pick<ChatBedrockConverseInput, "additionalModelRequestFields" | "streamUsage">
    • ChatBedrockConverseCallOptions

Properties

additionalModelRequestFields?: DocumentType

Additional inference parameters that the model supports, beyond the base set of inference parameters that the Converse API supports in the inferenceConfig field. For more information, see the model parameters documentation for your model in the Amazon Bedrock user guide.
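
For example, Anthropic models on Bedrock accept a top_k sampling parameter that is not part of the shared inferenceConfig. A sketch of forwarding it, assuming the model instance from the sketch above (the field name and value are model-specific; check your model's documentation):

// Fields are forwarded to the model as-is; they are not validated client-side.
const summary = await model.invoke("Summarize the Converse API in one sentence.", {
  additionalModelRequestFields: { top_k: 50 },
});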

callbacks?: Callbacks

Callbacks for this call and any sub-calls (e.g., a Chain calling an LLM). Tags are passed to all callbacks; metadata is passed to handle*Start callbacks.
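
A sketch of attaching an inline callback handler for a single call, assuming the model instance defined above:

const reply = await model.invoke("Hello!", {
  callbacks: [
    {
      handleLLMStart: async (_llm, prompts) => {
        console.log("Prompts sent to Bedrock:", prompts);
      },
      handleLLMEnd: async (output) => {
        console.log("Finished with", output.generations.length, "generation(s)");
      },
    },
  ],
});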

configurable?: Record<string, any>

Runtime values for attributes previously made configurable on this Runnable, or sub-Runnables.

maxConcurrency?: number

Maximum number of parallel calls to make.
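
This is most useful with batch, which fans out one call per input. A sketch that keeps at most two requests to Bedrock in flight at a time, assuming the model instance defined above:

const questions = ["What is S3?", "What is EC2?", "What is Lambda?", "What is IAM?"];

// The four calls run with a concurrency limit of two.
const answers = await model.batch(questions, { maxConcurrency: 2 });
console.log(answers.map((a) => a.content));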

metadata?: Record<string, unknown>

Metadata for this call and any sub-calls (e.g., a Chain calling an LLM). Keys should be strings; values should be JSON-serializable.
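
Metadata (and tags, below) are attached to the tracer run and forwarded to callbacks; they do not change the request sent to Bedrock. A sketch, assuming the model instance defined above and keys that your own tracing setup cares about:

await model.invoke("Draft a welcome email.", {
  metadata: { userId: "user-123", feature: "onboarding" }, // JSON-serializable values
  tags: ["bedrock", "email-drafts"],
});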

recursionLimit?: number

Maximum number of times a call can recurse. If not provided, defaults to 25.

runId?: string

Unique identifier for the tracer run for this call. If not provided, a new UUID will be generated.

runName?: string

Name for the tracer run for this call. Defaults to the name of the class.
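
A sketch that names the tracer run and supplies a run ID up front (the ID must be a UUID, here generated with Node's crypto module), assuming the model instance defined above:

import { randomUUID } from "node:crypto";

await model.invoke("Ping", {
  runId: randomUUID(), // lets you correlate this call with your own records
  runName: "bedrock-healthcheck",
});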

signal?: AbortSignal

Abort signal for this call. If provided, the call will be aborted when the signal is aborted.
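
A sketch that cancels a long-running call after a five-second deadline using an AbortController, assuming the model instance defined above:

const controller = new AbortController();
const deadline = setTimeout(() => controller.abort(), 5_000);

try {
  const essay = await model.invoke("Write a very long essay.", {
    signal: controller.signal,
  });
  console.log(essay.content);
} catch (err) {
  console.log("Call aborted or failed:", err);
} finally {
  clearTimeout(deadline);
}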

stop?: string[]

A list of stop sequences. A stop sequence is a sequence of characters that causes the model to stop generating the response.
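
A sketch that stops generation at the first blank line, so only the opening paragraph is returned, assuming the model instance defined above:

const paragraph = await model.invoke("Explain Amazon Bedrock.", {
  stop: ["\n\n"], // generation halts once the model emits a blank line
});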

streamUsage?: boolean

Whether or not to include usage data, such as token counts, in the streamed response chunks. Passing this as a call option takes precedence over the class-level setting. Defaults to true.
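
A sketch that reads token counts from the streamed chunks; when streamUsage is enabled, the figures arrive on the chunks' usage_metadata field. Assumes the model instance defined above:

const stream = await model.stream("Name three AWS regions.", { streamUsage: true });

for await (const chunk of stream) {
  if (typeof chunk.content === "string") process.stdout.write(chunk.content);
  if (chunk.usage_metadata) {
    console.log("\nTokens in/out:", chunk.usage_metadata.input_tokens, "/",
      chunk.usage_metadata.output_tokens);
  }
}
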
tags?: string[]

Tags for this call and any sub-calls (e.g., a Chain calling an LLM). You can use these to filter calls.

timeout?: number

Timeout for this call in milliseconds.

tool_choice?: BedrockConverseToolChoice

Tool choice for the model. If passing a string, it must be "any", "auto", or the name of the tool to use. Alternatively, pass a BedrockToolChoice object.

If "any" is passed, the model must request at least one tool. If "auto" is passed, the model automatically decides if a tool should be called or whether to generate text instead. If a tool name is passed, it will force the model to call that specific tool.