Optional additionalModelRequestFields
Additional inference parameters that the model supports, beyond the base set of inference parameters that the Converse API supports in the inferenceConfig field. For more information, see the model parameters link below.
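A minimal sketch of passing model-specific parameters through this field, assuming an Anthropic Claude model on Bedrock; the model id, region, and the top_k field are illustrative, and the accepted fields depend on the chosen model. Later sketches in this section reuse this model instance.

```typescript
import { ChatBedrockConverse } from "@langchain/aws";

const model = new ChatBedrockConverse({
  model: "anthropic.claude-3-5-sonnet-20240620-v1:0", // illustrative model id
  region: "us-east-1",
});

// Fields placed here are forwarded to the model outside of inferenceConfig.
// top_k is accepted by Anthropic models on Bedrock; other models use different fields.
const res = await model.invoke("Suggest a name for a coffee shop.", {
  additionalModelRequestFields: { top_k: 50 },
});
```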
Optional callbacks
Callbacks for this call and any sub-calls (eg. a Chain calling an LLM). Tags are passed to all callbacks, metadata is passed to handle*Start callbacks.
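For example, a handler object can be passed inline to observe a single call; this sketch assumes the model instance from the previous example and only logs to the console.

```typescript
const res = await model.invoke("Hello!", {
  callbacks: [
    {
      // Fired when the request starts; receives the serialized model and the prompts.
      handleLLMStart: async (_llm, prompts) => {
        console.log("Prompts:", prompts);
      },
      // Fired when the model finishes generating.
      handleLLMEnd: async (output) => {
        console.log("Generations:", output.generations.length);
      },
    },
  ],
});
```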
Optional configurable
Runtime values for attributes previously made configurable on this Runnable, or sub-Runnables.
Optional maxConcurrency
Maximum number of parallel calls to make.
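A sketch of limiting parallelism when batching several prompts; the prompts are illustrative and the model instance is the one constructed above.

```typescript
const questions = ["What is DNS?", "What is TLS?", "What is BGP?", "What is NAT?"];

// At most two requests are sent to Bedrock at the same time.
const answers = await model.batch(questions, { maxConcurrency: 2 });
console.log(answers.map((a) => a.content));
```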
Optional metadata
Metadata for this call and any sub-calls (eg. a Chain calling an LLM). Keys should be strings, values should be JSON-serializable.
Optional recursionLimit
Maximum number of times a call can recurse. If not provided, defaults to 25.
Optional runId
Unique identifier for the tracer run for this call. If not provided, a new UUID will be generated.
Optional runName
Name for the tracer run for this call. Defaults to the name of the class.
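runId and runName can be set together so a specific call is easy to find in a tracer such as LangSmith. This sketch assumes the uuid package is available and reuses the model instance from above.

```typescript
import { v4 as uuidv4 } from "uuid";

const runId = uuidv4();

const res = await model.invoke("Summarize the release notes.", {
  runId, // keep this id to look the run up later
  runName: "release-notes-summary",
});
```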
Optional signal
Abort signal for this call. If provided, the call will be aborted when the signal is aborted.
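A sketch of cancelling a call with an AbortController after five seconds; the timeout value and prompt are illustrative.

```typescript
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), 5000);

try {
  const res = await model.invoke("Write a long essay about rivers.", {
    signal: controller.signal,
  });
  console.log(res.content);
} finally {
  clearTimeout(timer);
}
```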
Optional stop
A list of stop sequences. A stop sequence is a sequence of characters that causes the model to stop generating the response.
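For instance, a ReAct-style prompt often cuts generation off at the next "Observation:" marker; the prompt and stop sequence below are illustrative.

```typescript
const res = await model.invoke(
  "Thought: I should check the weather.\nAction: get_weather\nAction Input: Lisbon\n",
  { stop: ["Observation:"] }
);
```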
Optional streamUsage
Whether or not to include usage data, like token counts, in the streamed response chunks. Passing this as a call option will take precedence over the class-level setting.
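A sketch of reading token usage from a stream; with streamUsage enabled, the usage counts typically arrive on the final chunk.

```typescript
const stream = await model.stream("Write a haiku about autumn.", {
  streamUsage: true,
});

for await (const chunk of stream) {
  // Content chunks arrive first; usage_metadata is populated on the chunk
  // that carries the token counts.
  if (chunk.usage_metadata) {
    console.log(chunk.usage_metadata);
  }
}
```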
Optional tags
Tags for this call and any sub-calls (eg. a Chain calling an LLM). You can use these to filter calls.
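Tags are often set alongside metadata (described above) so traced runs can be filtered later; the values below are illustrative.

```typescript
const res = await model.invoke("Draft a welcome email.", {
  tags: ["onboarding", "email-drafts"],
  metadata: { customerId: "cust_123", environment: "staging" },
});
```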
Optional timeout
Timeout for this call in milliseconds.
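For example, to fail the call if no response arrives within ten seconds:

```typescript
const res = await model.invoke("List three uses for rhubarb.", {
  timeout: 10_000,
});
```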
Optional tool_choice
Tool choice for the model. If passing a string, it must be "any", "auto", or the name of the tool to use. Or, pass a BedrockToolChoice object.
If "any" is passed, the model must request at least one tool. If "auto" is passed, the model automatically decides if a tool should be called or whether to generate text instead. If a tool name is passed, it will force the model to call that specific tool.
Optional tools
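Tools can also be supplied per call instead of being bound up front; this sketch reuses the get_weather tool from the previous example and lets the model decide whether to call it.

```typescript
const res = await model.invoke("What's the weather in Oslo?", {
  tools: [getWeather],
  tool_choice: "auto",
});
console.log(res.tool_calls);
```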