Integration with the AWS Bedrock Converse API.

Example

import { ChatBedrockConverse } from "@langchain/aws";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatBedrockConverse({
  region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
  credentials: {
    secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY!,
    accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID!,
  },
});

const res = await model.invoke([new HumanMessage("Print hello world")]);

Hierarchy

Implements

Constructors

Properties

client: BedrockRuntimeClient
model: string = "anthropic.claude-3-haiku-20240307-v1:0"

The model to use, for example "anthropic.claude-3-haiku-20240307-v1:0". This is equivalent to the modelId property in the list-foundation-models API. See the link below for a full list of models.

Link

https://docs.aws.amazon.com/bedrock/latest/userguide/model-ids.html#model-ids-arns

Default

anthropic.claude-3-haiku-20240307-v1:0
region: string

The AWS region, e.g. us-west-2. Falls back to the AWS_DEFAULT_REGION environment variable or the region specified in ~/.aws/config if not provided here.

streamUsage: boolean = true

Whether or not to include usage data, like token counts, in the streamed response chunks. Passing this as a call option takes precedence over the class-level setting.

Default

true
streaming: boolean = false

Whether or not to stream responses.

additionalModelRequestFields?: null | string | number | boolean | DocumentType[] | {
    [prop: string]: DocumentType;
}

Additional inference parameters that the model supports, beyond the base set of inference parameters that the Converse API supports in the inferenceConfig field. For more information, see the model parameters link below.

Type declaration

  • [prop: string]: DocumentType
endpointHost?: string

Override the default endpoint hostname.

guardrailConfig?: GuardrailConfiguration

Configuration information for a guardrail that you want to use in the request.

maxTokens?: number = undefined

The maximum number of tokens to generate in the response.

temperature?: number = undefined

The sampling temperature to use.

topP?: number

The percentage of most-likely candidates that the model considers for the next token. For example, if you choose a value of 0.8 for topP, the model selects from the top 80% of the probability distribution of tokens that could be next in the sequence. The default value is the default value for the model that you are using. For more information, see the inference parameters for foundation models link below.

Methods

bindTools

  • Parameters

    • tools: any[]
    • Optional kwargs: Partial<unknown>

    Returns Runnable<BaseLanguageModelInput, AIMessageChunk, this["ParsedCallOptions"]>

Generated using TypeDoc