interface Choice {
    finish_reason:
        | "length"
        | "function_call"
        | "tool_calls"
        | "stop"
        | "content_filter";
    index: number;
    logprobs: null | OpenAIClient.Chat.Completions.ChatCompletion.Choice.Logprobs;
    message: ChatCompletionMessage;
}

Properties

finish_reason:
    | "length"
    | "function_call"
    | "tool_calls"
    | "stop"
    | "content_filter"

The reason the model stopped generating tokens. This will be stop if the model hit a natural stop point or a provided stop sequence, length if the maximum number of tokens specified in the request was reached, content_filter if content was omitted due to a flag from our content filters, tool_calls if the model called a tool, or function_call (deprecated) if the model called a function.
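
The sketch below shows one way a caller might branch on finish_reason; the describeFinish helper is hypothetical and assumes a Choice value taken from a completed (non-streaming) response.

function describeFinish(choice: Choice): string {
    switch (choice.finish_reason) {
        case "stop":
            // Natural stop point or a provided stop sequence.
            return "stopped naturally";
        case "length":
            // The maximum token count specified in the request was reached.
            return "truncated at the token limit";
        case "content_filter":
            // Content was omitted due to a content-filter flag.
            return "filtered";
        case "tool_calls":
            // The model requested one or more tool calls.
            return "called tools";
        case "function_call":
            // Deprecated: the model called a function.
            return "called a function (deprecated)";
    }
}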

index: number

The index of the choice in the list of choices.

logprobs: null | OpenAIClient.Chat.Completions.ChatCompletion.Choice.Logprobs

Log probability information for the choice.

message: ChatCompletionMessage

A chat completion message generated by the model.
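
As a minimal, hypothetical sketch of how these fields fit together, the helper below reads the first choice from an array of Choice values (for example, the choices array of a chat completion response) and returns its message content; the firstChoiceText name and the assumption that message.content is a nullable string are not part of the reference above.

function firstChoiceText(choices: Choice[]): string | null {
    // index identifies the choice's position in the list of choices.
    const first = choices.find((c) => c.index === 0);
    if (!first) return null;
    // logprobs is null unless log probabilities were requested.
    if (first.logprobs !== null) {
        console.log("Log probability information is available for this choice.");
    }
    // message is the chat completion message generated by the model.
    return first.message.content ?? null;
}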