
ChatAnthropic

Anthropic is an AI safety and research company. They are the creators of Claude.

This will help you get started with Anthropic chat models. For detailed documentation of all ChatAnthropic features and configurations, head to the API reference.

Overview

Integration details

Class | Package | Local | Serializable | PY support | Package downloads | Package latest
ChatAnthropic | @langchain/anthropic |  |  |  | NPM - Downloads | NPM - Version

Model features

See the links in the table headers below for guides on how to use specific features.

Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Token usage | Logprobs

Setup

You'll need to sign up for an Anthropic API key and install the @langchain/anthropic integration package.

Credentials

Head to Anthropic's website to sign up for Anthropic and generate an API key. Once you've done this, set the ANTHROPIC_API_KEY environment variable:

export ANTHROPIC_API_KEY="your-api-key"
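Alternatively, you can pass the key directly when constructing the model instead of relying on the environment variable. This is a minimal sketch; the apiKey constructor option is assumed here, so check the constructor options in the API reference:

import { ChatAnthropic } from "@langchain/anthropic";

// Pass the key explicitly rather than reading ANTHROPIC_API_KEY from the environment.
// The `apiKey` option is an assumption; verify it against the ChatAnthropic constructor docs.
const llmWithKey = new ChatAnthropic({
  model: "claude-3-haiku-20240307",
  apiKey: "your-api-key",
});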

If you want automated tracing of your model calls, you can also set your LangSmith API key by uncommenting the lines below:

# export LANGCHAIN_TRACING_V2="true"
# export LANGCHAIN_API_KEY="your-api-key"

Installation

The LangChain ChatAnthropic integration lives in the @langchain/anthropic package:

Tip

See this section for general instructions on installing integration packages.

yarn add @langchain/anthropic @langchain/core
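If you use npm or pnpm instead of yarn, the equivalent install commands would be:

npm install @langchain/anthropic @langchain/core
pnpm add @langchain/anthropic @langchain/core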

Instantiation

Now we can instantiate our model object and generate chat completions:

import { ChatAnthropic } from "@langchain/anthropic";

const llm = new ChatAnthropic({
model: "claude-3-haiku-20240307",
temperature: 0,
maxTokens: undefined,
maxRetries: 2,
// other params...
});

Invocation

const aiMsg = await llm.invoke([
[
"system",
"You are a helpful assistant that translates English to French. Translate the user sentence.",
],
["human", "I love programming."],
]);
aiMsg;
AIMessage {
"id": "msg_013WBXXiggy6gMbAUY6NpsuU",
"content": "Voici la traduction en français :\n\nJ'adore la programmation.",
"additional_kwargs": {
"id": "msg_013WBXXiggy6gMbAUY6NpsuU",
"type": "message",
"role": "assistant",
"model": "claude-3-haiku-20240307",
"stop_reason": "end_turn",
"stop_sequence": null,
"usage": {
"input_tokens": 29,
"output_tokens": 20
}
},
"response_metadata": {
"id": "msg_013WBXXiggy6gMbAUY6NpsuU",
"model": "claude-3-haiku-20240307",
"stop_reason": "end_turn",
"stop_sequence": null,
"usage": {
"input_tokens": 29,
"output_tokens": 20
},
"type": "message",
"role": "assistant"
},
"tool_calls": [],
"invalid_tool_calls": [],
"usage_metadata": {
"input_tokens": 29,
"output_tokens": 20,
"total_tokens": 49
}
}
console.log(aiMsg.content);
Voici la traduction en français :

J'adore la programmation.
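Since ChatAnthropic supports token-level streaming (see the feature table above), you can also consume the response incrementally. A minimal sketch using the standard .stream() runnable method:

const stream = await llm.stream([
  [
    "system",
    "You are a helpful assistant that translates English to French. Translate the user sentence.",
  ],
  ["human", "I love programming."],
]);

// Each chunk is an AIMessageChunk containing a fragment of the response text.
for await (const chunk of stream) {
  console.log(chunk.content);
}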

Chaining

We can chain our model with a prompt template like so:

import { ChatPromptTemplate } from "@langchain/core/prompts";

const prompt = ChatPromptTemplate.fromMessages([
[
"system",
"You are a helpful assistant that translates {input_language} to {output_language}.",
],
["human", "{input}"],
]);

const chain = prompt.pipe(llm);
await chain.invoke({
input_language: "English",
output_language: "German",
input: "I love programming.",
});
AIMessage {
"id": "msg_01Ca52fpd1mcGRhH4spzAWr4",
"content": "Ich liebe das Programmieren.",
"additional_kwargs": {
"id": "msg_01Ca52fpd1mcGRhH4spzAWr4",
"type": "message",
"role": "assistant",
"model": "claude-3-haiku-20240307",
"stop_reason": "end_turn",
"stop_sequence": null,
"usage": {
"input_tokens": 23,
"output_tokens": 11
}
},
"response_metadata": {
"id": "msg_01Ca52fpd1mcGRhH4spzAWr4",
"model": "claude-3-haiku-20240307",
"stop_reason": "end_turn",
"stop_sequence": null,
"usage": {
"input_tokens": 23,
"output_tokens": 11
},
"type": "message",
"role": "assistant"
},
"tool_calls": [],
"invalid_tool_calls": [],
"usage_metadata": {
"input_tokens": 23,
"output_tokens": 11,
"total_tokens": 34
}
}

Content blocks

One key difference to note between Anthropic models and most others is that the contents of a single Anthropic AI message can either be a single string or a **list of content blocks**. For example, when an Anthropic model invokes a tool, the tool invocation is part of the message content (as well as being exposed in the standardized AIMessage.tool_calls field):

import { ChatAnthropic } from "@langchain/anthropic";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";

const calculatorSchema = z.object({
operation: z
.enum(["add", "subtract", "multiply", "divide"])
.describe("The type of operation to execute."),
number1: z.number().describe("The first number to operate on."),
number2: z.number().describe("The second number to operate on."),
});

const calculatorTool = {
name: "calculator",
description: "A simple calculator tool",
input_schema: zodToJsonSchema(calculatorSchema),
};

const toolCallingLlm = new ChatAnthropic({
model: "claude-3-haiku-20240307",
}).bindTools([calculatorTool]);

const toolPrompt = ChatPromptTemplate.fromMessages([
[
"system",
"You are a helpful assistant who always needs to use a calculator.",
],
["human", "{input}"],
]);

// Chain your prompt and model together
const toolCallChain = toolPrompt.pipe(toolCallingLlm);

await toolCallChain.invoke({
input: "What is 2 + 2?",
});
AIMessage {
"id": "msg_01DZGs9DyuashaYxJ4WWpWUP",
"content": [
{
"type": "text",
"text": "Here is the calculation for 2 + 2:"
},
{
"type": "tool_use",
"id": "toolu_01SQXBamkBr6K6NdHE7GWwF8",
"name": "calculator",
"input": {
"number1": 2,
"number2": 2,
"operation": "add"
}
}
],
"additional_kwargs": {
"id": "msg_01DZGs9DyuashaYxJ4WWpWUP",
"type": "message",
"role": "assistant",
"model": "claude-3-haiku-20240307",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 449,
"output_tokens": 100
}
},
"response_metadata": {
"id": "msg_01DZGs9DyuashaYxJ4WWpWUP",
"model": "claude-3-haiku-20240307",
"stop_reason": "tool_use",
"stop_sequence": null,
"usage": {
"input_tokens": 449,
"output_tokens": 100
},
"type": "message",
"role": "assistant"
},
"tool_calls": [
{
"name": "calculator",
"args": {
"number1": 2,
"number2": 2,
"operation": "add"
},
"id": "toolu_01SQXBamkBr6K6NdHE7GWwF8",
"type": "tool_call"
}
],
"invalid_tool_calls": [],
"usage_metadata": {
"input_tokens": 449,
"output_tokens": 100,
"total_tokens": 549
}
}
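To illustrate the difference, here's a short sketch showing how to read both the raw content blocks and the standardized tool call field from a result like the one above, reusing the toolCallChain defined earlier:

const toolMsg = await toolCallChain.invoke({ input: "What is 2 + 2?" });

// The raw Anthropic content blocks (text and tool_use) live on `content`...
console.log(toolMsg.content);

// ...while the parsed tool calls are also exposed on the standardized field:
console.log(toolMsg.tool_calls);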

Custom headers

You can pass custom headers in your requests like this:

import { ChatAnthropic } from "@langchain/anthropic";

const llmWithCustomHeaders = new ChatAnthropic({
model: "claude-3-sonnet-20240229",
maxTokens: 1024,
clientOptions: {
defaultHeaders: {
"X-Api-Key": process.env.ANTHROPIC_API_KEY,
},
},
});

await llmWithCustomHeaders.invoke("Why is the sky blue?");
AIMessage {
"id": "msg_019z4nWpShzsrbSHTWXWQh6z",
"content": "The sky appears blue due to a phenomenon called Rayleigh scattering. Here's a brief explanation:\n\n1) Sunlight is made up of different wavelengths of visible light, including all the colors of the rainbow.\n\n2) As sunlight passes through the atmosphere, the gases (mostly nitrogen and oxygen) cause the shorter wavelengths of light, such as violet and blue, to be scattered more easily than the longer wavelengths like red and orange.\n\n3) This scattering of the shorter blue wavelengths occurs in all directions by the gas molecules in the atmosphere.\n\n4) Our eyes are more sensitive to the scattered blue light than the scattered violet light, so we perceive the sky as having a blue color.\n\n5) The scattering is more pronounced for light traveling over longer distances through the atmosphere. This is why the sky appears even darker blue when looking towards the horizon.\n\nSo in essence, the selective scattering of the shorter blue wavelengths of sunlight by the gases in the atmosphere is what causes the sky to appear blue to our eyes during the daytime.",
"additional_kwargs": {
"id": "msg_019z4nWpShzsrbSHTWXWQh6z",
"type": "message",
"role": "assistant",
"model": "claude-3-sonnet-20240229",
"stop_reason": "end_turn",
"stop_sequence": null,
"usage": {
"input_tokens": 13,
"output_tokens": 236
}
},
"response_metadata": {
"id": "msg_019z4nWpShzsrbSHTWXWQh6z",
"model": "claude-3-sonnet-20240229",
"stop_reason": "end_turn",
"stop_sequence": null,
"usage": {
"input_tokens": 13,
"output_tokens": 236
},
"type": "message",
"role": "assistant"
},
"tool_calls": [],
"invalid_tool_calls": [],
"usage_metadata": {
"input_tokens": 13,
"output_tokens": 236,
"total_tokens": 249
}
}

Prompt caching

Compatibility

This feature is currently in beta.

Anthropic supports caching parts of your prompt in order to reduce costs for use cases that require long context. You can cache tools as well as both entire messages and individual blocks.

An initial request containing one or more blocks or tool definitions with a "cache_control": { "type": "ephemeral" } field will automatically cache that part of the prompt. This initial caching step costs extra, but subsequent requests are billed at a reduced rate. The cache has a lifetime of 5 minutes, and it is refreshed each time the cache is hit.

There is also currently a minimum cacheable prompt length, which varies by model. You can see this information here.

This currently requires you to initialize your model with a beta header. Here's an example of caching part of a system message that contains the LangChain conceptual docs:

let CACHED_TEXT = "...";
import { ChatAnthropic } from "@langchain/anthropic";
import { HumanMessage, SystemMessage } from "@langchain/core/messages";

const modelWithCaching = new ChatAnthropic({
model: "claude-3-haiku-20240307",
clientOptions: {
defaultHeaders: {
"anthropic-beta": "prompt-caching-2024-07-31",
},
},
});

const LONG_TEXT = `You are a pirate. Always respond in pirate dialect.

Use the following as context when answering questions:

${CACHED_TEXT}`;

const messages = [
new SystemMessage({
content: [
{
type: "text",
text: LONG_TEXT,
// Tell Anthropic to cache this block
cache_control: { type: "ephemeral" },
},
],
}),
new HumanMessage({
content: "What types of messages are supported in LangChain?",
}),
];

const res = await modelWithCaching.invoke(messages);

console.log("USAGE:", res.response_metadata.usage);
USAGE: {
input_tokens: 19,
cache_creation_input_tokens: 2925,
cache_read_input_tokens: 0,
output_tokens: 327
}

We can see that the raw usage field returned by Anthropic contains a new field called cache_creation_input_tokens.

If we use the same messages again, we can see that the input tokens for the long text are read from the cache:

const res2 = await modelWithCaching.invoke(messages);

console.log("USAGE:", res2.response_metadata.usage);
USAGE: {
input_tokens: 19,
cache_creation_input_tokens: 0,
cache_read_input_tokens: 2925,
output_tokens: 250
}

Tool caching

You can also cache tools by setting the same "cache_control": { "type": "ephemeral" } field within a tool definition. This currently requires you to bind tools in Anthropic's raw tool format. Here's an example:

const SOME_LONG_DESCRIPTION = "...";

// Tool in Anthropic format
const anthropicTools = [
{
name: "get_weather",
description: SOME_LONG_DESCRIPTION,
input_schema: {
type: "object",
properties: {
location: {
type: "string",
description: "Location to get the weather for",
},
unit: {
type: "string",
description: "Temperature unit to return",
},
},
required: ["location"],
},
// Tell Anthropic to cache this tool
cache_control: { type: "ephemeral" },
},
];

const modelWithCachedTools = modelWithCaching.bindTools(anthropicTools);

await modelWithCachedTools.invoke("what is the weather in SF?");

See Anthropic's docs for more information on how prompt caching works.

Custom clients

Anthropic models may be hosted on cloud services such as Google Vertex that rely on a different underlying client with the same interface as the primary Anthropic client. You can access these services by providing a createClient method that returns an initialized instance of an Anthropic client. Here's an example:

import { AnthropicVertex } from "@anthropic-ai/vertex-sdk";
import { ChatAnthropic } from "@langchain/anthropic";

const customClient = new AnthropicVertex();

const modelWithCustomClient = new ChatAnthropic({
modelName: "claude-3-sonnet@20240229",
maxRetries: 0,
createClient: () => customClient,
});

await modelWithCustomClient.invoke([{ role: "user", content: "Hello!" }]);

API reference

For detailed documentation of all ChatAnthropic features and configurations, head to the API reference: https://api.js.langchain.com/classes/langchain_anthropic.ChatAnthropic.html

