ChatGooglePaLM
You can integrate the Google PaLM API by first installing the required packages:
Tip
- npm
- Yarn
- pnpm
npm install google-auth-library @google-ai/generativelanguage @langchain/community
yarn add google-auth-library @google-ai/generativelanguage @langchain/community
pnpm add google-auth-library @google-ai/generativelanguage @langchain/community
Tip

We're unifying model parameters across all packages. We now suggest using model instead of modelName, and apiKey for API keys.
Create an API key from Google MakerSuite. You can then set the key as the GOOGLE_PALM_API_KEY environment variable, or pass it as the apiKey parameter when instantiating the model.
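For example, setting the environment variable in a POSIX shell might look like this (the key value is a placeholder):

```shell
# Make the key available to your Node process (placeholder value)
export GOOGLE_PALM_API_KEY="your-api-key-here"
```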
import { ChatGooglePaLM } from "@langchain/community/chat_models/googlepalm";
import {
AIMessage,
HumanMessage,
SystemMessage,
} from "@langchain/core/messages";
export const run = async () => {
const model = new ChatGooglePaLM({
apiKey: "<YOUR API KEY>", // or set it in environment variable as `GOOGLE_PALM_API_KEY`
temperature: 0.7, // OPTIONAL
model: "models/chat-bison-001", // OPTIONAL
topK: 40, // OPTIONAL
topP: 1, // OPTIONAL
examples: [
// OPTIONAL
{
input: new HumanMessage("What is your favorite sock color?"),
output: new AIMessage("My favorite sock color be arrrr-ange!"),
},
],
});
// ask questions
const questions = [
new SystemMessage(
"You are a funny assistant that answers in pirate language."
),
new HumanMessage("What is your favorite food?"),
];
// You can also use the model as part of a chain
const res = await model.invoke(questions);
console.log({ res });
};
API Reference
- ChatGooglePaLM from @langchain/community/chat_models/googlepalm
- AIMessage from @langchain/core/messages
- HumanMessage from @langchain/core/messages
- SystemMessage from @langchain/core/messages
ChatGoogleVertexAI
LangChain.js supports Google Vertex AI chat models as an integration. It supports two different methods of authentication depending on whether you're running in a Node environment or a web environment.
Setup
Node
To call Vertex AI models in Node, you'll need to install Google's official auth client as a peer dependency.
You should make sure the Vertex AI API is enabled for the relevant project, and that you've authenticated to Google Cloud using one of these methods:
- You are logged into an account (using gcloud auth application-default login) permitted to that project.
- You are running on a machine using a service account that is permitted to the project.
- You have downloaded the credentials for a service account that is permitted to the project, and set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of this file.
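As a sketch, the third option might look like this in a POSIX shell (the path is a placeholder):

```shell
# Point Google's auth client at a downloaded service account key (placeholder path)
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"
```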
Tip
- npm
- Yarn
- pnpm
npm install google-auth-library @langchain/community
yarn add google-auth-library @langchain/community
pnpm add google-auth-library @langchain/community
Web
To call Vertex AI models in a web environment (like Edge functions), you'll need to install the web-auth-library package as a peer dependency:
- npm
- Yarn
- pnpm
npm install web-auth-library
yarn add web-auth-library
pnpm add web-auth-library
You should then add your service account credentials directly as a GOOGLE_VERTEX_AI_WEB_CREDENTIALS environment variable:
GOOGLE_VERTEX_AI_WEB_CREDENTIALS={"type":"service_account","project_id":"YOUR_PROJECT-12345",...}
You can also pass your credentials directly in code like this:
import { ChatGoogleVertexAI } from "@langchain/community/chat_models/googlevertexai";
const model = new ChatGoogleVertexAI({
authOptions: {
credentials: {"type":"service_account","project_id":"YOUR_PROJECT-12345",...},
},
});
Usage
Several models are available and can be specified by the model attribute in the constructor. These include:
- chat-bison (default)
- chat-bison-32k
The ChatGoogleVertexAI class works just like other chat-based LLMs, with a few exceptions:
- The first SystemMessage passed in is mapped to the "context" parameter that the PaLM model expects. No other SystemMessages are allowed.
- After the first SystemMessage, there must be an odd number of messages, representing a conversation between a human and the model.
- Human messages must alternate with AI messages.
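As a minimal sketch of these ordering rules (using the tuple shorthand that also appears in the streaming example; isValidConversation is a hypothetical helper for illustration, not part of @langchain/community):

```typescript
// A conversation that satisfies ChatGoogleVertexAI's ordering rules:
// one leading system message, then alternating human/ai turns with an
// odd total, ending on a human turn for the model to answer.
const conversation: [string, string][] = [
  ["system", "You are a helpful assistant."],
  ["human", "What is 2 + 2?"],
  ["ai", "2 + 2 is 4."],
  ["human", "And doubled?"],
];

function isValidConversation(messages: [string, string][]): boolean {
  const [first, ...turns] = messages;
  return (
    first[0] === "system" &&
    turns.length % 2 === 1 && // odd number of messages after the system message
    turns.every(([role], i) => role === (i % 2 === 0 ? "human" : "ai"))
  );
}

console.log(isValidConversation(conversation)); // true
```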
import { ChatGoogleVertexAI } from "@langchain/community/chat_models/googlevertexai";
// Or, if using the web entrypoint:
// import { ChatGoogleVertexAI } from "@langchain/community/chat_models/googlevertexai/web";
const model = new ChatGoogleVertexAI({
temperature: 0.7,
});
API Reference
- ChatGoogleVertexAI from @langchain/community/chat_models/googlevertexai
Streaming

ChatGoogleVertexAI also supports streaming multiple chunks for faster responses:
import { ChatGoogleVertexAI } from "@langchain/community/chat_models/googlevertexai";
// Or, if using the web entrypoint:
// import { ChatGoogleVertexAI } from "@langchain/community/chat_models/googlevertexai/web";
const model = new ChatGoogleVertexAI({
temperature: 0.7,
});
const stream = await model.stream([
["system", "You are a funny assistant that answers in pirate language."],
["human", "What is your favorite food?"],
]);
for await (const chunk of stream) {
console.log(chunk);
}
/*
AIMessageChunk {
content: ' Ahoy there, matey! My favorite food be fish, cooked any way ye ',
additional_kwargs: {}
}
AIMessageChunk {
content: 'like!',
additional_kwargs: {}
}
AIMessageChunk {
content: '',
name: undefined,
additional_kwargs: {}
}
*/
API Reference
- ChatGoogleVertexAI from @langchain/community/chat_models/googlevertexai
Examples

There is also an optional examples constructor parameter that can help the model understand what an appropriate response looks like.
import { ChatGoogleVertexAI } from "@langchain/community/chat_models/googlevertexai";
import {
AIMessage,
HumanMessage,
SystemMessage,
} from "@langchain/core/messages";
// Or, if using the web entrypoint:
// import { ChatGoogleVertexAI } from "@langchain/community/chat_models/googlevertexai/web";
const examples = [
{
input: new HumanMessage("What is your favorite sock color?"),
output: new AIMessage("My favorite sock color be arrrr-ange!"),
},
];
const model = new ChatGoogleVertexAI({
temperature: 0.7,
examples,
});
const questions = [
new SystemMessage(
"You are a funny assistant that answers in pirate language."
),
new HumanMessage("What is your favorite food?"),
];
// You can also use the model as part of a chain
const res = await model.invoke(questions);
console.log({ res });
API Reference
- ChatGoogleVertexAI from @langchain/community/chat_models/googlevertexai
- AIMessage from @langchain/core/messages
- HumanMessage from @langchain/core/messages
- SystemMessage from @langchain/core/messages