
SerpAPI

SerpAPI allows you to integrate search engine results into your LLM applications.

This guide provides a quick overview to help you get started with the SerpAPI tool. For detailed documentation of all SerpAPI features and configuration options, head to the API reference.

Overview

Integration details

Class: SerpAPI
Package: @langchain/community

Setup

The integration lives in the @langchain/community package, which you can install as shown below:

yarn add @langchain/community @langchain/core

Credentials

Set up an API key here and set it as an environment variable named SERPAPI_API_KEY.

process.env.SERPAPI_API_KEY = "YOUR_API_KEY";
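Equivalently, you can export the variable in your shell before starting your application (standard shell syntax, nothing LangChain-specific):

```shell
# Make the key available to any Node.js process started from this shell
export SERPAPI_API_KEY="YOUR_API_KEY"
```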

It's also helpful (but not required) to set up LangSmith for best-in-class observability:

process.env.LANGSMITH_TRACING = "true";
process.env.LANGSMITH_API_KEY = "your-api-key";

Instantiation

You can import and instantiate an instance of the SerpAPI tool as follows:

import { SerpAPI } from "@langchain/community/tools/serpapi";

const tool = new SerpAPI();

Invocation

Invoke directly with args

You can invoke the tool directly like this:

await tool.invoke({
  input: "what is the current weather in SF?",
});
{"type":"weather_result","temperature":"63","unit":"Fahrenheit","precipitation":"3%","humidity":"91%","wind":"5 mph","location":"San Francisco, CA","date":"Sunday 9:00 AM","weather":"Mostly cloudy"}

Invoke with ToolCall

We can also invoke the tool with a model-generated ToolCall, in which case a ToolMessage will be returned:

// This is usually generated by a model, but we'll create a tool call directly for demo purposes.
const modelGeneratedToolCall = {
  args: {
    input: "what is the current weather in SF?",
  },
  id: "1",
  name: tool.name,
  type: "tool_call",
};

await tool.invoke(modelGeneratedToolCall);
ToolMessage {
  "content": "{\"type\":\"weather_result\",\"temperature\":\"63\",\"unit\":\"Fahrenheit\",\"precipitation\":\"3%\",\"humidity\":\"91%\",\"wind\":\"5 mph\",\"location\":\"San Francisco, CA\",\"date\":\"Sunday 9:00 AM\",\"weather\":\"Mostly cloudy\"}",
  "name": "search",
  "additional_kwargs": {},
  "response_metadata": {},
  "tool_call_id": "1"
}

Chaining

We can use our tool in a chain by first binding it to a tool-calling model and then calling it:

Pick your chat model:

Install dependencies

yarn add @langchain/groq

Add environment variables

GROQ_API_KEY=your-api-key

Instantiate the model

import { ChatGroq } from "@langchain/groq";

const llm = new ChatGroq({
  model: "llama-3.3-70b-versatile",
  temperature: 0,
});
import { HumanMessage } from "@langchain/core/messages";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { RunnableLambda } from "@langchain/core/runnables";

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  ["placeholder", "{messages}"],
]);

const llmWithTools = llm.bindTools([tool]);

const chain = prompt.pipe(llmWithTools);

const toolChain = RunnableLambda.from(async (userInput: string, config) => {
  const humanMessage = new HumanMessage(userInput);
  const aiMsg = await chain.invoke(
    {
      messages: [humanMessage],
    },
    config
  );
  const toolMsgs = await tool.batch(aiMsg.tool_calls, config);
  return chain.invoke(
    {
      messages: [humanMessage, aiMsg, ...toolMsgs],
    },
    config
  );
});

const toolChainResult = await toolChain.invoke(
  "what is the current weather in sf?"
);
const { tool_calls, content } = toolChainResult;

console.log(
  "AIMessage",
  JSON.stringify(
    {
      tool_calls,
      content,
    },
    null,
    2
  )
);
AIMessage {
  "tool_calls": [],
  "content": "The current weather in San Francisco is mostly cloudy, with a temperature of 64°F. The humidity is at 90%, there is a 3% chance of precipitation, and the wind is blowing at 5 mph."
}

Agents

For guides on how to use LangChain tools in agents, see the LangGraph.js documentation.

API reference

For detailed documentation of all SerpAPI features and configuration options, head to the API reference: https://api.js.langchain.com/classes/_langchain_community.tools_serpapi.SerpAPI.html


此页对您有帮助吗?


您也可以留下详细的反馈 在 GitHub 上.