
DuckDuckGoSearch

This notebook provides a quick overview for getting started with DuckDuckGoSearch. For detailed documentation of all DuckDuckGoSearch features and configurations, head to the API reference.

DuckDuckGoSearch offers a privacy-focused search API designed for LLM agents. It integrates seamlessly with a variety of data sources, prioritizing user privacy and relevant search results.

Overview

Integration details

Class: DuckDuckGoSearch
Package: @langchain/community (latest version on NPM)

Setup

The integration lives in the @langchain/community package, along with the duck-duck-scrape dependency.

Tip

See this section for general instructions on installing integration packages.

yarn add @langchain/community duck-duck-scrape

Credentials

For best-in-class observability, it's recommended (though not required) that you set up LangSmith:

process.env.LANGCHAIN_TRACING_V2 = "true";
process.env.LANGCHAIN_API_KEY = "your-api-key";

Instantiation

You can instantiate an instance of the DuckDuckGoSearch tool like this:

import { DuckDuckGoSearch } from "@langchain/community/tools/duckduckgo_search";

const tool = new DuckDuckGoSearch({ maxResults: 1 });

Invocation

Invoke directly with args

await tool.invoke("what is the current weather in sf?");
[{"title":"San Francisco, CA Current Weather | AccuWeather","link":"https://www.accuweather.com/en/us/san-francisco/94103/current-weather/347629","snippet":"<b>Current</b> <b>weather</b> <b>in</b> San Francisco, CA. Check <b>current</b> conditions in San Francisco, CA with radar, hourly, and more."}]
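Note that the tool returns its results as a JSON-encoded string rather than a parsed object. A minimal sketch of turning that string into typed records follows; the `SearchResult` interface is our own illustration (not a library export), with field names taken from the output above:

```typescript
// Each entry in the tool's output string has title, link, and snippet fields.
interface SearchResult {
  title: string;
  link: string;
  snippet: string;
}

// Sample output string copied from the invocation above (snippet shortened).
const raw =
  '[{"title":"San Francisco, CA Current Weather | AccuWeather","link":"https://www.accuweather.com/en/us/san-francisco/94103/current-weather/347629","snippet":"Current weather in San Francisco, CA."}]';

// Parse once, then work with plain typed objects.
const results: SearchResult[] = JSON.parse(raw);
console.log(results[0].title); // → San Francisco, CA Current Weather | AccuWeather
console.log(results[0].link);
```

Parsing up front like this keeps downstream code from re-parsing the string each time a field is needed.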

Invoke with ToolCall

We can also invoke the tool with a model-generated ToolCall, in which case a ToolMessage will be returned:

// This is usually generated by a model, but we'll create a tool call directly for demo purposes.
const modelGeneratedToolCall = {
  args: {
    input: "what is the current weather in sf?",
  },
  id: "tool_call_id",
  name: tool.name,
  type: "tool_call",
};
await tool.invoke(modelGeneratedToolCall);
ToolMessage {
  "content": "[{\"title\":\"San Francisco, CA Weather Conditions | Weather Underground\",\"link\":\"https://www.wunderground.com/weather/us/ca/san-francisco\",\"snippet\":\"San Francisco <b>Weather</b> Forecasts. <b>Weather</b> Underground provides local & long-range <b>weather</b> forecasts, weatherreports, maps & tropical <b>weather</b> conditions for the San Francisco area.\"}]",
  "name": "duckduckgo-search",
  "additional_kwargs": {},
  "response_metadata": {},
  "tool_call_id": "tool_call_id"
}
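The shape of the call object matters: `tool.invoke` accepts either a plain string or a tool-call object, and the returned ToolMessage echoes the call's `id` back as `tool_call_id` so the model can correlate results with calls. A small self-contained sketch of that shape (the `ToolCallShape` type here is our own illustration; LangChain exports a similar `ToolCall` type from @langchain/core):

```typescript
// Illustrative type for the tool-call object used above (not a library export).
type ToolCallShape = {
  args: Record<string, unknown>; // arguments the model chose for the tool
  id: string;                    // correlates the resulting ToolMessage back to this call
  name: string;                  // must match the tool's name
  type: "tool_call";
};

const call: ToolCallShape = {
  args: { input: "what is the current weather in sf?" },
  id: "tool_call_id",
  name: "duckduckgo-search",
  type: "tool_call",
};

console.log(call.id);
```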

Chaining

We can use our tool in a chain by first binding it to a tool-calling model and then calling it:

Pick your chat model:

Install dependencies

yarn add @langchain/openai 

Add environment variables

OPENAI_API_KEY=your-api-key

Instantiate the model

import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  model: "gpt-4o-mini",
  temperature: 0,
});

import { HumanMessage } from "@langchain/core/messages";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { RunnableLambda } from "@langchain/core/runnables";

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  ["placeholder", "{messages}"],
]);

const llmWithTools = llm.bindTools([tool]);

const chain = prompt.pipe(llmWithTools);

const toolChain = RunnableLambda.from(async (userInput: string, config) => {
  const humanMessage = new HumanMessage(userInput);
  const aiMsg = await chain.invoke(
    {
      messages: [humanMessage],
    },
    config
  );
  const toolMsgs = await tool.batch(aiMsg.tool_calls, config);
  return chain.invoke(
    {
      messages: [humanMessage, aiMsg, ...toolMsgs],
    },
    config
  );
});

const toolChainResult = await toolChain.invoke(
  "how many people have climbed mount everest?"
);
const { tool_calls, content } = toolChainResult;

console.log(
  "AIMessage",
  JSON.stringify(
    {
      tool_calls,
      content,
    },
    null,
    2
  )
);
AIMessage {
  "tool_calls": [],
  "content": "As of December 2023, a total of 6,664 different people have reached the summit of Mount Everest."
}

Agents

For guides on how to use LangChain tools in agents, see the LangGraph.js documentation.

API reference

For detailed documentation of all DuckDuckGoSearch features and configurations, head to the API reference.
