How to cancel execution
Prerequisites
This guide assumes familiarity with the following concepts:
When building longer-running chains or LangGraph agents, you may want to interrupt execution in certain situations, for example if a user leaves your app or submits a new query.
LangChain Expression Language (LCEL) supports aborting runnables that are in progress via a runtime signal option.
Compatibility
Built-in signal support requires @langchain/core>=0.2.20. For a guide on upgrading, see the guide on installing integration packages.
Note: Individual integrations like chat models or retrievers may have missing or differing implementations for aborting execution. Signal support as described in this guide will apply between steps of a chain.
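Under the hood, this relies on the standard Web AbortController/AbortSignal API that LangChain's runtime options accept. As a minimal standalone sketch of the pattern (no LangChain involved; the sleep helper is illustrative, not a library function):

```typescript
// A toy async step that rejects as soon as the signal fires.
// Chain steps behave analogously when given a signal at runtime.
function sleep(ms: number, signal?: AbortSignal): Promise<void> {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(resolve, ms);
    signal?.addEventListener("abort", () => {
      clearTimeout(timer);
      reject(new Error("Aborted"));
    });
  });
}

const demoController = new AbortController();
setTimeout(() => demoController.abort(), 100);

try {
  await sleep(5000, demoController.signal);
} catch (e) {
  console.log((e as Error).message); // "Aborted" after ~100ms, not 5s
}
```

The same controller-plus-signal wiring is what the chain examples below use.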
To see how this works, construct a chain like the one below that performs retrieval-augmented generation. It answers questions by first searching the web using Tavily, then passing the results to a chat model to generate a final answer.
Pick your chat model
- OpenAI
- Anthropic
- FireworksAI
- MistralAI
- Groq
- VertexAI
Install dependencies
Tip
- npm
- yarn
- pnpm
npm i @langchain/openai
yarn add @langchain/openai
pnpm add @langchain/openai
Add environment variables
OPENAI_API_KEY=your-api-key
Instantiate the model
import { ChatOpenAI } from "@langchain/openai";
const model = new ChatOpenAI({
model: "gpt-4o-mini",
temperature: 0
});
Install dependencies
Tip
- npm
- yarn
- pnpm
npm i @langchain/anthropic
yarn add @langchain/anthropic
pnpm add @langchain/anthropic
Add environment variables
ANTHROPIC_API_KEY=your-api-key
Instantiate the model
import { ChatAnthropic } from "@langchain/anthropic";
const model = new ChatAnthropic({
model: "claude-3-5-sonnet-20240620",
temperature: 0
});
Install dependencies
Tip
- npm
- yarn
- pnpm
npm i @langchain/community
yarn add @langchain/community
pnpm add @langchain/community
Add environment variables
FIREWORKS_API_KEY=your-api-key
Instantiate the model
import { ChatFireworks } from "@langchain/community/chat_models/fireworks";
const model = new ChatFireworks({
model: "accounts/fireworks/models/llama-v3p1-70b-instruct",
temperature: 0
});
Install dependencies
Tip
- npm
- yarn
- pnpm
npm i @langchain/mistralai
yarn add @langchain/mistralai
pnpm add @langchain/mistralai
Add environment variables
MISTRAL_API_KEY=your-api-key
Instantiate the model
import { ChatMistralAI } from "@langchain/mistralai";
const model = new ChatMistralAI({
model: "mistral-large-latest",
temperature: 0
});
Install dependencies
Tip
- npm
- yarn
- pnpm
npm i @langchain/groq
yarn add @langchain/groq
pnpm add @langchain/groq
Add environment variables
GROQ_API_KEY=your-api-key
Instantiate the model
import { ChatGroq } from "@langchain/groq";
const model = new ChatGroq({
model: "mixtral-8x7b-32768",
temperature: 0
});
Install dependencies
Tip
- npm
- yarn
- pnpm
npm i @langchain/google-vertexai
yarn add @langchain/google-vertexai
pnpm add @langchain/google-vertexai
Add environment variables
GOOGLE_APPLICATION_CREDENTIALS=credentials.json
Instantiate the model
import { ChatVertexAI } from "@langchain/google-vertexai";
const model = new ChatVertexAI({
model: "gemini-1.5-flash",
temperature: 0
});
import { TavilySearchAPIRetriever } from "@langchain/community/retrievers/tavily_search_api";
import type { Document } from "@langchain/core/documents";
import { StringOutputParser } from "@langchain/core/output_parsers";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import {
RunnablePassthrough,
RunnableSequence,
} from "@langchain/core/runnables";
const formatDocsAsString = (docs: Document[]) => {
return docs.map((doc) => doc.pageContent).join("\n\n");
};
const retriever = new TavilySearchAPIRetriever({
k: 3,
});
const prompt = ChatPromptTemplate.fromTemplate(`
Use the following context to answer questions to the best of your ability:
<context>
{context}
</context>
Question: {question}`);
const chain = RunnableSequence.from([
{
context: retriever.pipe(formatDocsAsString),
question: new RunnablePassthrough(),
},
prompt,
model,
new StringOutputParser(),
]);
If you invoke it as usual, you can see that it returns up-to-date information:
await chain.invoke("what is the current weather in SF?");
Based on the provided context, the current weather in San Francisco is:
Temperature: 17.6°C (63.7°F)
Condition: Sunny
Wind: 14.4 km/h (8.9 mph) from WSW direction
Humidity: 74%
Cloud cover: 15%
The information indicates it's a sunny day with mild temperatures and light winds. The data appears to be from August 2, 2024, at 17:00 local time.
Now, let's interrupt it early: initialize an AbortController
and pass its signal
property into the chain execution. To illustrate the fact that cancellation occurs as soon as possible, set a timeout of 100ms:
const controller = new AbortController();
console.time("timer1");
setTimeout(() => controller.abort(), 100);
try {
await chain.invoke("what is the current weather in SF?", {
signal: controller.signal,
});
} catch (e) {
console.log(e);
}
console.timeEnd("timer1");
Error: Aborted
at EventTarget.<anonymous> (/Users/jacoblee/langchain/langchainjs/langchain-core/dist/utils/signal.cjs:19:24)
at [nodejs.internal.kHybridDispatch] (node:internal/event_target:825:20)
at EventTarget.dispatchEvent (node:internal/event_target:760:26)
at abortSignal (node:internal/abort_controller:370:10)
at AbortController.abort (node:internal/abort_controller:392:5)
at Timeout._onTimeout (evalmachine.<anonymous>:7:29)
at listOnTimeout (node:internal/timers:573:17)
at process.processTimers (node:internal/timers:514:7)
timer1: 103.204ms
You can see that execution ends after 100ms. Looking at this LangSmith trace, you can see that the model was never called.
Streaming
You can also pass a signal
when streaming. This gives you more control than just using a break
statement within a for await... of
loop to cancel the current run, since a break
will only trigger after final output has already started streaming. The following example uses a break
statement instead; note the time elapsed before cancellation occurs:
console.time("timer2");
const stream = await chain.stream("what is the current weather in SF?");
for await (const chunk of stream) {
console.log("chunk", chunk);
break;
}
console.timeEnd("timer2");
chunk
timer2: 3.990s
Now compare this to using a signal. Note that you will need to wrap the stream in a try/catch
block:
const controllerForStream = new AbortController();
console.time("timer3");
setTimeout(() => controllerForStream.abort(), 100);
try {
const streamWithSignal = await chain.stream(
"what is the current weather in SF?",
{
signal: controllerForStream.signal,
}
);
for await (const chunk of streamWithSignal) {
console.log(chunk);
break;
}
} catch (e) {
console.log(e);
}
console.timeEnd("timer3");
Error: Aborted
at EventTarget.<anonymous> (/Users/jacoblee/langchain/langchainjs/langchain-core/dist/utils/signal.cjs:19:24)
at [nodejs.internal.kHybridDispatch] (node:internal/event_target:825:20)
at EventTarget.dispatchEvent (node:internal/event_target:760:26)
at abortSignal (node:internal/abort_controller:370:10)
at AbortController.abort (node:internal/abort_controller:392:5)
at Timeout._onTimeout (evalmachine.<anonymous>:7:38)
at listOnTimeout (node:internal/timers:573:17)
at process.processTimers (node:internal/timers:514:7)
timer3: 100.684ms
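A common use of this pattern, mentioned at the top of this guide, is cancelling an in-flight run when the user submits a new query. A minimal sketch, assuming you track one active run at a time (the runQuery wrapper and doWork parameter are hypothetical names, not LangChain APIs):

```typescript
// Hypothetical wrapper: each new query aborts whatever run is still
// in flight before starting its own. `doWork` stands in for a real
// chain.invoke() or chain.stream() call that accepts a signal.
let currentController: AbortController | null = null;

async function runQuery(
  doWork: (signal: AbortSignal) => Promise<string>
): Promise<string | undefined> {
  // Cancel the previous run, if one is still in progress.
  currentController?.abort();
  const controller = new AbortController();
  currentController = controller;
  try {
    return await doWork(controller.signal);
  } catch {
    // Aborted (or otherwise failed) runs produce no result.
    return undefined;
  }
}
```

With the chain built earlier in this guide, doWork would be something like (signal) => chain.invoke(question, { signal }).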