
Zep Cloud Memory

Recall, understand, and extract data from chat histories. Power personalized AI experiences.

Zep is a long-term memory service for AI Assistant apps. With Zep, you can provide AI assistants with the ability to recall past conversations, no matter how distant, while also reducing hallucinations, latency, and cost.

How Zep Cloud Works

Zep persists and recalls chat histories, and automatically generates summaries and other artifacts from these chat histories. It also embeds messages and summaries, enabling you to search Zep for relevant context from past conversations. Zep does all of this asynchronously, ensuring these operations don't impact your user's chat experience. Data is persisted to a database, allowing you to scale out when growth demands.
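As a minimal sketch of this flow (placeholder session ID and API key; the sample messages are purely illustrative), you can persist an exchange through ZepCloudChatMessageHistory and later read the stored context back:

import { ZepClient } from "@getzep/zep-cloud";
import { AIMessage, HumanMessage } from "@langchain/core/messages";
import { ZepCloudChatMessageHistory } from "@langchain/community/stores/message/zep_cloud";

const zepClient = new ZepClient({
  // Your Zep Cloud Project API key https://help.getzep.com/projects
  apiKey: "<Zep Api Key>",
});

const history = new ZepCloudChatMessageHistory({
  client: zepClient,
  sessionId: "<Zep Session ID>",
  memoryType: "perpetual",
});

// Persist a user/assistant exchange; Zep summarizes and embeds it asynchronously.
await history.addMessage(new HumanMessage("I'm planning a trip to Kyoto this fall."));
await history.addMessage(new AIMessage("Great choice! Autumn foliage in Kyoto peaks in late November."));

// Later, even from another process, recall the stored context for this session.
const messages = await history.getMessages();
console.log(messages.map((m) => m.content));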

Zep also provides a simple, easy-to-use abstraction for document vector search called Document Collections. This is designed to complement Zep's core memory features, but is not designed to be a general-purpose vector database.
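As a rough illustration (the collection name and document text are placeholders; Zep Cloud embeds documents server-side, so no embeddings model is configured here), a Document Collection can be populated and searched like any other LangChain vector store:

import { ZepClient } from "@getzep/zep-cloud";
import { Document } from "@langchain/core/documents";
import { ZepCloudVectorStore } from "@langchain/community/vectorstores/zep_cloud";

const zepClient = new ZepClient({ apiKey: "<Zep Api Key>" });

// Connect to an existing collection in your Zep project.
const vectorStore = await ZepCloudVectorStore.init({
  client: zepClient,
  collectionName: "<Zep Collection Name>",
});

// Add business documents; Zep Cloud handles chunking and embedding on its side.
await vectorStore.addDocuments([
  new Document({
    pageContent: "Our refund policy allows returns within 30 days of purchase.",
  }),
]);

// Retrieve the documents most relevant to a query.
const results = await vectorStore.similaritySearch(
  "How long do customers have to return an item?",
  3
);
console.log(results.map((doc) => doc.pageContent));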

Zep allows you to be more intentional about constructing your prompt:

  • automatically adding a few recent messages, with the number customized for your app;
  • a summary of recent conversations prior to the messages above;
  • and/or contextually relevant summaries or messages from across the entire chat session;
  • and/or relevant business data from Zep Document Collections.

What Zep Cloud offers:

  • Fact Extraction: Automatically build fact tables from conversations, without having to define a data schema upfront.
  • Dialog Classification: Instantly and accurately classify chat dialog. Understand user intent and emotion, segment users, and more. Route chains based on semantic context, and trigger events.
  • Structured Data Extraction: Quickly extract business data from chat conversations using a schema you define. Understand what your Assistant should ask for next in order to complete its task.

Setup

Sign up for Zep Cloud and create a project.

Follow the Zep Cloud Typescript SDK Installation Guide to install and get started with Zep.

You'll need your Zep Cloud Project API key to use the Zep Cloud Memory. See the Zep Cloud docs for more information.

npm install @getzep/zep-cloud @langchain/openai @langchain/community @langchain/core
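A minimal sketch of wiring up the client once the package is installed, assuming you export your project key as a ZEP_API_KEY environment variable (the variable name is your choice, not something the SDK requires):

import { ZepClient } from "@getzep/zep-cloud";

// Read the Zep Cloud Project API key from the environment rather than hard-coding it.
const zepClient = new ZepClient({
  apiKey: process.env.ZEP_API_KEY ?? "<Zep Api Key>",
});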

ZepCloudChatMessageHistory + RunnableWithMessageHistory usage

import { ZepClient } from "@getzep/zep-cloud";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { ConsoleCallbackHandler } from "@langchain/core/tracers/console";
import { ChatOpenAI } from "@langchain/openai";
import { RunnableWithMessageHistory } from "@langchain/core/runnables";
import { ZepCloudChatMessageHistory } from "@langchain/community/stores/message/zep_cloud";

// Your Zep Session ID.
const sessionId = "<Zep Session ID>";
const zepClient = new ZepClient({
  // Your Zep Cloud Project API key https://help.getzep.com/projects
  apiKey: "<Zep Api Key>",
});

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "Answer the user's question below. Be polite and helpful:"],
  new MessagesPlaceholder("history"),
  ["human", "{question}"],
]);

const chain = prompt
  .pipe(
    new ChatOpenAI({
      temperature: 0.8,
      modelName: "gpt-3.5-turbo-1106",
    })
  )
  .withConfig({
    callbacks: [new ConsoleCallbackHandler()],
  });

const chainWithHistory = new RunnableWithMessageHistory({
  runnable: chain,
  getMessageHistory: (sessionId) =>
    new ZepCloudChatMessageHistory({
      client: zepClient,
      sessionId,
      memoryType: "perpetual",
    }),
  inputMessagesKey: "question",
  historyMessagesKey: "history",
});

const result = await chainWithHistory.invoke(
  {
    question: "What did we talk about earlier?",
  },
  {
    configurable: {
      sessionId,
    },
  }
);

console.log("result", result);

API Reference

ZepCloudChatMessageHistory + RunnableWithMessageHistory + ZepVectorStore (as a retriever) usage

import { ZepClient } from "@getzep/zep-cloud";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { ConsoleCallbackHandler } from "@langchain/core/tracers/console";
import { ChatOpenAI } from "@langchain/openai";
import { Document } from "@langchain/core/documents";
import {
  RunnableLambda,
  RunnableMap,
  RunnablePassthrough,
  RunnableWithMessageHistory,
} from "@langchain/core/runnables";
import { ZepCloudVectorStore } from "@langchain/community/vectorstores/zep_cloud";
import { StringOutputParser } from "@langchain/core/output_parsers";
import { ZepCloudChatMessageHistory } from "@langchain/community/stores/message/zep_cloud";

interface ChainInput {
  question: string;
  sessionId: string;
}

async function combineDocuments(docs: Document[], documentSeparator = "\n\n") {
  const docStrings: string[] = await Promise.all(
    docs.map((doc) => doc.pageContent)
  );
  return docStrings.join(documentSeparator);
}

// Your Zep Session ID.
const sessionId = "<Zep Session ID>";

const collectionName = "<Zep Collection Name>";

const zepClient = new ZepClient({
  // Your Zep Cloud Project API key https://help.getzep.com/projects
  apiKey: "<Zep Api Key>",
});

const vectorStore = await ZepCloudVectorStore.init({
  client: zepClient,
  collectionName,
});

const prompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    `Answer the question based only on the following context and conversation history: {context}`,
  ],
  new MessagesPlaceholder("history"),
  ["human", "{question}"],
]);

const model = new ChatOpenAI({
  temperature: 0.8,
  modelName: "gpt-3.5-turbo-1106",
});
const retriever = vectorStore.asRetriever();
const searchQuery = new RunnableLambda({
  func: async (input: any) => {
    // You can use Zep to synthesize a question based on the user input and session context.
    // This is useful because sometimes the user will type something like "yes" or "ok",
    // which is not very useful for vector store retrieval.
    const { question } = await zepClient.memory.synthesizeQuestion(
      input.session_id
    );
    console.log("Synthesized question: ", question);
    return question;
  },
});
const retrieverLambda = new RunnableLambda({
  func: async (question: string) => {
    const response = await retriever.invoke(question);
    return combineDocuments(response);
  },
});
const setupAndRetrieval = RunnableMap.from({
  context: searchQuery.pipe(retrieverLambda),
  question: (x: any) => x.question,
  history: (x: any) => x.history,
});
const outputParser = new StringOutputParser();

const ragChain = setupAndRetrieval.pipe(prompt).pipe(model).pipe(outputParser);

const invokeChain = (chainInput: ChainInput) => {
  const chainWithHistory = new RunnableWithMessageHistory({
    runnable: RunnablePassthrough.assign({
      session_id: () => chainInput.sessionId,
    }).pipe(ragChain),
    getMessageHistory: (sessionId) =>
      new ZepCloudChatMessageHistory({
        client: zepClient,
        sessionId,
        memoryType: "perpetual",
      }),
    inputMessagesKey: "question",
    historyMessagesKey: "history",
  });

  return chainWithHistory.invoke(
    { question: chainInput.question },
    {
      configurable: {
        sessionId: chainInput.sessionId,
      },
    }
  );
};

const chain = new RunnableLambda({
  func: invokeChain,
}).withConfig({
  callbacks: [new ConsoleCallbackHandler()],
});

const result = await chain.invoke({
  question: "Project Gutenberg",
  sessionId,
});

console.log("result", result);

API Reference

Memory usage

import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";
import { ZepCloudMemory } from "@langchain/community/memory/zep_cloud";
import { randomUUID } from "crypto";

const sessionId = randomUUID(); // This should be unique for each user or each user's session.

const memory = new ZepCloudMemory({
  sessionId,
  // Your Zep Cloud Project API key https://help.getzep.com/projects
  apiKey: "<Zep Api Key>",
});

const model = new ChatOpenAI({
  modelName: "gpt-3.5-turbo",
  temperature: 0,
});

const chain = new ConversationChain({ llm: model, memory });
console.log("Memory Keys:", memory.memoryKeys);

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
{
  res1: {
    text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
  }
}
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });
/*
{
  res2: {
    text: "You said your name was Jim."
  }
}
*/

console.log("Session ID: ", sessionId);
console.log("Memory: ", await memory.loadMemoryVariables({}));

API Reference

