Motörhead Memory
Motörhead is a memory server implemented in Rust. It automatically handles incremental summarization in the background and allows for stateless applications.
Setup
See the instructions in Motörhead for how to run the server locally, or visit https://getmetal.io to get an API key for the managed version.
Usage
Tip
See this section for general instructions on installing integration packages.
- npm: npm install @langchain/openai @langchain/community
- Yarn: yarn add @langchain/openai @langchain/community
- pnpm: pnpm add @langchain/openai @langchain/community
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";
import { MotorheadMemory } from "@langchain/community/memory/motorhead_memory";
// Managed Example (visit https://getmetal.io to get your keys)
// const managedMemory = new MotorheadMemory({
// memoryKey: "chat_history",
// sessionId: "test",
// apiKey: "MY_API_KEY",
// clientId: "MY_CLIENT_ID",
// });
// Self Hosted Example
const memory = new MotorheadMemory({
memoryKey: "chat_history",
sessionId: "test",
url: "localhost:8080", // Required for self hosted
});
const model = new ChatOpenAI({
model: "gpt-3.5-turbo",
temperature: 0,
});
const chain = new ConversationChain({ llm: model, memory });
const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
{
res1: {
text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
}
}
*/
const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });
/*
{
res2: {
text: "You said your name was Jim."
}
}
*/
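Because MotorheadMemory implements LangChain's standard memory interface, you can also read what is stored for a session directly, without going through a chain. A minimal sketch (the chat_history key matches the memoryKey configured above):
// Inspect what the memory currently holds for this session.
const memoryVariables = await memory.loadMemoryVariables({});
console.log(memoryVariables.chat_history);
/*
  A string (or an array of messages, if returnMessages is enabled) containing
  the conversation so far, e.g. "Human: Hi! I'm Jim. AI: Hello Jim! ..."
*/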
API Reference
- ChatOpenAI from @langchain/openai
- ConversationChain from langchain/chains
- MotorheadMemory from @langchain/community/memory/motorhead_memory