
Minimax

Minimax is a Chinese startup that provides natural language processing models for companies and individuals.

This example goes over how to use LangChain.js to interact with Minimax.

Setup

To use Minimax models, you will need a Minimax account, an API key, and a Group ID.

npm install @langchain/community @langchain/core
Tip

We are unifying model parameters across all packages. We now recommend using model instead of modelName, and apiKey for API keys.
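
For instance, a client could be constructed as below. This is a minimal sketch: it uses the provider-specific credential fields shown later on this page, since whether ChatMinimax also accepts the unified apiKey name as an alias is not demonstrated in this guide.

import { ChatMinimax } from "@langchain/community/chat_models/minimax";

// Minimal sketch of the recommended parameter names.
const chat = new ChatMinimax({
  model: "abab5.5-chat", // preferred over the older modelName
  // Provider-specific credential fields, as used later on this page;
  // in Node.js both default to these environment variables.
  minimaxApiKey: process.env.MINIMAX_API_KEY,
  minimaxGroupId: process.env.MINIMAX_GROUP_ID,
});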

Basic usage

import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import { HumanMessage } from "@langchain/core/messages";

// Use abab5.5
const abab5_5 = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
});
const messages = [
  new HumanMessage({
    content: "Hello",
  }),
];

const res = await abab5_5.invoke(messages);
console.log(res);

/*
AIChatMessage {
  text: 'Hello! How may I assist you today?',
  name: undefined,
  additional_kwargs: {}
}
*/

// use abab5
const abab5 = new ChatMinimax({
  proVersion: false,
  model: "abab5-chat",
  minimaxGroupId: process.env.MINIMAX_GROUP_ID, // In Node.js defaults to process.env.MINIMAX_GROUP_ID
  minimaxApiKey: process.env.MINIMAX_API_KEY, // In Node.js defaults to process.env.MINIMAX_API_KEY
});

const result = await abab5.invoke([
  new HumanMessage({
    content: "Hello",
    name: "XiaoMing",
  }),
]);
console.log(result);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: {
    content: 'Hello! Can I help you with anything?',
    additional_kwargs: { function_call: undefined }
  },
  lc_namespace: [ 'langchain', 'schema' ],
  content: 'Hello! Can I help you with anything?',
  name: undefined,
  additional_kwargs: { function_call: undefined }
}
*/

API Reference

Chain model calls

import { LLMChain } from "langchain/chains";
import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
  SystemMessagePromptTemplate,
} from "@langchain/core/prompts";

// We can also construct an LLMChain from a ChatPromptTemplate and a chat model.
const chat = new ChatMinimax({ temperature: 0.01 });

const chatPrompt = ChatPromptTemplate.fromMessages([
  SystemMessagePromptTemplate.fromTemplate(
    "You are a helpful assistant that translates {input_language} to {output_language}."
  ),
  HumanMessagePromptTemplate.fromTemplate("{text}"),
]);
const chainB = new LLMChain({
  prompt: chatPrompt,
  llm: chat,
});

const resB = await chainB.invoke({
  input_language: "English",
  output_language: "Chinese",
  text: "I love programming.",
});
console.log({ resB });
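
The same prompt-plus-model composition can also be written with the runnable .pipe() style from @langchain/core, which returns the chat model's AIMessage directly instead of an LLMChain result object. A minimal sketch, reusing chatPrompt and chat from above:

const translationChain = chatPrompt.pipe(chat);
const translated = await translationChain.invoke({
  input_language: "English",
  output_language: "Chinese",
  text: "I love programming.",
});
console.log(translated.content);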

API Reference

With function calls

import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import { HumanMessage } from "@langchain/core/messages";

const functionSchema = {
  name: "get_weather",
  description: " Get weather information.",
  parameters: {
    type: "object",
    properties: {
      location: {
        type: "string",
        description: " The location to get the weather",
      },
    },
    required: ["location"],
  },
};

// Bind function arguments to the model.
// All subsequent invoke calls will use the bound parameters.
// "functions.parameters" must be formatted as JSON Schema
const model = new ChatMinimax({
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
}).bind({
  functions: [functionSchema],
});

const result = await model.invoke([
  new HumanMessage({
    content: " What is the weather like in NewYork tomorrow?",
    name: "I",
  }),
]);

console.log(result);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: { content: '', additional_kwargs: { function_call: [Object] } },
  lc_namespace: [ 'langchain', 'schema' ],
  content: '',
  name: undefined,
  additional_kwargs: {
    function_call: { name: 'get_weather', arguments: '{"location": "NewYork"}' }
  }
}
*/

// Alternatively, you can pass function call arguments as an additional argument as a one-off:

const minimax = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
});

const result2 = await minimax.invoke(
  [new HumanMessage("What is the weather like in NewYork tomorrow?")],
  {
    functions: [functionSchema],
  }
);
console.log(result2);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: { content: '', additional_kwargs: { function_call: [Object] } },
  lc_namespace: [ 'langchain', 'schema' ],
  content: '',
  name: undefined,
  additional_kwargs: {
    function_call: { name: 'get_weather', arguments: '{"location": "NewYork"}' }
  }
}
*/
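
When a function call is returned, the arguments arrive as a JSON-encoded string under additional_kwargs, so they are typically parsed before dispatching to the real function. A minimal sketch based on the output shape shown above:

const functionCall = result2.additional_kwargs.function_call;
if (functionCall) {
  // e.g. { location: "NewYork" }
  const args = JSON.parse(functionCall.arguments);
  console.log(functionCall.name, args.location);
}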

API Reference

Functions with Zod

import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";
import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import { HumanMessage } from "@langchain/core/messages";

const extractionFunctionZodSchema = z.object({
  location: z.string().describe(" The location to get the weather"),
});

// Bind function arguments to the model.
// "functions.parameters" must be formatted as JSON Schema.
// We translate the above Zod schema into JSON schema using the "zodToJsonSchema" package.

const model = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
}).bind({
  functions: [
    {
      name: "get_weather",
      description: " Get weather information.",
      parameters: zodToJsonSchema(extractionFunctionZodSchema),
    },
  ],
});

const result = await model.invoke([
  new HumanMessage({
    content: " What is the weather like in Shanghai tomorrow?",
    name: "XiaoMing",
  }),
]);

console.log(result);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: { content: '', additional_kwargs: { function_call: [Object] } },
  lc_namespace: [ 'langchain', 'schema' ],
  content: '',
  name: undefined,
  additional_kwargs: {
    function_call: { name: 'get_weather', arguments: '{"location": "Shanghai"}' }
  }
}
*/

API Reference

With glyph

This feature lets you force the model to return content in a requested format.

import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
} from "@langchain/core/prompts";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
}).bind({
  replyConstraints: {
    sender_type: "BOT",
    sender_name: "MM Assistant",
    glyph: {
      type: "raw",
      raw_glyph: "The translated text:{{gen 'content'}}",
    },
  },
});

const messagesTemplate = ChatPromptTemplate.fromMessages([
  HumanMessagePromptTemplate.fromTemplate(
    " Please help me translate the following sentence in English: {text}"
  ),
]);

const messages = await messagesTemplate.formatMessages({ text: "我是谁" });
const result = await model.invoke(messages);

console.log(result);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: {
    content: 'The translated text: Who am I\x02',
    additional_kwargs: { function_call: undefined }
  },
  lc_namespace: [ 'langchain', 'schema' ],
  content: 'The translated text: Who am I\x02',
  name: undefined,
  additional_kwargs: { function_call: undefined }
}
*/

// use json_value

const modelMinimax = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
}).bind({
  replyConstraints: {
    sender_type: "BOT",
    sender_name: "MM Assistant",
    glyph: {
      type: "json_value",
      json_properties: {
        name: {
          type: "string",
        },
        age: {
          type: "number",
        },
        is_student: {
          type: "boolean",
        },
        is_boy: {
          type: "boolean",
        },
        courses: {
          type: "object",
          properties: {
            name: {
              type: "string",
            },
            score: {
              type: "number",
            },
          },
        },
      },
    },
  },
});

const result2 = await modelMinimax.invoke([
  new HumanMessage({
    content:
      " My name is Yue Wushuang, 18 years old this year, just finished the test with 99.99 points.",
    name: "XiaoMing",
  }),
]);

console.log(result2);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: {
    content: '{\n' +
      ' "name": "Yue Wushuang",\n' +
      ' "is_student": true,\n' +
      ' "is_boy": false,\n' +
      ' "courses": {\n' +
      ' "name": "Mathematics",\n' +
      ' "score": 99.99\n' +
      ' },\n' +
      ' "age": 18\n' +
      ' }',
    additional_kwargs: { function_call: undefined }
  }
}
*/
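
Because the json_value glyph constrains the reply to plain JSON text, the message content can be parsed directly. A minimal sketch, assuming the string content shown above:

const structured = JSON.parse(result2.content as string);
console.log(structured.name, structured.age, structured.courses.score);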

API Reference

With sample messages

This feature helps the model better understand the kind of response you want, including but not limited to its content, format, and response pattern.

import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import { AIMessage, HumanMessage } from "@langchain/core/messages";

const model = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
}).bind({
  sampleMessages: [
    new HumanMessage({
      content: " Turn A5 into red and modify the content to minimax.",
    }),
    new AIMessage({
      content: "select A5 color red change minimax",
    }),
  ],
});

const result = await model.invoke([
  new HumanMessage({
    content:
      ' Please reply to my content according to the following requirements: According to the following interface list, give the order and parameters of calling the interface for the content I gave. You just need to give the order and parameters of calling the interface, and do not give any other output. The following is the available interface list: select: select specific table position, input parameter use letters and numbers to determine, for example "B13"; color: dye the selected table position, input parameters use the English name of the color, for example "red"; change: modify the selected table position, input parameters use strings.',
  }),
  new HumanMessage({
    content: " Process B6 to gray and modify the content to question.",
  }),
]);

console.log(result);

API Reference

With plugins

This feature supports calling tools such as a search engine to retrieve additional data that can help the model.

import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
}).bind({
  plugins: ["plugin_web_search"],
});

const result = await model.invoke([
  new HumanMessage({
    content: " What is the weather like in NewYork tomorrow?",
  }),
]);

console.log(result);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: {
    content: 'The weather in Shanghai tomorrow is expected to be hot. Please note that this is just a forecast and the actual weather conditions may vary.',
    additional_kwargs: { function_call: undefined }
  },
  lc_namespace: [ 'langchain', 'schema' ],
  content: 'The weather in Shanghai tomorrow is expected to be hot. Please note that this is just a forecast and the actual weather conditions may vary.',
  name: undefined,
  additional_kwargs: { function_call: undefined }
}
*/

API Reference

