3 Patterns for Building Smart Assistants with the OpenAI Assistants API
Let me tell you, the OpenAI Assistants API is genuinely good stuff. But getting the most out of it takes a bit of technique. After tinkering with it for quite a while, I've boiled it down to three patterns, and today I'll walk you through them.
Basic Pattern: Single-Turn Conversation
This pattern is the simplest: you ask a question, it gives an answer. Let's see what the code looks like:
import time

from openai import OpenAI

client = OpenAI()

# Create an assistant with a role description and a model
assistant = client.beta.assistants.create(
    name="Python Assistant",
    instructions="You are a Python expert. Help answer Python-related questions.",
    model="gpt-3.5-turbo"
)

# Create a thread and drop the user's question into it
thread = client.beta.threads.create()
message = client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="How do I define a function in Python?"
)

# Start a run and poll until it reaches a terminal state
run = client.beta.threads.runs.create(
    thread_id=thread.id,
    assistant_id=assistant.id
)
while run.status not in ("completed", "failed", "cancelled", "expired"):
    time.sleep(1)  # avoid hammering the API while polling
    run = client.beta.threads.runs.retrieve(
        thread_id=thread.id,
        run_id=run.id
    )

# Messages come back newest-first, so data[0] is the assistant's reply
messages = client.beta.threads.messages.list(thread_id=thread.id)
print(messages.data[0].content[0].text.value)
This code first creates an Assistant, then creates a Thread and drops a question into it. It kicks off a run, waits for the run to finish, and finally prints out the answer.
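One thing worth adding: a run can end as failed rather than completed (rate limits, for instance), so before trusting the reply it doesn't hurt to check how the run actually ended. A rough sketch, assuming the run object exposes last_error the way the beta API describes it:

# Check how the run actually ended before reading the reply
if run.status != "completed":
    # last_error should carry an error code and message when the run fails
    print(f"Run ended with status {run.status}: {run.last_error}")
else:
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    print(messages.data[0].content[0].text.value)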
A friendly tip: this example uses the gpt-3.5-turbo model. If you want something smarter, you can switch to gpt-4, but that one is a lot pricier, so mind your wallet.
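Switching is just a matter of passing a different model name. If the assistant already exists, something like this should do it (a minimal sketch; I'm assuming the beta assistants.update endpoint takes the assistant id plus the fields to change):

# Upgrade an existing assistant to gpt-4 instead of recreating it
assistant = client.beta.assistants.update(
    assistant.id,
    model="gpt-4"
)
print(assistant.model)  # should now report "gpt-4"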
Intermediate Pattern: Multi-Turn Conversation
A single-turn conversation is a bit too basic, so let's level up to multi-turn. Here's the code:
import time

from openai import OpenAI

client = OpenAI()

assistant = client.beta.assistants.create(
    name="Python Assistant",
    instructions="You are a Python expert. Help answer Python-related questions.",
    model="gpt-3.5-turbo"
)

# One thread holds the whole conversation, so context carries across questions
thread = client.beta.threads.create()

def ask_question(question):
    # Append the new question to the existing thread
    message = client.beta.threads.messages.create(
        thread_id=thread.id,
        role="user",
        content=question
    )
    # Run the assistant against the thread and poll until it finishes
    run = client.beta.threads.runs.create(
        thread_id=thread.id,
        assistant_id=assistant.id
    )
    while run.status not in ("completed", "failed", "cancelled", "expired"):
        time.sleep(1)
        run = client.beta.threads.runs.retrieve(
            thread_id=thread.id,
            run_id=run.id
        )
    # The newest message is the assistant's reply to this question
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    return messages.data[0].content[0].text.value

print(ask_question("How do I define a function in Python?"))
print(ask_question("Can you give me a concrete example?"))
print(ask_question("How do I add parameters to the function?"))
This wraps the question-and-answer flow into a function, so you can ask several questions in a row. Since every message lands on the same thread, the assistant remembers the earlier exchanges and answers with that context.
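If you want to confirm that the context really does live in the thread, you can dump the whole conversation afterwards. A small sketch (I'm assuming messages.list accepts an order parameter; if yours doesn't, just iterate the default newest-first list in reverse):

# Print the full conversation in chronological order
history = client.beta.threads.messages.list(thread_id=thread.id, order="asc")
for msg in history.data:
    print(f"{msg.role}: {msg.content[0].text.value}")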
Advanced Pattern: Conversation with Tools
This pattern is the most powerful: it lets the AI call tools. For example, you can have it execute Python code for you. Take a look:
import json
import time

from openai import OpenAI

client = OpenAI()

def run_python_code(code):
    # NOTE: exec() runs arbitrary model-generated code in your own process;
    # only do this in a sandboxed environment
    try:
        exec(code)
    except Exception as e:
        return str(e)
    return "Code executed successfully"

# Register the local function as a tool the assistant is allowed to call
assistant = client.beta.assistants.create(
    name="Python Assistant",
    instructions="You are a Python expert. Help answer Python-related questions and execute code.",
    model="gpt-3.5-turbo",
    tools=[{
        "type": "function",
        "function": {
            "name": "run_python_code",
            "description": "Execute Python code",
            "parameters": {
                "type": "object",
                "properties": {
                    "code": {
                        "type": "string",
                        "description": "The Python code to execute"
                    }
                },
                "required": ["code"]
            }
        }
    }]
)

thread = client.beta.threads.create()
message = client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Write a Python function that computes the nth Fibonacci number, then test it with n=10."
)

run = client.beta.threads.runs.create(
    thread_id=thread.id,
    assistant_id=assistant.id
)

# Poll the run; when it pauses with requires_action, execute the requested
# tool calls and submit all their outputs in one batch so the run can continue
while run.status not in ("completed", "failed", "cancelled", "expired"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(
        thread_id=thread.id,
        run_id=run.id
    )
    if run.status == "requires_action":
        tool_outputs = []
        for tool_call in run.required_action.submit_tool_outputs.tool_calls:
            if tool_call.function.name == "run_python_code":
                code = json.loads(tool_call.function.arguments)["code"]
                tool_outputs.append({
                    "tool_call_id": tool_call.id,
                    "output": run_python_code(code)
                })
        client.beta.threads.runs.submit_tool_outputs(
            thread_id=thread.id,
            run_id=run.id,
            tool_outputs=tool_outputs
        )

# Print the whole thread (newest message first by default)
messages = client.beta.threads.messages.list(thread_id=thread.id)
for message in messages.data:
    print(f"{message.role}: {message.content[0].text.value}")
This code gives the Assistant a tool that can execute Python code. Ask it to write a function, and it won't just hand you the code; it can also run it and see whether the result checks out.
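One caveat: run_python_code as written only reports success or the exception text, so the assistant never actually sees what the code printed. If you want it to verify results, you can capture stdout and return it as the tool output. Here's a minimal sketch of that idea (my own variant, not something the API provides):

import contextlib
import io

def run_python_code(code):
    # Capture whatever the code prints so the assistant can read the result;
    # this still runs untrusted, model-generated code via exec(), so sandbox it in real use
    buffer = io.StringIO()
    try:
        with contextlib.redirect_stdout(buffer):
            exec(code)
    except Exception as e:
        return f"Error: {e}"
    output = buffer.getvalue().strip()
    return output if output else "Code executed successfully (no output)"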
That's it for today. Take your time digesting these three patterns; used well, they're a genuine productivity booster. That said, this API does burn money, so pace yourselves and don't empty your wallet.