The official example:
# Note: you need to be using OpenAI Python v0.27.0 for the code below to work
import openai

openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
        {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
        {"role": "user", "content": "Where was it played?"}
    ]
)
Although the official example gives the format, it comes with no detailed explanation. Experienced developers may well understand it at a glance, but I still want to explain this context management in a more conversational way.
Let's take a look at a simple piece of my code first (context management is not enabled yet):
import openai

openai.api_key = "your sk-key"

ids = "user-001"  # example user ID (replace with your own identifier for each user)

msg = [{"role": "user", "content": "Hello chatGPT"}]

# Structure the data and submit it
completion = openai.ChatCompletion.create(
    # max_tokens = inf       # maximum number of tokens, defaults to inf
    presence_penalty = 1,    # penalty, between -2.0 and 2.0, default 0; the lower the value, the more repeated tokens are allowed, which makes the meaning of the text clearer
    frequency_penalty = 1,   # same meaning and range as above, default 0; applies mainly to frequency
    temperature = 1.0,       # temperature, between 0 and 2, default 1; adjusts how precise the reply is
    n = 1,                   # number of replies, default 1
    user = ids,              # user ID, lets the bot tell users apart so multiple users do not get mixed up
    model = "gpt-3.5-turbo", # note that OpenAI offers many different models
    messages = msg
)

value = completion.choices[0].message.content  # the data returned by chatGPT
This is the most basic structure; model and messages are the two required parameters.
Code for adding context management:
import openai

openai.api_key = "your sk-key"

ids = "user-001"  # example user ID (replace with your own identifier for each user)

msg = [
    {"role": "system", "content": "Your name is Jiuhe AI, you are a plugin, and your developer is Jiuhe."},
    {"role": "user", "content": "Hello chatGPT"},
    {"role": "assistant", "content": "Hello, is there anything I can help you with?"},
    {"role": "user", "content": "My name is Gao Qiqiang and my younger sister is Gao Qilan; we are brother and sister. Got it?"},
    {"role": "assistant", "content": "OK, your name is Gao Qiqiang and your younger sister is Gao Qilan; you are siblings. Thank you for the information, it helps me know you better~"},
    {"role": "user", "content": "Where are you right now?"},
    {"role": "assistant", "content": "As an intelligent AI assistant I have no physical location. I simply run in the cloud and stay asleep while waiting for user input."},
    {"role": "user", "content": "Who is my sister?"},
    {"role": "assistant", "content": "You told me earlier that your sister is Gao Qilan."},
    {"role": "user", "content": "What is your name?"},
    {"role": "assistant", "content": "My name is Jiuhe AI, a plugin developed by a developer called Jiuhe."}
]

# Structure the data and submit it
completion = openai.ChatCompletion.create(
    # max_tokens = inf       # maximum number of tokens, defaults to inf
    presence_penalty = 1,    # penalty, between -2.0 and 2.0, default 0; the lower the value, the more repeated tokens are allowed, which makes the meaning of the text clearer
    frequency_penalty = 1,   # same meaning and range as above, default 0; applies mainly to frequency
    temperature = 1.0,       # temperature, between 0 and 2, default 1; adjusts how precise the reply is
    n = 1,                   # number of replies, default 1
    user = ids,              # user ID, lets the bot tell users apart so multiple users do not get mixed up
    model = "gpt-3.5-turbo", # note that OpenAI offers many different models
    messages = msg
)

value = completion.choices[0].message.content  # the data returned by chatGPT
The data structure with context management enabled differs slightly from the one without it:
① system is the system setting (in other words, it tells chatGPT what role it should play)
② user is the user's message
③ assistant is chatGPT's reply
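To make the three roles concrete, here is a minimal sketch of how context is actually maintained: keep the history in msg, append every user question and every assistant reply, and resubmit the whole list on each call. This is my own illustration written against openai v0.27.0 (the helper name chat_with_context and the example questions are assumptions, not part of the official API):

import openai

openai.api_key = "your sk-key"

# The conversation history: the system setting stays at the front,
# and every user question / assistant reply is appended after it.
msg = [
    {"role": "system", "content": "Your name is Jiuhe AI, you are a plugin, and your developer is Jiuhe."}
]

def chat_with_context(question):
    # Append the user's question, submit the full history, then remember the reply.
    msg.append({"role": "user", "content": question})
    completion = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=msg
    )
    reply = completion.choices[0].message.content
    msg.append({"role": "assistant", "content": reply})  # keep the reply as context for later turns
    return reply

print(chat_with_context("My name is Gao Qiqiang. Please remember that."))
print(chat_with_context("What is my name?"))  # answered correctly because the earlier turn is resubmitted

Because the entire msg list is sent every time, chatGPT can "remember" the earlier turns; that is all context management really is.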
There are a few points I need to mention so that you can avoid some pitfalls!
1. It is recommended to store the msg data in a database (see the SQLite sketch after this list). The advantage is that the data is persisted and very easy to retrieve. I originally wanted to store it as JSON, but after struggling with it for a long time I gave up; its drawback is that storing and retrieving the data is inconvenient, because you have to account for different users having different sessions.
2. Note that the submitted messages must be in order from top to bottom (oldest first), otherwise chatGPT will get confused. The system message is not mandatory; if you want chatGPT to keep that setting, simply put the system message as the first element of the list every time you submit.
3. Another important point: the data you submit counts toward the token total, and so does chatGPT's reply (up to 4096 tokens in total). If you want context management to remember more of the conversation, include as much of your previous dialogue as possible when submitting (this will also consume your tokens faster); see the token-trimming sketch after this list.
4. As of March 14, 2023: a chatGPT Plus membership costs US$20/month, and tokens are billed by usage. In layman's terms, it is like a mobile phone plan: there is a monthly fee, and calls are billed separately. The advantage of chatGPT Plus is that it is faster and more stable. The free version can also be used, but it is slower, less stable, and prone to failures.
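Regarding point 1, here is a minimal sketch of per-user persistence using Python's built-in sqlite3 module. The table name sessions, the user ID "user-001", and the helper names are my own illustrative assumptions; the idea is simply to serialize each user's msg list to JSON and store one row per user:

import json
import sqlite3

conn = sqlite3.connect("chat_context.db")
conn.execute("CREATE TABLE IF NOT EXISTS sessions (user_id TEXT PRIMARY KEY, msg TEXT)")

def load_msg(user_id):
    # Return the stored message list for this user, or an empty list if there is none.
    row = conn.execute("SELECT msg FROM sessions WHERE user_id = ?", (user_id,)).fetchone()
    return json.loads(row[0]) if row else []

def save_msg(user_id, msg):
    # Overwrite the stored message list for this user.
    conn.execute("INSERT OR REPLACE INTO sessions (user_id, msg) VALUES (?, ?)",
                 (user_id, json.dumps(msg, ensure_ascii=False)))
    conn.commit()

# Usage: load before the request, append the new turns, save afterwards.
msg = load_msg("user-001")
msg.append({"role": "user", "content": "Hello chatGPT"})
# ... call openai.ChatCompletion.create(...) here and append the assistant reply to msg ...
save_msg("user-001", msg)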
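Regarding point 3, a common way to stay under the 4096-token ceiling is to estimate the size of the history and drop the oldest turns before submitting. The sketch below is only an approximation under my own assumptions: it uses the tiktoken package, it ignores the few tokens of per-message overhead, and the 3000-token budget and helper names are illustrative:

import tiktoken

encoding = tiktoken.encoding_for_model("gpt-3.5-turbo")

def rough_token_count(msg):
    # Rough estimate: total encoded length of every message's content.
    return sum(len(encoding.encode(m["content"])) for m in msg)

def trim_history(msg, budget=3000):
    # Drop the oldest non-system messages until the history fits the budget,
    # leaving headroom for chatGPT's reply (which also counts toward the limit).
    while rough_token_count(msg) > budget and len(msg) > 1:
        del msg[1]  # index 0 is the system message, so remove the oldest entry after it
    return msg

# Usage: msg = trim_history(msg), called right before openai.ChatCompletion.create(...)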