Developers can now integrate ChatGPT and Whisper models into their applications and products through our API.
Previously, the API only offered the text-davinci-003 model. That model had no built-in contextual dialogue, and its output was noticeably worse than ChatGPT's, so the community built many projects that wrapped the web version of ChatGPT to provide a service; because they depended on the web page, their stability was poor. Now that the official ChatGPT API has been released, this is great news for developers, and it is significant for OpenAI and the whole industry: a large number of excellent AI applications built on the new API are bound to appear in the near future.
The newly released API is powered by gpt-3.5-turbo, OpenAI's most advanced language model, and a great deal can be built on top of it.
The new chat model takes a series of messages as input, which gives it contextual dialogue out of the box; of course you can still perform single-turn tasks, just as before.
To use the new API, you need version 0.27.0 of the openai Python package:
pip3 install openai==0.27.0
Then you can use the openai package to interact with OpenAI directly:
import openai

openai.api_key = "sk-xxxx"

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are an AI assistant."},
        {"role": "user", "content": "Which team will win the 2023 NBA championship?"},
    ]
)
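The reply comes back in response.choices; a minimal way to print the assistant's answer from the response object above, assuming the field layout of the openai==0.27.0 package used here, is:

# Print the assistant's reply; with openai==0.27.0 the answer is in
# response.choices[0].message.content.
print(response.choices[0].message.content)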
The most important input parameter is messages, an array of message objects. Each object contains a role (system, user, or assistant) and the message content. The whole conversation can be a single message or several messages.
Normally a conversation starts with a system message, which helps set the behavior of the assistant. User messages are produced by the end users of our application; these are the questions we want to ask. Assistant messages are the replies returned by OpenAI, although they can also be written by the developer.
If we send the previous assistant reply back along with the new question, the model gains contextual awareness.
import openai

openai.api_key = "sk-xxxx"

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are an AI assistant."},
        {"role": "user", "content": "Which team will win the 2023 NBA championship?"},
        {"role": "assistant", "content": "The Lakers will win the championship!"},
        {"role": "user", "content": "Who will be named FMVP?"},
    ]
)

result = ''
for choice in response.choices:
    result += choice.message.content
print(result)
For example, by adding the previous messages here, we finally get a context-aware reply:
It is actually difficult to make an accurate prediction, because many factors could influence the outcome. However, the Lakers have several players with a chance of winning the FMVP award, such as LeBron James, Anthony Davis, and Kyle Kuzma, any of whom might become FMVP.
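To keep a longer conversation going, one common pattern is to append every reply to the message list before asking the next question. Below is a minimal sketch of that idea, assuming the same openai==0.27.0 API used above; the chat() helper and the placeholder API key are illustrative and not part of the original article.

import openai

openai.api_key = "sk-xxxx"

# Conversation history; the system message sets the assistant's behavior.
messages = [{"role": "system", "content": "You are an AI assistant."}]

def chat(user_input):
    # Add the user's question, call the model, then store the answer
    # so the next question still has the full context.
    messages.append({"role": "user", "content": user_input})
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=messages,
    )
    answer = response.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    return answer

print(chat("Which team will win the 2023 NBA championship?"))
print(chat("Who will be named FMVP?"))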