
Quickly gaining 2,500 stars: Andrej Karpathy rewrites his minGPT library


As a representative of "brute-force aesthetics" in artificial intelligence, GPT has stolen the spotlight. From the 117 million parameters of the original GPT, the family has grown all the way to the 175 billion parameters of GPT-3. With the release of GPT-3, OpenAI opened a commercial API to the community and encouraged everyone to experiment with the model. However, using the API requires an application, and that application may well go nowhere.

To let researchers with limited resources experience the fun of playing with large models, former Tesla AI director Andrej Karpathy built a small GPT training library in PyTorch with only about 300 lines of code and named it minGPT. minGPT can perform addition and character-level language modeling, and its accuracy is respectable.

Two years later, minGPT has been updated: Karpathy has launched a new version called nanoGPT, a library for training and fine-tuning medium-sized GPTs. In just a few days since its launch, it has collected 2.5K stars.



## Project address: https://github.com/karpathy/nanoGPT

In the project introduction, Karpathy wrote: "nanoGPT is the simplest and fastest library for training and fine-tuning medium-sized GPTs. It is a rewrite of minGPT, which had become so complex that I no longer wanted to use it. nanoGPT is still under active development, and the current goal is to reproduce GPT-2 on the OpenWebText dataset.

The code is designed to be simple and readable: train.py is roughly 300 lines of training code, and model.py is a roughly 300-line GPT model definition that can optionally load the GPT-2 weights released by OpenAI."
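As a rough illustration of what such a compact model definition involves, here is a generic decoder-only transformer block in PyTorch. This is an illustrative sketch, not the repository's actual model.py; all names and hyperparameters are made up for the example.

```python
# A minimal, generic decoder-only transformer block in PyTorch (illustrative only).
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    def __init__(self, n_embd, n_head, block_size):
        super().__init__()
        assert n_embd % n_head == 0
        self.n_head = n_head
        self.qkv = nn.Linear(n_embd, 3 * n_embd)   # query, key, value projections
        self.proj = nn.Linear(n_embd, n_embd)      # output projection
        # causal mask so each position only attends to earlier positions
        mask = torch.tril(torch.ones(block_size, block_size))
        self.register_buffer("mask", mask.view(1, 1, block_size, block_size))

    def forward(self, x):
        B, T, C = x.shape
        q, k, v = self.qkv(x).split(C, dim=2)
        # reshape into (B, n_head, T, head_dim)
        q = q.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        k = k.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        v = v.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        att = (q @ k.transpose(-2, -1)) / math.sqrt(k.size(-1))
        att = att.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))
        y = F.softmax(att, dim=-1) @ v
        y = y.transpose(1, 2).contiguous().view(B, T, C)
        return self.proj(y)

class Block(nn.Module):
    def __init__(self, n_embd, n_head, block_size):
        super().__init__()
        self.ln1 = nn.LayerNorm(n_embd)
        self.attn = CausalSelfAttention(n_embd, n_head, block_size)
        self.ln2 = nn.LayerNorm(n_embd)
        self.mlp = nn.Sequential(
            nn.Linear(n_embd, 4 * n_embd), nn.GELU(), nn.Linear(4 * n_embd, n_embd)
        )

    def forward(self, x):
        x = x + self.attn(self.ln1(x))   # pre-norm residual attention
        x = x + self.mlp(self.ln2(x))    # pre-norm residual MLP
        return x
```

A full GPT definition, broadly speaking, stacks blocks like these behind token and position embeddings and a final language-modeling head, plus (in nanoGPT's case) the logic for loading OpenAI's GPT-2 weights mentioned above.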


To prepare the dataset, the user first needs to tokenize some documents into a simple 1D array of indices:

$ cd data/openwebtext
$ python prepare.py

This generates two files, train.bin and val.bin, each holding a single raw sequence of GPT-2 BPE token ids stored as uint16 values.
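Conceptually, producing files like these just means running text through the GPT-2 BPE tokenizer and writing the resulting ids to disk as uint16 values. A minimal sketch of that idea, using the tiktoken and numpy libraries; the file names and train/val split here are illustrative, not taken from the repository's prepare.py:

```python
# Illustrative sketch: encode text with the GPT-2 BPE and store token ids as uint16.
import numpy as np
import tiktoken

enc = tiktoken.get_encoding("gpt2")          # GPT-2 byte-pair encoding

with open("input.txt", "r", encoding="utf-8") as f:
    text = f.read()

ids = enc.encode_ordinary(text)              # list of token ids, all < 2**16
split = int(0.9 * len(ids))                  # simple 90/10 train/val split

np.array(ids[:split], dtype=np.uint16).tofile("train.bin")
np.array(ids[split:], dtype=np.uint16).tofile("val.bin")

# The training side can then memory-map a file back as one flat token sequence:
train_data = np.memmap("train.bin", dtype=np.uint16, mode="r")
print(f"{len(train_data):,} training tokens")
```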

The training script attempts to replicate the smallest model in the GPT-2 family released by OpenAI, the 124M-parameter version:

$ python train.py

To train with PyTorch Distributed Data Parallel (DDP), run the script with torchrun, for example on a single node with 4 GPUs:

$ torchrun --standalone --nproc_per_node=4 train.py
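torchrun launches one Python process per GPU and exposes environment variables such as RANK, LOCAL_RANK, and WORLD_SIZE. A generic sketch of how a training script can pick these up and wrap its model in DDP; this is illustrative boilerplate, not the repository's actual train.py, and the Linear layer merely stands in for the real model:

```python
# Generic DDP setup under torchrun; an illustrative sketch, not nanoGPT's train.py.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

ddp = int(os.environ.get("RANK", -1)) != -1      # torchrun sets RANK; a plain `python` run does not
if ddp:
    dist.init_process_group(backend="nccl")      # one process per GPU, NCCL for GPU collectives
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)
    device = f"cuda:{local_rank}"
else:
    device = "cuda" if torch.cuda.is_available() else "cpu"

model = torch.nn.Linear(16, 16).to(device)       # stand-in for the real GPT model
if ddp:
    model = DDP(model, device_ids=[local_rank])  # gradients get all-reduced across processes

# ... normal training loop here; each process should read a different slice of the data ...

if ddp:
    dist.destroy_process_group()
```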

Users can also sample from the model:

$ python sample.py
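Sampling from a GPT is ordinary autoregressive decoding: feed the context, take the logits at the last position, sample a token, append it, and repeat. A minimal sketch of such a loop with temperature and top-k filtering, in generic PyTorch rather than the repository's sample.py; the model is assumed to return logits of shape (batch, time, vocab):

```python
# Generic autoregressive sampling loop with temperature and top-k; illustrative only.
import torch
import torch.nn.functional as F

@torch.no_grad()
def generate(model, idx, max_new_tokens, block_size, temperature=0.8, top_k=200):
    # idx: (B, T) tensor of token ids serving as the conditioning context
    for _ in range(max_new_tokens):
        idx_cond = idx[:, -block_size:]                  # crop to the model's context length
        logits = model(idx_cond)                         # assumed shape (B, T, vocab_size)
        logits = logits[:, -1, :] / temperature          # only the last position matters
        if top_k is not None:
            v, _ = torch.topk(logits, top_k)
            logits[logits < v[:, [-1]]] = -float("inf")  # mask everything outside the top k
        probs = F.softmax(logits, dim=-1)
        next_id = torch.multinomial(probs, num_samples=1)
        idx = torch.cat([idx, next_id], dim=1)           # append and continue
    return idx
```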

Karpathy says that, at the moment, one night of training on a single A100 40GB GPU reaches a loss of about 3.74, and training on 4 GPUs reaches about 3.60. Training on an 8 x A100 40GB node currently gets down to about 3.1 after 400,000 iterations (roughly one day).

To fine-tune GPT on new text, go to data/shakespeare and look at prepare.py; unlike OpenWebText, it runs in seconds. Fine-tuning itself also takes very little time, for example just a few minutes on a single GPU. An example of running fine-tuning:

$ python train.py config/finetune_shakespeare.py
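The config file passed to train.py bundles the fine-tuning settings. A hypothetical example of the kind of overrides such a file might contain; the names and values below are illustrative, and the real ones live in config/finetune_shakespeare.py in the repository:

```python
# Hypothetical fine-tuning config, illustrating the sort of overrides such a file carries.
out_dir = "out-shakespeare"   # where checkpoints go
init_from = "gpt2"            # start from pretrained GPT-2 weights instead of from scratch
dataset = "shakespeare"       # points at the data prepared by data/shakespeare/prepare.py
batch_size = 1
learning_rate = 3e-5          # much smaller than for pretraining
max_iters = 2000              # fine-tuning needs comparatively few steps
```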

As soon as the project went online, people started trying it out:


Readers who want to try it can refer to the original project.
