
Will LLMs become history? Open-source bGPT may upend the deep learning paradigm: modeling raw binary directly opens a new era of simulating the digital world!

PHPz
Release: 2024-03-13 19:20:08

bGPT, the latest release from Microsoft Research Asia, is a byte-based Transformer model that opens a new door for exploring the digital world.

Unlike traditional vocabulary-based language models, bGPT is unique in that it processes raw binary data directly, unrestricted by specific formats or tasks. It aims to fully simulate the digital world, opening up new possibilities for model development.


Paper: https://www.php.cn/link/ee88b3cea2051be97bcddf2e0d9a28f6

Code: https://www.php.cn/link/359499f804ea7988921bf86c9377fb95

Model: https://www.php.cn/link/4b459ea1a5917be436df5f0bd5b3c4ad

Project homepage: https://www.php.cn/link/71af59614c8b42af334933e9261e53be

In their paper, the research team demonstrates bGPT's broad modeling potential. By operating at the byte level, bGPT can not only generate text, images, and audio, but also simulate computer behavior, including format-conversion algorithms and CPU state modeling. Treating all data as byte sequences lets bGPT integrate different data types into a single framework.
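The "everything is a byte sequence" idea can be illustrated with a minimal sketch (the file path below is hypothetical; any file works the same way):

```python
# Sketch: treating heterogeneous data as one kind of input, as bGPT does.
from pathlib import Path

def to_byte_sequence(path: str, max_len: int = 8192) -> list[int]:
    """Read any file (text, image, audio, binary) as a flat list of
    integer byte values 0-255, truncated to max_len bytes."""
    return list(Path(path).read_bytes()[:max_len])

# Text reduces to exactly the same representation as any other file:
text_bytes = list("hello".encode("utf-8"))
print(text_bytes)  # [104, 101, 108, 108, 111]
```

Because every data type collapses into the same 256-symbol alphabet, one model architecture can in principle cover all of them.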

Upon release, the bGPT paper sparked widespread discussion about the new possibilities this line of work opens up.

Binary data: the DNA of the digital world

Binary data is the cornerstone of the digital world: it runs through computer processors and the electronic products we use every day, and it sits at the core of all data, devices, and software. Building on this foundation, bGPT aims to understand the internal logic of digital systems by studying binary data sequences, and thereby to reproduce and simulate a wide range of complex digital phenomena.

Through byte-level processing, bGPT can handle not only conventional AI generation and understanding tasks but also less traditional applications. For example, it can directly model MIDI, a standard format for transmitting and storing music that previous research avoided modeling directly because of its binary nature.

bGPT, however, is naturally suited to such tasks. It accurately simulates the conversion algorithm for music data, achieving an extremely low error rate (0.0011 bits per byte, BPB) when converting ABC notation to MIDI format.
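Bits per byte (BPB) is the average number of bits a model needs to encode each byte, i.e. the mean negative log2-probability it assigns to the true next byte. A minimal sketch of the metric (the probabilities below are illustrative, not from the paper):

```python
import math

def bits_per_byte(byte_probs: list[float]) -> float:
    """BPB: average -log2(p) over the probabilities the model assigned
    to each actual next byte. Lower means better prediction."""
    return sum(-math.log2(p) for p in byte_probs) / len(byte_probs)

# A model assigning ~0.99924 probability to every correct byte scores
# about 0.0011 BPB, the level reported for ABC-to-MIDI conversion.
print(round(bits_per_byte([0.99924] * 100), 4))  # 0.0011
```

A uniform guess over 256 byte values would score 8.0 BPB, so 0.0011 indicates near-deterministic prediction.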

In practice, bGPT usually converts between ABC notation and MIDI files accurately, and sometimes even corrects errors in the original files, making the conversion more faithful.



Comparing ABC notation automatically converted to MIDI by bGPT (top) with the original MIDI data (bottom) highlights a key difference: the original MIDI data is missing a beat, which breaks the chord accompaniment, while bGPT's conversion correctly fills in the missing beat, keeping the accompaniment smooth.

The research team also treats CPU modeling as a representative task for simulating hardware behavior: the model receives a sequence of low-level machine instructions as input and must accurately predict how the CPU state updates after each instruction, until the program halts.

On this task, bGPT achieved over 99.99% accuracy, showing the power and scalability of byte models in processing native binary data.


Given a program and an initial CPU state, bGPT accurately predicts the complete execution process until the program terminates. In this example, bGPT handles every CPU instruction correctly. For readability, the actual byte sequences have been converted to a human-readable format.
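To make the task concrete, here is a toy state-transition version of it. The instruction set below is invented for illustration; bGPT learns such register-update rules purely from byte sequences, with no explicit simulator:

```python
# Toy CPU-modeling task: given registers and an instruction, produce
# the next machine state. A trained byte model must predict these
# transitions; here we hard-code them to show the input/output shape.
def step(regs: dict, instr: tuple) -> dict:
    op, *args = instr
    regs = dict(regs)           # states are immutable snapshots
    if op == "MOV":             # MOV reg, immediate
        regs[args[0]] = args[1]
    elif op == "ADD":           # ADD dst, src  (dst += src)
        regs[args[0]] += regs[args[1]]
    return regs

state = {"A": 0, "B": 0}
program = [("MOV", "A", 5), ("MOV", "B", 7), ("ADD", "A", "B")]
for instr in program:
    state = step(state, instr)
print(state)  # {'A': 12, 'B': 7}
```

The reported 99.99%+ accuracy means bGPT predicts the equivalent of each `step` output correctly from raw instruction bytes.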

From bytes to everything: breaking boundaries toward unified data modeling

bGPT not only processes native binary data but also integrates multiple data types into a unified model architecture, treating all data as byte sequences.

This approach not only simplifies data modeling but also makes integrating any data source straightforward, with no need to customize the model for specific data types.

In the paper, the research team used conventional text, image, and audio files to demonstrate bGPT's unified data-modeling capability. The bGPT model they trained has about 100 million parameters.

Experimental results show that, compared with similarly sized models such as GPT-2 (text), ViT (vision), and AST (audio), bGPT achieves comparable performance across the different data types.

bGPT performs very well at text generation. Thanks to its byte-level text encoding, the model does not rely on a vocabulary and can therefore support all languages.
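The vocabulary-free property follows directly from UTF-8: every string in any language maps to bytes, so 256 symbols cover everything. A minimal sketch:

```python
# Byte-level "tokenization": no vocabulary file, no out-of-vocabulary
# tokens; every language shares the same 256-symbol alphabet.
def byte_tokenize(text: str) -> list[int]:
    return list(text.encode("utf-8"))

print(byte_tokenize("héllo"))      # 'é' expands to two bytes: 195, 169
print(byte_tokenize("你好"))       # CJK text still yields values 0-255
```

The trade-off is longer sequences: non-ASCII characters cost two to four tokens each, which is part of why bGPT pairs byte encoding with a hierarchical architecture.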

Its hierarchical Transformer architecture, with computational overhead similar to GPT-2's, can generate text up to 8KB long, far exceeding GPT-2's length limit. After pre-training on Wikipedia data, bGPT generates text comparable to GPT-2 in both style and topic, demonstrating strong text-generation ability.

bGPT is pre-trained on the Wikipedia dataset, and the quality and topic consistency of the generated text samples are comparable to GPT-2.

bGPT can generate images by predicting the next byte in a sequence of image bytes. The model is pre-trained on the ImageNet dataset, and the generated images have a resolution of 32x32 pixels.

Although at its current scale it is difficult to accurately capture an image's two-dimensional spatial relationships from byte sequences, which leads to artifacts and noise in the generated images, the textures and lighting effects are usually still fairly accurate.

In addition, these generated images decode normally into BMP files. The research team notes that scaling up bGPT, similar to OpenAI's iGPT approach to pixel-sequence modeling, may enable higher-quality, more realistic image generation.
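"Decodes normally into BMP files" means the generated bytes form a valid image container. A minimal sketch of wrapping raw 32x32 pixel bytes in a BMP (the helper name is ours; it assumes 24-bit BGR rows whose width in bytes is already a multiple of 4, which holds for 32-pixel-wide images):

```python
import struct

def bytes_to_bmp(pixels: bytes, width: int = 32, height: int = 32) -> bytes:
    """Wrap raw 24-bit BGR pixel bytes in a minimal BMP container."""
    row = width * 3                       # bytes per row, no padding needed
    size = 14 + 40 + row * height         # file header + info header + pixels
    header = struct.pack("<2sIHHI", b"BM", size, 0, 0, 54)
    info = struct.pack("<IiiHHIIiiII", 40, width, height, 1, 24,
                       0, row * height, 2835, 2835, 0, 0)
    return header + info + pixels[: row * height]

bmp = bytes_to_bmp(bytes(32 * 32 * 3))    # all-black 32x32 image
print(len(bmp))  # 3126 = 54-byte header + 3072 pixel bytes
```

A byte model generating such a file must get the header fields and the pixel payload right purely from learned statistics.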

These are a set of images generated by bGPT pre-trained on the ImageNet dataset. While the texture and lighting effects of the images are generally accurate, identifying the main objects in these generated images can be challenging.

bGPT treats audio data as a sequence of bytes and can generate 1 second long audio samples with a sampling rate of 8000 Hz.

The model was pre-trained on the LibriSpeech dataset and further fine-tuned and demonstrated on the Speech Commands v2 dataset. The audio samples bGPT generates maintain a high level of accuracy, with some nearly indistinguishable from real audio. Below is a set of examples demonstrating bGPT's audio-generation capabilities.
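The byte budget involved is small: one second of 8-bit mono audio at 8000 Hz is exactly 8000 bytes, which fits bGPT's sequence length. A sketch with a synthetic tone (the sine source is illustrative, not the model's output):

```python
import io, math, struct, wave

def sine_wave_bytes(freq: float = 440.0, rate: int = 8000) -> bytes:
    """One second of 8-bit mono sine audio as raw bytes (values 0-255)."""
    return bytes(int(127 + 120 * math.sin(2 * math.pi * freq * t / rate))
                 for t in range(rate))

samples = sine_wave_bytes()
buf = io.BytesIO()
with wave.open(buf, "wb") as w:     # wrap the bytes in a playable WAV
    w.setnchannels(1)               # mono
    w.setsampwidth(1)               # 8-bit: one byte per sample
    w.setframerate(8000)            # matches the 8000 Hz rate above
    w.writeframes(samples)
print(len(samples))  # 8000 bytes per second of audio
```

Predicting audio byte-by-byte at this rate is tractable; higher sample rates or bit depths would quickly exceed the current 8KB window.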

Exploring the byte-level digital world with bGPT

Traditional language models, however powerful, focus mainly on processing natural-language text. bGPT's byte-based processing mechanism breaks through that limitation, opening up a new category of data processing.

This advance lets bGPT seamlessly handle various data types, including text, images, audio, and even native binary data from algorithms and hardware, paving the way toward fully simulating and understanding the digital world.

Although bGPT has demonstrated compelling capabilities, it is limited by computational overhead: on conventional GPUs it can currently process byte sequences of at most 8KB, a clear constraint for applications that generate or process large amounts of data. Future work will focus on more efficient algorithms and on advances in hardware, aiming to handle larger data sequences.

Technology enthusiasts worldwide are already anticipating bGPT's future potential. From network-pruning optimization and self-learning to the self-reconstruction capabilities of ultra-large-scale networks, these discussions point to a common vision: bGPT may eventually become a unified model capable of processing and outputting all types of byte data, truly a comprehensive simulator of the digital world.


The research team has open-sourced bGPT's code and model. This means you can train bGPT directly on your own datasets, without any changes to the model architecture, and explore the broad prospects of byte models in the digital domain.

Source: 51cto.com