
HuggingFace: Two alpacas are spliced together after removing their heads and tails

WBOY · 2024-01-15 21:09:05

HuggingFace's open-source large model leaderboard has been swept yet again.

The top spots are now occupied exclusively by fine-tuned versions of SOLAR 10.7B, squeezing out the various Mixtral 8x7B fine-tunes that dominated just a few weeks ago.


What is the origin of the SOLAR model?

The accompanying paper has just been posted to arXiv. It comes from the Korean company Upstage AI and introduces a new method for scaling up large models called depth up-scaling (DUS).


To put it simply: take two 7B "alpacas" and trim their heads and tails, cutting the last 8 layers off one copy and the first 8 layers off the other.

The two remaining 24-layer stacks are then stitched together: layer 24 of the first model is spliced directly onto layer 9 of the second, producing a new 48-layer, 10.7B-parameter model.
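As an illustration, here is a minimal sketch of that splice with PyTorch and the transformers library. It assumes a Mistral/Llama-style checkpoint whose decoder blocks live in `model.model.layers`; the checkpoint name and the helper function are placeholders for illustration, not the authors' actual code.

```python
import copy

import torch.nn as nn
from transformers import AutoModelForCausalLM

BASE = "mistralai/Mistral-7B-v0.1"  # a 32-layer, Mistral/Llama-style decoder

def depth_up_scale(base_name: str = BASE, n_trim: int = 8) -> nn.Module:
    """Sketch of depth up-scaling: copy A keeps its first 24 layers,
    copy B keeps its last 24 layers, and the two stacks are concatenated."""
    model_a = AutoModelForCausalLM.from_pretrained(base_name)
    model_b = copy.deepcopy(model_a)

    n_layers = len(model_a.model.layers)                       # 32
    kept_a = list(model_a.model.layers)[: n_layers - n_trim]   # layers 1..24
    kept_b = list(model_b.model.layers)[n_trim:]               # layers 9..32

    # The seam: layer 24 of copy A now feeds directly into layer 9 of copy B.
    model_a.model.layers = nn.ModuleList(kept_a + kept_b)
    model_a.config.num_hidden_layers = len(model_a.model.layers)  # 48
    return model_a

scaled = depth_up_scale()
print(len(scaled.model.layers))  # 48 layers, ~10.7B parameters
```

The spliced weights alone are not the final model; as the article notes below, continued pre-training is what restores and then improves performance.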


The paper claims that the new method outperforms traditional scaling approaches such as MoE, while using exactly the same infrastructure as the base model.

There is no need for extra modules such as gating networks, no need for a training framework optimized for MoE, and no need for custom CUDA kernels to get fast inference; the model slots seamlessly into existing pipelines while staying efficient.

The team chose Mistral 7B, the strongest single 7B model, as the raw material, spliced it with the new method, and surpassed both the original model and its MoE variant.

At the same time, the aligned Instruct version also surpasses the corresponding MoE Instruct version.


Seeing the splicing through to the end

Why splice it this way? According to the paper, the idea came from an intuition.

Start with the simplest way to expand a model: repeat the 32-layer base model twice to get 64 layers.

The advantage is that there is no heterogeneity, since every layer comes from the base model. The problem is the seam: layer 32 sits right next to layer 33 (which is identical to layer 1), so there is a large "layer distance" between adjacent layers.

Previous research has shown that different layers of Transformer do different things. For example, deeper layers are better at processing more abstract concepts.

The team believes that too large a layer distance may hinder the model's ability to effectively utilize pre-trained weights.

One potential solution is to sacrifice some of the middle layers, shrinking the mismatch at the seam, and this is where the DUS method comes from.

Trading off performance against model size, the team chose to delete 8 layers from each copy, so the seam changes from layer 32 → layer 1 to layer 24 → layer 9.
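A quick back-of-the-envelope check of the two seams described above (purely illustrative arithmetic, not code from the paper):

```python
N_LAYERS, TRIM = 32, 8

# Naive duplication: layers 1..32 followed by layers 1..32 again (64 layers).
naive = list(range(1, N_LAYERS + 1)) * 2
# DUS: layers 1..24 from copy A followed by layers 9..32 from copy B (48 layers).
dus = list(range(1, N_LAYERS - TRIM + 1)) + list(range(TRIM + 1, N_LAYERS + 1))

def seam(order, left_len):
    """Original layer indices that end up adjacent at the seam."""
    return order[left_len - 1], order[left_len]

print(len(naive), seam(naive, N_LAYERS))     # 64, (32, 1): layer distance 31
print(len(dus), seam(dus, N_LAYERS - TRIM))  # 48, (24, 9): layer distance 15
```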

Right after splicing, the model still performs worse than the original base model, but it recovers quickly with continued pre-training.

In the instruction fine-tuning phase, the team used open-source datasets plus a math-enhanced dataset they built themselves, and applied DPO in the alignment phase.

The final step is a weighted average of the model versions trained on different datasets, which completes the stitching.
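That weighted average can be sketched as a simple "model soup" over state dicts. The checkpoint names and uniform weights below are assumptions for illustration, not the team's actual recipe.

```python
import torch
from transformers import AutoModelForCausalLM

# Hypothetical fine-tuned variants sharing the same 48-layer architecture.
CHECKPOINTS = ["solar-sft-mix-a", "solar-sft-mix-b", "solar-dpo"]

def average_checkpoints(names, weights=None):
    """Weighted average of identically shaped state dicts."""
    weights = weights or [1.0 / len(names)] * len(names)
    merged = None
    for name, w in zip(names, weights):
        sd = AutoModelForCausalLM.from_pretrained(name, torch_dtype=torch.float32).state_dict()
        if merged is None:
            merged = {k: w * v for k, v in sd.items()}
        else:
            for k, v in sd.items():
                merged[k] += w * v
    # Load the merged weights back into one copy of the architecture.
    model = AutoModelForCausalLM.from_pretrained(names[0], torch_dtype=torch.float32)
    model.load_state_dict(merged)
    return model

merged_model = average_checkpoints(CHECKPOINTS)
```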


Some netizens questioned the possibility of test data leakage.


The team anticipated this and reported data-contamination test results in the paper's appendix, which show a low level of contamination.


Finally, both the SOLAR 10.7B base model and the fine-tuned model are open-sourced under the Apache 2.0 license.

Netizens who have tried it report that it performs well at extracting data from JSON.
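For anyone who wants to try that JSON-extraction use case, here is a minimal sketch with the transformers library; the model ID, chat-template usage, and prompt are illustrative assumptions, not an official example.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "upstage/SOLAR-10.7B-Instruct-v1.0"  # assumed HF model ID for the Apache 2.0 release

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.float16, device_map="auto")

record = '{"user": {"name": "Ada", "orders": [{"id": 17, "total": 42.5}]}}'
messages = [{
    "role": "user",
    "content": f"Extract the order id and total from this JSON and answer in one line:\n{record}",
}]

# Uses the checkpoint's chat template (assumed to be defined for the Instruct model).
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```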


Paper address: https://arxiv.org/abs/2312.15166

