
Domestic open-source MoE model makes a splash: GPT-4-level capabilities at one percent of the API price

PHPz
Released: 2024-05-07 17:34:01

The latest large-scale domestic open-source MoE model shot to popularity right after its debut.

DeepSeek-V2 performs at GPT-4 level, yet it is open source, free for commercial use, and its API costs only about one percent of GPT-4-Turbo's.

Unsurprisingly, it triggered plenty of discussion as soon as it was released.


Judging from the published benchmarks, DeepSeek-V2's comprehensive Chinese ability surpasses that of many open-source models, putting it in the first tier alongside closed-source models such as GPT-4-Turbo and Wenxin 4.0.

Its comprehensive English ability is likewise in the first tier with LLaMA3-70B, and it surpasses Mixtral 8x22B, a fellow MoE model.

It also performs well in knowledge, mathematics, reasoning, and programming, and supports a 128K context window.


Ordinary users can use these capabilities for free: the beta is open for registration, and you can try it immediately after signing up.


The API pricing is even more disruptive: 1 yuan per million input tokens and 2 yuan per million output tokens (32K context), roughly one percent of GPT-4-Turbo's price.
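For a quick sense of the gap, here is a back-of-the-envelope comparison. This is a sketch only: the GPT-4-Turbo prices are the commonly cited $10/$30 per million input/output tokens, and the exchange rate is an assumption.

```python
# Back-of-the-envelope API cost comparison (illustrative assumptions:
# GPT-4-Turbo at $10/$30 per million input/output tokens, 1 USD ~ 7.2 CNY).
USD_TO_CNY = 7.2

deepseek = {"input": 1.0, "output": 2.0}  # CNY per million tokens
gpt4_turbo = {"input": 10 * USD_TO_CNY, "output": 30 * USD_TO_CNY}

def cost_cny(prices, m_in, m_out):
    """Cost in CNY for m_in / m_out million input / output tokens."""
    return prices["input"] * m_in + prices["output"] * m_out

# Example workload: 10M input tokens, 2M output tokens.
ds, gpt = cost_cny(deepseek, 10, 2), cost_cny(gpt4_turbo, 10, 2)
print(f"DeepSeek-V2: {ds:.0f} CNY, GPT-4-Turbo: {gpt:.0f} CNY "
      f"({ds / gpt:.1%} of the cost)")
```

With this hypothetical workload the ratio lands at about 1.2%, consistent with the "one percent" headline.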

The model architecture is also novel: the self-developed MLA (Multi-head Latent Attention) and a self-developed sparse MoE structure greatly reduce computation and inference memory.

Netizens marveled: DeepSeek always brings surprises!


We were among the first to try it out.

Hands-on test

At present, the V2 beta offers two modes: general dialogue and a code assistant.


In general dialogue mode, you can probe the model's logic, knowledge, generation, mathematics, and other abilities.

For example, you can ask it to write lipstick marketing copy in the style of The Legend of Zhen Huan.


It can also explain what quantum entanglement is in accessible terms.


In mathematics, it can answer advanced calculus questions, for example:

Use calculus to prove the infinite series representation of e, the base of the natural logarithm.
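For reference, the expected derivation: expand e^x as a Taylor series around 0 and evaluate at x = 1.

```latex
e^{x} = \sum_{n=0}^{\infty} \frac{x^{n}}{n!}
      = 1 + x + \frac{x^{2}}{2!} + \frac{x^{3}}{3!} + \cdots
\quad\Longrightarrow\quad
e = \sum_{n=0}^{\infty} \frac{1}{n!}
  = 1 + 1 + \frac{1}{2!} + \frac{1}{3!} + \cdots
```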


It can also sidestep some linguistic logic traps.


Testing shows that DeepSeek-V2's knowledge has been updated through 2023.


For code, the beta page indicates that DeepSeek-Coder-33B is used to answer coding questions.

For simpler code generation, it made no errors across several hands-on tests.


It can also explain and analyze code it is given.


That said, it also gave some wrong answers in testing.

On the following logic puzzle, DeepSeek-V2 slipped mid-calculation: it took the time for a candle lit at both ends to burn out as one quarter of the single-ended burn time, when the correct factor is one half.
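The correct reasoning: if one flame consumes the candle in time T, two flames consume it at twice the rate.

```latex
\text{combined rate} = \frac{1}{T} + \frac{1}{T} = \frac{2}{T}
\quad\Longrightarrow\quad
t_{\text{both ends}} = \frac{T}{2}, \ \text{not } \frac{T}{4}.
```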


What upgrades does it bring?

According to the official introduction, DeepSeek-V2 has 236B total parameters with 21B activated per token, roughly matching the capability of 70B~110B dense models.
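That gap between total and activated parameters is the essence of MoE: each token is routed to only a few experts, so compute scales with the activated subset (21B of 236B, roughly 9%). Below is a generic top-k routing sketch for illustration only; it is not DeepSeek's implementation, which additionally uses shared experts and fine-grained expert segmentation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Generic top-k MoE layer (illustrative; DeepSeek-V2's actual DeepSeekMoE
# design adds shared experts and load-balancing losses, omitted here).
class TopKMoE(nn.Module):
    def __init__(self, d_model=512, d_ff=1024, n_experts=64, k=6):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                          # x: (tokens, d_model)
        scores = F.softmax(self.gate(x), dim=-1)
        topv, topi = scores.topk(self.k, dim=-1)   # route each token to k experts
        topv = topv / topv.sum(-1, keepdim=True)   # renormalize gate weights
        out = torch.zeros_like(x)
        for slot in range(self.k):                 # only k of n_experts run per token
            for e in topi[:, slot].unique().tolist():
                mask = topi[:, slot] == e
                out[mask] += topv[mask, slot:slot + 1] * self.experts[e](x[mask])
        return out
```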


Compared with the earlier DeepSeek 67B, it is both stronger and cheaper to train: training cost drops by 42.5%, the KV cache shrinks by 93.3%, and maximum generation throughput rises to 5.76 times.

Officially, this means the GPU memory consumed by DeepSeek-V2's KV cache is only 1/5 to 1/100 of a same-class dense model's, significantly cutting the cost per token.
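The saving comes from what gets cached: standard attention stores full per-head keys and values for every past token, while MLA caches only a small compressed latent per token. A rough size comparison follows; the layer, head, and latent numbers are purely illustrative, not V2's published configuration.

```python
# Rough KV-cache size arithmetic (illustrative shapes, not V2's real config).
# A standard multi-head attention layer caches K and V per token:
def kv_bytes_per_token(n_layers, n_heads, head_dim, dtype_bytes=2):
    return 2 * n_layers * n_heads * head_dim * dtype_bytes  # K and V

# An MLA-style layer instead caches one compressed latent per token:
def mla_bytes_per_token(n_layers, latent_dim, dtype_bytes=2):
    return n_layers * latent_dim * dtype_bytes

dense = kv_bytes_per_token(n_layers=60, n_heads=64, head_dim=128)  # assumed
mla = mla_bytes_per_token(n_layers=60, latent_dim=512)             # assumed
print(f"dense-style KV: {dense / 2**10:.0f} KiB/token, "
      f"MLA latent: {mla / 2**10:.0f} KiB/token "
      f"({1 - mla / dense:.1%} smaller)")
```

With these toy shapes the latent cache is about 97% smaller, in the same ballpark as the 93.3% figure above.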

Extensive communication optimization was done specifically for the H800. Deployed on an 8-GPU H800 machine, input throughput exceeds 100,000 tokens per second and output exceeds 50,000 tokens per second.


On standard benchmarks, the DeepSeek-V2 base model also holds up well.


DeepSeek-V2 adopts an innovative architecture. It proposes MLA (Multi-head Latent Attention), which significantly reduces computation and inference memory, and pairs it with a self-developed sparse MoE structure that further cuts computation, as sketched below.
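A minimal sketch of the low-rank idea behind MLA, under simplified assumptions (it omits details from the paper such as decoupled rotary position embeddings): project the hidden state down to a small latent, cache only that latent, and re-expand it into keys and values at attention time.

```python
import torch
import torch.nn as nn

# Minimal MLA-style KV compression sketch (conceptual; omits decoupled RoPE
# and other details from the DeepSeek-V2 paper).
class LatentKV(nn.Module):
    def __init__(self, d_model=512, n_heads=8, d_head=64, d_latent=64):
        super().__init__()
        self.n_heads, self.d_head = n_heads, d_head
        self.down = nn.Linear(d_model, d_latent, bias=False)  # compress
        self.up_k = nn.Linear(d_latent, n_heads * d_head, bias=False)
        self.up_v = nn.Linear(d_latent, n_heads * d_head, bias=False)

    def forward(self, h):                  # h: (batch, seq, d_model)
        b, t, _ = h.shape
        latent = self.down(h)              # (b, t, d_latent) -- all that is cached
        k = self.up_k(latent).view(b, t, self.n_heads, self.d_head)
        v = self.up_v(latent).view(b, t, self.n_heads, self.d_head)
        return latent, k, v
```

Caching `latent` instead of `k` and `v` shrinks the per-token cache by a factor of about 2 * n_heads * d_head / d_latent, or 16x with the toy sizes above.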


Some observers noted that these upgrades could be a big help for large-scale computing in data centers.


As for API pricing, DeepSeek-V2 undercuts almost every high-profile model on the market.


The team states that the DeepSeek-V2 model and paper are fully open-sourced, with model weights and a technical report made available.

Register on the DeepSeek API open platform and you receive 10 million input tokens and 5 million output tokens as a welcome gift; normal trial use is completely free.
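As a sketch, calling the API through the OpenAI-compatible client looks roughly like this. The base URL and model name follow DeepSeek's public docs but should be verified against the current documentation, and the key is a placeholder.

```python
from openai import OpenAI  # pip install openai

# DeepSeek exposes an OpenAI-compatible endpoint; base_url and model name
# here follow its public docs and should be checked before use.
client = OpenAI(api_key="YOUR_DEEPSEEK_API_KEY",
                base_url="https://api.deepseek.com")

resp = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user",
               "content": "Explain quantum entanglement simply."}],
)
print(resp.choices[0].message.content)
```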
