
Open source large models are bound to surpass closed source - LeCun reveals 2024 AI trend chart

PHPz
Release: 2023-12-15 10:23:55

2023 is drawing to a close. Over the past year, large models of every kind have been released. While technology giants such as OpenAI and Google compete with each other, another force has been quietly rising: open source.

Open source models have long faced skepticism: are they as good as proprietary models? Can they match proprietary performance? Until now, the honest answer has been "only somewhat close." Even so, open source models keep delivering measurable gains, and that steady progress has earned real admiration.

The rise of open source models is changing the rules of the game. Meta's LLaMA series, for example, is gaining popularity for its rapid iteration, customizability, and privacy. The community is building on these models at a remarkable pace, mounting a serious challenge to proprietary models and potentially reshaping the competitive landscape among the big technology companies.

Until now, though, most people's impressions were based on gut feeling rather than data. This morning, Meta chief AI scientist and Turing Award winner Yann LeCun remarked: "Open source artificial intelligence models are on the road to surpassing proprietary models."


The trend chart he shared, produced by the ARK Invest team, is seen as a possible preview of how artificial intelligence will develop in 2024. It tracks the progress of the open source community against proprietary models in generative AI.

[Chart: ARK Invest's comparison of open source and proprietary generative AI model performance over time]

As companies like OpenAI and Google become more closed off, they disclose information about their latest models less and less frequently. In response, the open source community and its corporate backer Meta are stepping in to democratize generative AI, which may pose a challenge to the business model behind proprietary offerings.

The scatter plot shows the performance of various AI models as percentages, with proprietary models in blue and open source models in black. It charts models such as GPT-3 (OpenAI), Chinchilla 70B (Google DeepMind), PaLM (Google), GPT-4 (OpenAI), and LLaMA-65B (Meta) at different points in time.

When Meta originally released LLaMA, the models ranged from 7 billion to 65 billion parameters, and their performance was excellent: the 13-billion-parameter LLaMA outperforms GPT-3 (175 billion parameters) "on most benchmarks" and can run on a single V100 GPU, while the largest 65-billion-parameter LLaMA is comparable to DeepMind's Chinchilla-70B and Google's PaLM-540B.
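As a rough sanity check on the single-V100 claim, here is a back-of-the-envelope estimate of the memory needed just to hold a 13-billion-parameter model's weights. The half-precision (fp16) assumption and the 32 GB V100 variant are our own assumptions, not details from the article or the chart:

```python
# Rough memory estimate for the weights of a 13B-parameter model at inference time.
# Activations and the KV cache are ignored here, so the real footprint is higher.
params = 13e9            # 13 billion parameters
bytes_per_param = 2      # assumption: fp16/bf16 weights
weight_gb = params * bytes_per_param / 1024**3
print(f"~{weight_gb:.0f} GB of weights")  # ~24 GB, which fits in a 32 GB V100
```

With 8-bit or 4-bit quantization the footprint shrinks further still, which helps explain how quickly the community got these models running on commodity hardware.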

Falcon-40B shot to the top of Hugging Face's Open LLM Leaderboard as soon as it was released, ending the period in which LLaMA stood out alone.


The open source release of Llama 2 once again reshuffled the large model landscape. Compared with Llama 1, Llama 2 is trained on 40% more data, doubles the context length, and adopts grouped-query attention.
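For readers unfamiliar with the term, here is a minimal sketch of grouped-query attention, in which several query heads share a single key/value head to shrink the KV cache. The head counts and tensor shapes below are illustrative, not Llama 2's actual configuration:

```python
import torch
import torch.nn.functional as F

def grouped_query_attention(q, k, v):
    """q: (batch, seq, n_q_heads, dim); k, v: (batch, seq, n_kv_heads, dim),
    where n_kv_heads divides n_q_heads."""
    b, s, n_q_heads, d = q.shape
    group = n_q_heads // k.shape[2]               # query heads per shared KV head
    k = k.repeat_interleave(group, dim=2)         # broadcast each KV head to its group
    v = v.repeat_interleave(group, dim=2)
    q, k, v = (t.transpose(1, 2) for t in (q, k, v))   # -> (batch, heads, seq, dim)
    scores = (q @ k.transpose(-2, -1)) / d ** 0.5
    out = F.softmax(scores, dim=-1) @ v
    return out.transpose(1, 2).reshape(b, s, n_q_heads * d)

# Illustrative shapes: 8 query heads sharing 2 key/value heads.
q = torch.randn(1, 16, 8, 64)
k = torch.randn(1, 16, 2, 64)
v = torch.randn(1, 16, 2, 64)
print(grouped_query_attention(q, k, v).shape)     # torch.Size([1, 16, 512])
```

Compared with standard multi-head attention, where every query head has its own key/value projections, this keeps the modelling capacity of many query heads while storing far fewer key/value heads, which mainly pays off in inference memory and speed.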

Recently, the open source large model world gained another heavyweight member: the Yi series. It can process 400,000 Chinese characters in a single pass and sits at the top of both Chinese and English leaderboards; Yi-34B has also become the only China-developed model so far to top the Hugging Face open source model leaderboard.

According to the scatter plot, open source models are steadily closing the gap with proprietary ones, which suggests that in the near future open source models could match or even surpass proprietary performance. The chart drew high praise from researchers, one of whom declared that "the closed-source large model has come to an end."

Some netizens have already started hoping that 2024 will become "the year of open source artificial intelligence." One argued: "We are approaching a tipping point. Given the current pace of open source community projects, we expect to reach GPT-4-level performance within the next 12 months."

Next, we will wait and see whether the road ahead for open source models is smooth, and what kind of performance they will deliver.


