
Yuanxiang XVERSE-65B: The largest open source model in China is here, with high performance and unconditional free commercial use

PHPz
Release: 2023-11-06 15:33:20

Several large models with 7 to 13 billion parameters have previously been open sourced in China; implementation results have emerged and an open source ecosystem has taken initial shape. But as the complexity and data volume of tasks such as agents grow, industry and community demand for larger models is becoming increasingly urgent.

Research shows that the more parameters and high-quality training data a model has, the more its performance can continue to improve. The general industry consensus is that once a model crosses the 50-to-60-billion-parameter threshold, capabilities "emerge" and the model demonstrates powerful performance across many tasks. However, training a model of this magnitude is expensive and technically demanding, so such models are currently offered mainly as closed-source paid services.

In the foreign open source ecosystem, benchmark models such as Llama2-70B and Falcon-180B are only conditionally open source, with commercial caps on monthly active users or revenue, and they have obvious shortcomings in Chinese-language capability due to a lack of Chinese training data. In addition, the AI chip ban recently promulgated by the United States may further constrain the pace of China's large model industry. The industry urgently needs a high-performance domestic large model to fill this ecological gap and provide Chinese-language applications with stronger understanding, reasoning and long-text generation capabilities.

Against this backdrop, Yuanxiang XVERSE announced the open sourcing of a 65-billion-parameter high-performance general-purpose large model. In addition, the 13B model has been fully upgraded, raising the capability ceiling of smaller models. This gives a large number of small and medium-sized enterprises, researchers and AI developers earlier access to "large model freedom": they can freely use, modify or distill the Yuanxiang large models according to their compute, resource constraints and specific task requirements, driving breakthroughs in research and applied innovation.

Model address: https://huggingface.co/xverse/XVERSE-65B
Yuanxiang XVERSE founder Yao Xing said: "Facing challenges such as a tight R&D timeline and a continued shortage of computing power, the team relied on its rich experience to develop multiple high-performance 7B and 13B models within three months, and was the first to present a promising 65B model to the community, creating triple value for research, business and the ecosystem."

Specifically, the 65B model can have the following positive impacts:

  • In R&D, 65B provides a "big lever" of new technologies, new tools, performance optimization and model security, letting the community accumulate experience quickly and helping advance the long-term goal of independent, controllable national science and technology.
  • Commercially, a large number of small and medium-sized enterprises gain "big tools" at zero cost, breaking through limitations and enabling significant application innovation. Yuanxiang also shares insights into use cases, secure model deployment and potential opportunities.
  • In the developer ecosystem, the community can fully leverage the advantages of organizational synergy and drive a "Cambrian explosion" of R&D applications.
# Unconditionally free for commercial use; full-chain self-development with multiple technological innovations

The model supports more than 40 languages. Adhering to its high-performance positioning, Yuanxiang has significantly improved 65B's capabilities in three respects:

  • Strengthens basic capabilities such as understanding, generation, reasoning and memory, taking the model's diversity, creativity and precision from excellent to powerful;
  • Expands capabilities such as tool calling, code explanation, and reflection and correction, laying a technical foundation for building intelligent agents and improving the model's practicality;
  • Significantly alleviates the hallucination problems that are common, and potentially serious, in 7B and 13B models, improving accuracy and professionalism.

The Yuanxiang large model series is self-developed across the entire chain, covering a number of key technologies and R&D innovations:

1. Complex distributed system design: Drawing on the team's rich experience with large systems such as Tencent's Go AI "Jue Yi" and the Honor of Kings AI "Jue Wu", Yuanxiang independently developed key technologies including efficient operators, memory optimization, parallel scheduling strategies, data-computation-communication overlap, and platform-framework collaboration, building an efficient and stable training system whose peak compute utilization on a thousand-GPU cluster reaches 58.5%, among the best in the industry.
2. Comprehensive performance improvements: FlashAttention2 accelerates computation in 65B training, and virtual pipeline technology on top of 3D parallelism reduces the excessive bubble rate produced by long pipelines, improving training and inference efficiency. The context window was gradually extended from 8K to 16K, enabling the model not only to complete complex tasks such as long-text understanding, long-text generation and very long dialogues, but also to expand tool calling, code explanation, and reflection and correction capabilities, better supporting the construction of intelligent agents (AI Agents).

3. Greatly improved training stability: Given the huge amount of computation, communication congestion, chip overheating and compute-node failures were the norm during 65B training, with up to eight failures a week in the early days. Through continuous optimization of cluster infrastructure operations, resource scheduling, the training framework and scheduling-platform collaboration, Yuanxiang built a highly stable, low-interruption, fault-tolerant training system, raising the weekly effective training rate to 98.6%.

In addition, midway through training on nearly 1.6 trillion tokens, the loss function produced NaN values, which could have interrupted training. The usual industry practice is to delete the relevant data intervals after analysis. Based on experience, the team judged this to be a natural evolution of the model, chose not to delete the data, and instead directly skipped the affected parameter updates; the NaN problem was eventually resolved. Further analysis of intermediate states such as parameter values, activation values and gradients suggested the problem was related to a change in the maximum activation value of the model's last transformer block, and that it would resolve itself as that maximum gradually decreased.
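The skip-on-NaN strategy described above can be sketched as a PyTorch-style training step (a minimal illustration, not Yuanxiang's actual training loop; the loss-returning model, the optimizer and the clipping threshold are stand-ins):

```python
import torch

def train_step(model, batch, optimizer):
    """One training step that skips the parameter update when the loss is NaN/Inf.

    Mirrors the strategy described above: keep the data, drop only the
    bad update, and let training continue.
    """
    optimizer.zero_grad(set_to_none=True)
    loss = model(batch)  # assume the model returns a scalar loss
    if not torch.isfinite(loss):
        # Loss is NaN or Inf: skip backward/step entirely for this batch.
        return None
    loss.backward()
    # Gradient clipping guards against the overflow recurring on later steps.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
    return loss.item()
```

In a real large-scale run the same check would typically be combined with mixed-precision loss scaling and logging of which steps were skipped.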


R&D experience from investigating the NaN-loss problem

# Comprehensive evaluation: 65B performance comparable to GPT-3.5

To help the industry form a comprehensive, objective and long-term understanding of the Yuanxiang large models' performance, the researchers drew on a series of authoritative academic evaluations to assemble a system of 11 mainstream authoritative benchmarks spanning six dimensions: question answering, understanding, knowledge, reasoning, mathematics and code. This system will continue to be used and iterated.

XVERSE-65B has no domestic model of the same scale to compare against. In comparative evaluations against foreign benchmarks, it surpassed GPT-3.5 on some metrics and matched it in overall performance; it comprehensively surpassed the open source benchmarks Llama2-70B and Falcon-180B; a gap with GPT-4 remains.


The upgraded 13B model adds a large amount of high-quality data, with training data reaching 3.2 trillion tokens, greatly raising the capability ceiling of smaller models. It is strong in both humanities and sciences: while maintaining its advantage on humanities tasks, question answering improved by 18%; the sciences made great strides, with code improving by 149% and mathematics by 198%. In evaluations it comprehensively surpassed domestic and foreign open source benchmarks such as Llama2 and Baichuan2.

XVERSE-13B-2 evaluation results

# Opening a new era of large model applications

The Yuanxiang large models can be downloaded by searching for "XVERSE" on GitHub, Hugging Face, ModelScope and other platforms. After simple registration, they are unconditionally free for commercial use, meeting most application and iteration needs of small and medium-sized enterprises, research institutions and individual developers.
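For developers trying the model, a typical loading flow via Hugging Face `transformers` looks roughly like the sketch below. This is an illustration rather than official usage: it assumes a recent `transformers` release, enough GPU memory to host a 65B checkpoint, and that the repo ships custom model code requiring `trust_remote_code=True`; the `generate` helper and its prompt are hypothetical, so check the model card for the exact recommended steps.

```python
MODEL_ID = "xverse/XVERSE-65B"  # Hugging Face repo named in the article

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Download XVERSE-65B and produce a completion (needs substantial GPU memory)."""
    # Imported lazily so this module can be inspected without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",    # keep the checkpoint's native precision
        device_map="auto",     # shard weights across available GPUs
        # attn_implementation="flash_attention_2",  # optional, on supported hardware
        trust_remote_code=True,
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Introduce three Beijing attractions:"))
```

The commented-out `attn_implementation` line corresponds to the FlashAttention2 acceleration mentioned earlier; it only applies on GPUs and `transformers` versions that support it.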

Yuanxiang also provides a full range of technical services including model training, inference, deployment and fine-tuning, empowering industries such as entertainment, finance and healthcare, and helping create industry-leading user experiences in scenarios such as intelligent customer service, creative writing and precise recommendation. In October 2023, Tencent Music was the first to announce a strategic cooperation with Yuanxiang, jointly launching the lyraXVERSE accelerated model and comprehensively upgrading its music assistant "AI Xiaoqin"; the two will continue to explore cutting-edge AI and 3D technologies to lead innovation in music entertainment. Yao Xing said: "With continuous exploration of cutting-edge technology as its driving force, the XVERSE open source series is committed to promoting domestic substitution and continuous technological innovation in large models, injecting strong impetus into the development of the real economy and the digital economy. We look forward to joining hands with enterprises and developers to open up a new era of large model applications together!"

About Yuanxiang

Yuanxiang XVERSE was founded in Shenzhen in early 2021. A leading domestic AI and 3D technology service company, it is committed to building an AI-driven one-stop platform for 3D content production and consumption, with the vision of "defining your world".

Official website: www.
Yao Xing, founder of Yuanxiang, is a former vice president of Tencent, the founder of Tencent AI Lab, and a member of the Ministry of Science and Technology's New Generation Artificial Intelligence Strategic Advisory Committee. Yuanxiang has deep accumulation and a complete layout in 3D and AI technology. In the 3D field, it has independently developed industry-leading "device-cloud collaboration" 3D interactive technology, creating a zero-threshold (light), one-stop (fast), high-quality (beautiful) new 3D experience; in the AI field, it has open sourced XVERSE-65B, China's largest commercially usable large model by parameter count, to promote the domestic substitution and industry application of large models.


Source: jiqizhixin.com