With the arrival of large AI models, server prices have soared 20 times, and the investment value of the sector is highlighted?

May 25, 2023 pm 12:30 PM

In the era of intelligence, making good use of AI has become a core competitiveness for countries, industries, and enterprises. Especially since the birth of large models represented by ChatGPT, the AI industry has shifted into high gear, and demand for computing power has exploded.

As part of the computing power infrastructure, AI servers are expected to benefit from continuously rising demand for computing power and grow rapidly, and the sector's market value is coming into focus.

Since the beginning of this year, AI server prices have kept rising and have become a market focus. One company revealed that an artificial intelligence server it purchased in June last year has risen in price nearly 20 times in less than a year.

01 AI large model boom is coming

The fundamental driver of the sharp rise in AI server prices is the explosion in market demand.

In recent years, with the rapid development of artificial intelligence technology, AIGC applications built on large models have become an inevitable trend. The emergence of the phenomenal AI application ChatGPT in November 2022 quickly triggered a global investment boom.

Currently, an arms race among the major technology giants over large AI models is underway. Major giants at home and abroad, such as Microsoft, Google, and Amazon, have almost all invested in developing large-scale AI applications.

Domestically, since Baidu took the lead in announcing "Wen Xin Yi Yan" (ERNIE Bot) on March 16, Alibaba, 360, SenseTime and other companies have also successively demonstrated progress on their large model projects, and the domestic large model field has suddenly become a hive of activity.

The investment boom in large AI models continues to heat up. Realizing large AI models requires massive data and powerful computing power to support training and inference, so demand for AI computing power will grow exponentially.

Huawei predicts that by 2030, computing power demand driven by the AI boom will be 500 times that of 2020. The main investment opportunities will center on servers, optical modules, computing chips, data centers and other hardware, bringing huge opportunities.
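As a rough sanity check on the scale of that forecast (the 500x figure comes from the article; the arithmetic below is purely illustrative, not a prediction), a 500-fold increase over the decade 2020-2030 implies demand roughly doubling every year:

```python
# Huawei's forecast: 500x growth in AI computing power demand, 2020 -> 2030.
# The implied compound annual growth rate over 10 years:
factor = 500
years = 10
annual_growth = factor ** (1 / years) - 1
print(f"Implied annual growth: {annual_growth:.0%}")  # roughly 86% per year
```

In other words, the forecast assumes computing power demand grows close to 90% every single year for a decade.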

Among them, AI servers, as key computing power equipment, are expected to see rapid development in the AI era. According to IDC data, China's AI server market was US$5.7 billion in 2021, a year-on-year increase of 61.6%, and is expected to grow to US$10.9 billion by 2025, a CAGR of 17.5%.
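The IDC figures are internally consistent; a quick check using the values cited above:

```python
# IDC: China's AI server market was USD 5.7B in 2021,
# projected to reach USD 10.9B in 2025.
start, end = 5.7, 10.9   # billions of USD
years = 4                # 2021 -> 2025
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~17.6%, matching the cited 17.5%
```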

02 Core component GPUs are “hard to find”

The booming AI server market is experiencing a severe supply shortage of its core component, the GPU (graphics processing unit, used as an acceleration chip), causing GPU prices to keep soaring. Driven up by component costs, AI server prices have risen accordingly.

It is reported that the computing power provided by the CPU (central processing unit) in general-purpose servers cannot meet the needs of AI applications, while the GPU offers real-time, high-speed parallel computing and floating-point capability, making it better suited to compute-intensive workloads such as AI training/inference and machine learning.

At the same time, a traditional server is usually equipped with at most 4 CPUs, plus matching memory and hard drives. An AI server often needs 2 CPUs and 8 GPUs, and some high-end models require as many as 16 GPUs. In other words, the shift to AI servers multiplies demand for GPUs.
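To make that multiplication concrete, here is a small sketch using the configurations described above (the 1,000-server fleet size is an assumed, purely illustrative number):

```python
# Chip counts per server, as described in the article. The fleet size of
# 1,000 servers is an illustrative assumption, not a figure from the text.
traditional = {"cpus": 4, "gpus": 0}   # traditional server: CPUs only
ai_server = {"cpus": 2, "gpus": 8}     # typical AI server
high_end = {"cpus": 2, "gpus": 16}     # high-end AI server

fleet = 1_000
print(f"Traditional fleet: {fleet * traditional['gpus']} GPUs")
print(f"AI-server fleet:   {fleet * ai_server['gpus']} GPUs")
print(f"High-end fleet:    {fleet * high_end['gpus']} GPUs")
```

Replacing a traditional fleet with typical AI servers takes the GPU count from zero to 8,000 per thousand machines, which is why GPU demand scales so sharply with AI server penetration.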

From a market size perspective, considering that the unit price of an AI server is more than 20 times that of an ordinary server, the market expects that as AI server penetration rises, GPUs will have huge future market potential. Verified Market Research estimates that the global GPU market will reach US$185.3 billion by 2027, with the Chinese market reaching US$34.6 billion in the same year.

On the supply side, the United States continues to curb the development of China's AI industry, restricting major GPU manufacturers such as NVIDIA (with a market share as high as 80%) and AMD from selling high-performance GPUs to China.

Upstream GPU shortages in China are severe, and with server manufacturers short of core components, AI server production is naturally affected.

Against this backdrop, GPU localization is urgent. At present, companies represented by leading communications equipment makers such as ZTE are accelerating their deployment of GPU servers.

At its 2022 annual results briefing, ZTE Executive Director and President Xu Ziyang stated that a GPU server supporting large-bandwidth, ChatGPT-style large model training will be launched by the end of this year.

Xu Ziyang said there are three main steps. First, a new generation of computing infrastructure products: ZTE plans to launch, by the end of this year, GPU servers supporting large-bandwidth, ChatGPT-style large model training, including AI servers and high-performance switches. Second, at the software level, ZTE will build its capabilities into its Digital Nebula solution. Third, ZTE will develop its own new generation of AI chips to reduce inference costs.

Many institutions believe ZTE stands to profit from the AI wave set off by ChatGPT. According to IDC's preliminary "China Server Market Tracking Report" for the fourth quarter of 2022, ZTE's market share rose from 3.1% to 5.3%, ranking among the top five in the country.

Conclusion:

With the successive launch of large models downstream, demand for AI computing power is growing rapidly. In terms of industry trends, GPU localization is accelerating, but domestic GPU products still lag overseas leaders such as NVIDIA in many respects, making breakthroughs all the more difficult.

For this reason, the market also speculates that AI server prices may continue to increase in the future.

Author: Bottle


