
Alert! The energy consumption crisis triggered by ChatGPT's explosive popularity poses huge challenges to data center operators

By 王林 | Published 2023-04-11 15:58:03

Recently, ChatGPT, the intelligent chat tool from the American company OpenAI, has taken social media by storm, attracting more than 10 billion U.S. dollars in investment and driving a boom in artificial intelligence applications in the capital markets. It has been in the spotlight for a while now.


Microsoft was the first to announce a $10 billion investment in OpenAI; then Amazon and BuzzFeed (often described as the American counterpart of Toutiao, or "Today's Headlines") announced that they would bring ChatGPT into their daily work, and Baidu announced that it would launch a "Chinese version" of a ChatGPT-style chatbot in March. With so many technology companies adding fuel to the fire, ChatGPT instantly attracted global attention.

Data show that the number of robots Amazon deploys is growing rapidly, by roughly 1,000 per day. In addition, Meta, Facebook's parent company, plans to invest an additional US$4 billion to US$5 billion in data centers in 2023, all of which is expected to go toward artificial intelligence. IBM CEO Arvind Krishna has said that artificial intelligence is expected to contribute $16 trillion to the global economy by 2030.

With ChatGPT's popularity, the tech giants may kick off a new round of fierce competition in artificial intelligence in 2023.

However, when ChatGPT (built on the GPT-3 family of models) predicts the next word, it must run many inference computations, so serving it takes up substantial resources and consumes considerable power. And although data center infrastructure has expanded to support the explosive growth of cloud computing, video streaming, and 5G networks, its GPU and CPU architectures cannot operate efficiently enough to meet the computing demand that is about to arrive, which poses a huge challenge for hyperscale data center operators.
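To make that next-word cost concrete, here is a minimal back-of-the-envelope sketch; the 2 × parameter-count FLOPs-per-token figure is a common approximation for dense transformer inference, not a number from the article.

```python
# A minimal sketch (not OpenAI's actual serving code) of why next-word
# prediction is compute-heavy: every generated token requires its own forward
# pass, and a common rule of thumb puts a dense transformer's forward-pass
# cost at roughly 2 * parameter_count FLOPs per token.

PARAMS = 175e9                  # GPT-3-class model: 175 billion parameters
FLOPS_PER_TOKEN = 2 * PARAMS    # ~3.5e11 FLOPs to predict one token

def reply_flops(output_tokens: int) -> float:
    """Approximate FLOPs needed to generate a reply of `output_tokens` tokens."""
    return output_tokens * FLOPS_PER_TOKEN

# A ~500-token answer already needs on the order of 1.75e14 FLOPs,
# repeated for every one of millions of daily queries.
print(f"{reply_flops(500):.2e} FLOPs for a 500-token reply")
```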

GPT-3.5 was trained on an AI computing system purpose-built by Microsoft: a high-performance networked cluster of 10,000 V100 GPUs, consuming a total of roughly 3640 PF-days of compute (that is, at one quadrillion floating-point operations per second, the job would take 3640 days to finish). A GPU cluster training task of this scale and duration places extreme demands on the performance, reliability, cost, and other aspects of the underlying network interconnect.
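A quick sanity check of the 3640 PF-days figure (the arithmetic below is mine, not the article's):

```python
# Unpacking "3640 PF-days": one PF-day is a machine sustaining 10^15
# floating-point operations per second for a full day.

PFLOPS = 1e15              # operations per second at 1 petaFLOP/s
SECONDS_PER_DAY = 86_400

total_flops = 3640 * PFLOPS * SECONDS_PER_DAY
print(f"Total training compute: {total_flops:.2e} FLOPs")   # ~3.1e23 FLOPs

# Spread across a 10,000-GPU cluster, the wall-clock time shrinks accordingly,
# but the interconnect then has to keep all of those GPUs fed with data.
```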

For example, Meta announced that it would suspend the expansion of data centers around the world and reconfigure these server farms to meet the data processing needs of artificial intelligence.

The demand artificial intelligence platforms place on data processing is enormous. OpenAI, the creator of ChatGPT, launched the platform last November, and it would not be able to keep running without riding on Microsoft's forthcoming upgrade of the Azure cloud platform.

The data center infrastructure that supports this digital transformation will be organized, like the human brain, into two hemispheres, or lobes, with one lobe needing to be much more powerful than the other. One hemisphere will serve what is called "training": the computing power needed to crunch up to 300 billion data points to create the word salad that ChatGPT generates.

The training lobe requires massive computing power and state-of-the-art GPU semiconductors, but it needs very little of the connectivity currently required in the data center clusters that support cloud computing services and 5G networks.

At the same time, the infrastructure focused on "training" each AI platform will create huge demand for electricity, requiring data centers to be sited near gigawatts of renewable energy, to install new liquid cooling systems, and to redesign their backup power and generator systems, among other new design features.

In the other hemisphere of the AI platform's brain sits the higher-functioning digital infrastructure known as "inference," which supports the interactive "generative" platforms: within seconds of a question or instruction being entered, the query is processed, run against the modeled database, and answered in convincing human syntax.

Today's hyper-connected data center networks, such as Northern Virginia's "Data Center Alley," the largest data center cluster in North America with the most extensive fiber optic network, can accommodate the lower-level connectivity that the AI brain's "inference" lobe requires. But these facilities will still need to be upgraded to deliver the enormous processing capacity required, and they will need to be located closer to substations.

In addition, data from research institutions show that data centers have become one of the world's largest energy consumers, with their share of total electricity consumption rising from 3% in 2017 to a projected 4.5% in 2025. Taking China as an example, the electricity consumed by data centers operating nationwide is expected to exceed 400 billion kWh in 2030, accounting for 4% of the country's total electricity consumption.
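As a quick sanity check on that China projection (my arithmetic, not the article's):

```python
# If data centers draw about 400 billion kWh in 2030 and that equals 4% of
# national electricity consumption, the implied national total is ~10 trillion kWh.

dc_use_billion_kwh = 400        # projected data-center consumption, billion kWh
share_of_total = 0.04           # 4% of national consumption

implied_total_billion_kwh = dc_use_billion_kwh / share_of_total
print(f"Implied national consumption in 2030: {implied_total_billion_kwh:,.0f} billion kWh")
```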

Digital products, in other words, consume energy both to develop and to run, and ChatGPT is no exception. Inference is estimated to account for 80-90% of the computing power consumed in machine learning workloads, and by a rough calculation, ChatGPT's carbon emissions have exceeded 814.61 tons since it went online on November 30, 2022.

According to calculations by professional organizations, assuming the ChatGPT service hosted on Microsoft's Azure cloud handles 1 million user queries per day (roughly 29,167 A100 GPU-hours per day under specific assumptions about response time and word count), and taking the A100 GPU's maximum power draw of 407 W (watts), daily carbon emissions reach 3.82 tons and monthly emissions exceed 100 tons. Today ChatGPT has more than 10 million daily users, so actual emissions are far higher than 100 tons per month. In addition, training such a large language model with 175 billion parameters requires tens of thousands of CPUs/GPUs to be fed data around the clock, consumes approximately 1287 MWh of electricity, and emits more than 552 tons of carbon dioxide.
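The arithmetic behind those serving figures can be reproduced roughly as follows; the grid emission factor of about 0.32 kg CO2 per kWh is an assumption chosen to line up with the quoted 3.82 tons per day, not a number given in the article.

```python
# Rough reconstruction of the daily-serving estimate quoted above.

GPU_HOURS_PER_DAY = 29_167            # A100-hours/day for ~1 million queries (as quoted)
A100_MAX_POWER_KW = 0.407             # 407 W peak power per A100
GRID_KG_CO2_PER_KWH = 0.32            # assumed emission factor (varies by region)

daily_energy_kwh = GPU_HOURS_PER_DAY * A100_MAX_POWER_KW        # ~11,871 kWh
daily_co2_tons = daily_energy_kwh * GRID_KG_CO2_PER_KWH / 1000  # ~3.8 tons

print(f"Daily energy:  {daily_energy_kwh / 1000:.1f} MWh")
print(f"Daily CO2:     {daily_co2_tons:.2f} tons")
print(f"Monthly CO2:   {daily_co2_tons * 30:.0f} tons")          # >100 tons

# The training figures quoted above (1287 MWh -> 552 tons) imply a comparable
# emission factor of roughly 0.43 kg CO2 per kWh:
print(f"Implied training factor: {552_000 / 1_287_000:.2f} kg CO2/kWh")
```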


Judging by the carbon emissions of these large language models, GPT-3, the predecessor of ChatGPT, has the highest. Americans are reported to emit an average of 16.4 tons of carbon dioxide per year and Danes an average of 11 tons per year, so training ChatGPT's underlying model emitted more carbon dioxide than 50 Danes emit in a year.
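The comparison works out as follows (simple division using the figures quoted in the article):

```python
# Dividing the ~552 tons of CO2 attributed to training by the per-capita
# annual emissions quoted above.

TRAINING_CO2_TONS = 552
PER_CAPITA_DENMARK = 11.0    # tons CO2 per Dane per year
PER_CAPITA_USA = 16.4        # tons CO2 per American per year

print(f"~{TRAINING_CO2_TONS / PER_CAPITA_DENMARK:.0f} Danes' annual emissions")     # ~50
print(f"~{TRAINING_CO2_TONS / PER_CAPITA_USA:.0f} Americans' annual emissions")     # ~34
```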

Cloud computing providers also recognize that data centers consume large amounts of electricity and have taken steps to improve efficiency, such as building and operating data centers in the Arctic to take advantage of renewable energy and natural cooling. However, this is not enough to keep pace with the explosive growth of AI applications.

Research at the Lawrence Berkeley National Laboratory in the United States found that efficiency improvements have kept the growth of data center energy consumption in check over the past 20 years, but the findings also suggest that current energy efficiency measures may not be enough to meet the needs of future data centers.

The artificial intelligence industry is now at a critical turning point. Technological advances in generative AI, image recognition, and data analytics are revealing unique connections and uses for machine learning, but a technology infrastructure capable of meeting this demand must be built first, because, according to Gartner, unless more sustainable options are adopted, AI will consume more energy than the human workforce by 2025.


Source: 51cto.com