
The growing demand for global AI inference power consumption can be met by adding two new nuclear power plants

王林
Release: 2023-11-14 19:37:14

IT House News, November 14 — Sergey Edunov, Meta's Director of Engineering for Generative AI, recently shared his forecast for AI inference demand at the Silicon Valley Digital Workers Forum. He believes that, if language models of reasonable size are used, the electricity output of just two new nuclear power plants would be enough to cover the world's new demand for AI inference next year.


Image source: Pexels

AI inference refers to using already-trained models to perform tasks in real-world scenarios, such as generating text, answering questions, or recognizing images. Edunov said he used a simple back-of-the-envelope calculation to estimate how much electricity global inference demand will consume next year. He assumes that 1 to 2 million new Nvidia H100 GPUs will come online worldwide next year, each drawing about 1 kilowatt. If every GPU runs 24 hours a day, the fleet could produce roughly 100,000 tokens per day for every person on the planet. He argues that this level of power consumption is reasonable at a human scale: the world would need only two new nuclear power plants to supply enough electricity.
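To make the arithmetic concrete, here is a minimal sketch of that back-of-the-envelope estimate. The GPU count, per-GPU power, and tokens-per-person figures come from the article; the per-plant output (~1 GW) and world population (~8 billion) are assumptions added for this illustration.

```python
# Back-of-the-envelope check of Edunov's estimate.
# GPU count, per-GPU power, and tokens-per-person come from the article;
# per-plant output and world population are assumptions added here.

H100_COUNT = 2_000_000               # upper end of the 1-2 million new H100s
POWER_PER_GPU_KW = 1                 # ~1 kW per H100 (article's figure)
PLANT_OUTPUT_GW = 1.0                # assumed output of one nuclear plant (~1 GW)
WORLD_POPULATION = 8_000_000_000     # assumption: ~8 billion people
TOKENS_PER_PERSON_PER_DAY = 100_000  # article's figure

total_power_gw = H100_COUNT * POWER_PER_GPU_KW / 1_000_000
plants_needed = total_power_gw / PLANT_OUTPUT_GW
print(f"Total GPU power: {total_power_gw:.1f} GW -> ~{plants_needed:.0f} nuclear plants")

# Implied per-GPU throughput if that fleet is to serve 100,000 tokens/person/day.
tokens_per_day = WORLD_POPULATION * TOKENS_PER_PERSON_PER_DAY
tokens_per_gpu_per_sec = tokens_per_day / H100_COUNT / 86_400
print(f"Implied throughput: ~{tokens_per_gpu_per_sec:,.0f} tokens/s per GPU")
```

Under these assumptions the fleet draws about 2 GW, i.e. roughly two nuclear plants, and each GPU would need to sustain on the order of a few thousand tokens per second to hit the 100,000-tokens-per-person figure.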

However, IT House noted that Edunov also pointed out several challenges and limitations facing AI development. One is data volume: training today's models already requires enormous amounts of data, and publicly available Internet data is no longer enough to support training the next generation of models. Those models may need roughly ten times as much data, which means more domain-specific data or more multimodal data such as video and audio. Another challenge is the supply chain: with global chip production capacity tight, the pace at which AI models improve will also be constrained. Researchers are therefore working to make models more efficient and less dependent on data and hardware. For example, Salesforce has developed a technique called BLIP-2 that can automatically adjust model size, dynamically shrinking or expanding the model according to the task and the available resources.
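The idea of shrinking or expanding a model to match the task can be pictured as routing each request to a differently sized model based on a resource budget. The sketch below is a hypothetical illustration of that general idea only; the model names, sizes, and the pick_model helper are invented for the example and do not reflect BLIP-2's actual design or API.

```python
# Hypothetical sketch: route each request to a smaller or larger model
# depending on task complexity and available GPU memory.
# Model names and sizes are illustrative, not real checkpoints.

MODELS = [
    {"name": "tiny-1b",   "params_b": 1,  "min_gpu_gb": 8},
    {"name": "mid-7b",    "params_b": 7,  "min_gpu_gb": 24},
    {"name": "large-70b", "params_b": 70, "min_gpu_gb": 160},
]

def pick_model(task_complexity: float, free_gpu_gb: float) -> str:
    """Pick a model that fits the GPU budget and matches the task's needs.

    task_complexity is a score in [0, 1]; higher means the task likely
    benefits from a bigger model.
    """
    # Keep only models that fit in the available GPU memory.
    affordable = [m for m in MODELS if m["min_gpu_gb"] <= free_gpu_gb]
    if not affordable:
        raise RuntimeError("No model fits the available GPU memory")
    # Simple policy: easy tasks get the smallest affordable model,
    # hard tasks the largest affordable one.
    index = min(int(task_complexity * len(affordable)), len(affordable) - 1)
    return affordable[index]["name"]

print(pick_model(task_complexity=0.2, free_gpu_gb=24))   # -> "tiny-1b"
print(pick_model(task_complexity=0.9, free_gpu_gb=200))  # -> "large-70b"
```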

According to the prevailing view among industry experts, language models will deliver enormous value to enterprises over the next two years. Edunov predicts that within three to four years we will know whether current technology is capable of achieving artificial general intelligence.


Source: sohu.com