# March 10 news: AI has once again become a hot topic in the technology industry, and it is expected to revolutionize industries worth trillions of dollars, from retail to medicine. But creating each new chatbot or image generator requires a huge amount of electricity, which means the technology could release large amounts of greenhouse gases, exacerbating global warming.
Microsoft, Google and ChatGPT maker OpenAI all use cloud computing, which relies on thousands of chips in massive data center servers around the world to train AI algorithms called models, analyze data, and help these algorithms "learn" how to perform tasks. The success of ChatGPT has prompted other companies to race to launch their own AI systems and chatbots, or to develop products that use large AI models.
AI uses more energy than other forms of computing: training a single model can consume more electricity than 100 U.S. households use in a year. Yet although the AI industry is growing very fast, it is not transparent enough for anyone to know exactly the total power consumption and carbon emissions of AI. Carbon emissions can also vary widely, depending on the type of power plant supplying the electricity. Data centers powered by coal or natural gas will have significantly higher carbon emissions than those powered by solar or wind power.
While researchers have tallied the carbon emissions from creating individual models, and some companies have provided data on their energy use, no one has produced an overall estimate of the technology's total electricity usage. Sasha Luccioni, a researcher at AI company Hugging Face, wrote a paper quantifying the carbon emissions of her company's model BLOOM, a competitor to OpenAI's GPT-3. Luccioni has also attempted to estimate the carbon emissions of OpenAI's chatbot ChatGPT based on a limited set of public data.
Improving Transparency
Researchers such as Luccioni say there is a need for greater transparency about the power use and emissions of AI models. Armed with this information, governments and companies could decide whether it is worth using GPT-3 or other large models to study cancer treatments or preserve indigenous languages.
Greater transparency may also bring more scrutiny, and the cryptocurrency industry may provide lessons from the past. Bitcoin has been criticized for its excessive power consumption, consuming as much electricity annually as Argentina, according to the Cambridge Bitcoin Electricity Consumption Index. This voracious demand for electricity prompted New York state to pass a two-year moratorium on issuing licenses to cryptocurrency miners powered by fossil fuel-generated electricity.
GPT-3 is a general-purpose AI model whose single function, generating language, supports many different uses. A research paper published in 2021 showed that training GPT-3 consumed 1.287 gigawatt hours of electricity, approximately equivalent to the annual electricity consumption of 120 American households. The training also produced 502 tons of carbon, equivalent to the annual emissions of 110 American cars. And that training covers just one program, or "model."
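The household comparison above can be checked with simple arithmetic. The sketch below assumes an average US household uses about 10.7 MWh of electricity per year, a figure consistent with the "120 households" comparison in the article rather than a sourced statistic:

```python
# Back-of-envelope check of the GPT-3 training energy figure cited above.
# Assumption: an average US household uses ~10.7 MWh/year (illustrative,
# chosen to be consistent with the article's "120 households" comparison).

TRAINING_ENERGY_GWH = 1.287      # GPT-3 training energy, per the 2021 paper
HOUSEHOLD_MWH_PER_YEAR = 10.7    # assumed US household average

training_mwh = TRAINING_ENERGY_GWH * 1000            # GWh -> MWh
household_years = training_mwh / HOUSEHOLD_MWH_PER_YEAR
print(f"{household_years:.0f} household-years")      # ~120
```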
Although the upfront electricity cost of training an AI model is huge, researchers have found that in some cases it accounts for only about 40% of the electricity the model consumes over its lifetime of actual use. In addition, AI models keep getting larger: OpenAI's GPT-3 uses 175 billion parameters, or variables, while its predecessor used only 1.5 billion.
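To illustrate what that 40% share implies, the sketch below combines the two figures from the article (the 1.287 GWh training cost and the 40% lifetime share). This is a hedged illustration of the arithmetic, not a published lifetime estimate for GPT-3:

```python
# If training is ~40% of a model's lifetime electricity use (per the
# researchers cited above), inference accounts for the remaining ~60%.
# Using the 1.287 GWh GPT-3 training figure purely as an illustration:

TRAINING_SHARE = 0.40
training_gwh = 1.287

lifetime_gwh = training_gwh / TRAINING_SHARE       # total over model lifetime
inference_gwh = lifetime_gwh - training_gwh        # everything after training
print(f"lifetime ~= {lifetime_gwh:.2f} GWh, inference ~= {inference_gwh:.2f} GWh")
```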
OpenAI is already working on GPT-4, and models must be retrained regularly to maintain their understanding of current events. Emma Strubell, a professor at Carnegie Mellon University and one of the first researchers to study AI's energy problem, said: "If you don't retrain the model, it may not even know what COVID-19 is."
Another relative measure comes from Google, where researchers found that AI training accounts for 10% to 15% of the company's total electricity use, which in 2021 was 18.3 terawatt hours. This means that Google’s AI consumes 2.3 terawatt hours of electricity per year, which is roughly equivalent to the annual electricity consumption of all households in Atlanta.
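The Google estimate follows directly from the two numbers quoted: 10% to 15% of an 18.3 TWh total. The sketch below shows the range and the midpoint that yields the article's 2.3 TWh figure:

```python
# Reproducing the Google AI electricity estimate quoted above:
# AI is 10-15% of Google's 2021 total of 18.3 TWh.

GOOGLE_TOTAL_TWH_2021 = 18.3
AI_SHARE_LOW, AI_SHARE_HIGH = 0.10, 0.15

low = GOOGLE_TOTAL_TWH_2021 * AI_SHARE_LOW      # 1.83 TWh
high = GOOGLE_TOTAL_TWH_2021 * AI_SHARE_HIGH    # 2.745 TWh
midpoint = (low + high) / 2                     # ~2.3 TWh, as cited
print(f"range: {low:.2f}-{high:.2f} TWh, midpoint ~= {midpoint:.1f} TWh")
```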
Tech giants make net-zero pledges
While AI models are in many cases getting bigger, AI companies are also constantly improving them to run more efficiently. The largest U.S. cloud computing companies, including Microsoft, Google and Amazon, have all made carbon reduction or net-zero commitments. Google said in a statement that it will achieve net-zero emissions across all operations by 2030, with the goal of running its offices and data centers entirely on carbon-free energy. Google is also using AI to improve the energy efficiency of its data centers, with the technology directly controlling cooling systems in its facilities.
OpenAI also cited the work the company has done to improve the efficiency of the ChatGPT application programming interface, helping customers reduce power usage and costs. An OpenAI spokesperson said: "We take our responsibility to stop and reverse climate change very seriously, and we put a lot of thought into how to make the most of our computing power. OpenAI runs on Azure, and we work closely with Microsoft's teams to increase the efficiency of running large language models and reduce carbon emissions."
Microsoft noted that it is purchasing renewable energy and taking other steps toward its previously announced goal of achieving net-zero emissions by 2030. Microsoft said in a statement: "As part of our commitment to creating a more sustainable future, Microsoft is investing in research to measure the energy use and carbon emissions impact of AI, while working to improve the efficiency of large-scale systems in training and application." Roy Schwartz, a professor at the Hebrew University of Jerusalem, worked with a team at Microsoft to measure the carbon footprint of a large AI model. He said: "Obviously, these companies are unwilling to disclose what model they are using and how much carbon it emits."
There are ways to make AI run more efficiently. Ben Hertz-Shargel of energy consultancy Wood Mackenzie said that because AI training can be done at any time, developers and data centers can schedule it for times when electricity is cheaper or in surplus, making their operations more environmentally friendly. If AI companies train their models during periods of excess power, they can then use that as a selling point to show they are environmentally conscious.
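The scheduling idea Hertz-Shargel describes can be sketched as a simple search over a grid carbon-intensity forecast: pick the contiguous window with the lowest average intensity and run the training job then. The forecast values below are hypothetical illustration data, not real grid figures:

```python
# Minimal sketch of carbon-aware scheduling: given a forecast of grid
# carbon intensity (gCO2/kWh) per hour, find the contiguous window with
# the lowest average intensity in which to run a training job.

def best_window(intensity, hours_needed):
    """Return (start_hour, avg_intensity) of the lowest-carbon window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(intensity) - hours_needed + 1):
        avg = sum(intensity[start:start + hours_needed]) / hours_needed
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# Hypothetical hourly forecast: intensity dips overnight, rises at peak.
forecast = [450, 430, 400, 320, 250, 240, 260, 380, 470, 500]
start, avg = best_window(forecast, 3)
print(f"start at hour {start}, avg {avg:.0f} gCO2/kWh")  # hour 4, ~250
```

Real systems use the same principle with live grid data; the point is that a flexible workload like training can chase cheap, low-carbon hours rather than running at peak.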
Chips consume a staggering amount of power
Most data centers use graphics processing units (GPUs) to train AI models, and these are among the most power-hungry components the chip industry makes. A report released by Morgan Stanley analysts earlier this month said that large models require tens of thousands of GPUs, with training cycles ranging from weeks to months.
One of the bigger mysteries in AI is the total carbon footprint associated with the chips used. Nvidia, the largest GPU maker, says that when it comes to AI tasks, its chips can complete tasks faster and be more efficient overall.
Nvidia said in a statement: "Using GPUs to accelerate AI is faster and more efficient than using CPUs. For some AI workloads, energy efficiency can often be improved by 20 times. For the generative large language models that are essential to modern artificial intelligence, energy efficiency can be improved by 300 times."
Luccioni said that although Nvidia has disclosed data on its direct and indirect energy-related emissions, the company has not disclosed further details. She believes that when Nvidia does share this information, we may find that GPUs consume about as much power as a small country, "which may drive people crazy." (Xiao Xiao)