


Global IT giants are using AI to help data centers save energy and reduce emissions.
Data centers power the applications, websites and services used by billions of people around the world every day, and they can be dangerous places for the workers who build and maintain them. Workers must sometimes service electrical equipment while a data center's power is still on. They may also be exposed to chemicals such as chlorine, which is used as a disinfectant in the water circulated through the liquid cooling systems of computers and servers. In June 2015, five people had to be taken to the hospital after a chlorine leak at an Apple data center in Maiden, North Carolina.
Data centers are now safer than ever. But in search of forward-looking solutions, some tech giants say they are exploring how to apply artificial intelligence to prevent safety issues. For example, Microsoft is developing an artificial intelligence system that analyzes data from various sources and generates alerts for data center construction and operations teams to "prevent or mitigate the impact of safety incidents." A separate but related system, also in development, attempts to detect and predict impacts on data center construction schedules.
"These initiatives are in early testing and are expected to begin scaling to our production environments later this year," a Microsoft spokesperson told the outlet via email.
Meta also claims to be studying how artificial intelligence can predict how its data centers will operate under "extreme environmental conditions" that could lead to unsafe working conditions. The company said it has been developing physical models to simulate extreme conditions and feeding that data into AI models responsible for optimizing server power consumption, cooling and airflow.
A Meta spokesperson told the media: "Our data centers hold a large amount of operational data, and in some areas there are high-frequency sensors built into servers, racks and data halls. Every server and network device, taking on different workloads, will consume different amounts of power, generate different amounts of heat, and create different amounts of airflow in the data center. Our [infrastructure] team collects all the data from each server and then develops AI models that can allocate our servers and racks in data centers and send workloads to those servers to optimize [for] performance and efficiency."
Of course, beyond safety, businesses have other motivations to keep their data centers in top condition. Power outages are expensive, and they are becoming more frequent. One-third of data center owners and operators admitted to experiencing a major outage in the past 12 months, according to a 2020 survey by the IT consulting firm Uptime Institute. One in six respondents said their outages cost more than $1 million, up from one in 10 in 2019.
Meta, which operates more than 20 data centers around the world, including new projects in Texas and Missouri, said it will build 50 to 100 new data centers per year for the foreseeable future.
AI also promises cost savings by uncovering energy-saving opportunities in data centers that would otherwise go unnoticed, another attractive aspect for businesses. In 2018, Google claimed that artificial intelligence systems developed by its DeepMind affiliate delivered an average of 30% energy savings compared to the historical energy use of its data centers.
When contacted for comment, DeepMind said it had no updates to share beyond the initial announcement. IBM and Amazon did not respond to inquiries. But both Meta and Microsoft say they are now using AI to make similar power adjustments.
Microsoft launched an artificial intelligence "anomaly detection approach" in late 2021 that uses telemetry data from electrical and mechanical equipment to measure and mitigate abnormal power and water events within data centers. The company also uses AI-based methods to identify and fix issues with data center electrical meters and determine ideal locations to place servers to minimize wasted power, network and cooling capacity.
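Microsoft has not published implementation details, but the general pattern behind telemetry-based anomaly detection is straightforward: compare each new reading against a rolling baseline and flag large deviations. The sketch below illustrates that idea on a hypothetical stream of power readings; the window size, z-score threshold and class name are illustrative assumptions, not Microsoft's actual method.

```python
# A minimal sketch of telemetry-based anomaly detection using a rolling
# z-score. The window size, threshold and sample values are illustrative
# assumptions, not details of Microsoft's production system.
from collections import deque
from statistics import mean, stdev

class PowerAnomalyDetector:
    """Flags power readings that deviate sharply from recent history."""

    def __init__(self, window=60, threshold=3.0):
        self.readings = deque(maxlen=window)  # rolling baseline of recent readings
        self.threshold = threshold            # z-score above which we alert

    def observe(self, kilowatts):
        """Record a reading and return True if it looks anomalous."""
        is_anomaly = False
        if len(self.readings) >= 2:
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(kilowatts - mu) / sigma > self.threshold:
                is_anomaly = True
        self.readings.append(kilowatts)
        return is_anomaly

detector = PowerAnomalyDetector()
for kw in [410.0, 412.5, 409.8, 411.2, 650.0]:  # synthetic telemetry stream
    if detector.observe(kw):
        print(f"ALERT: abnormal power reading of {kw} kW")
```

A production system would draw on many correlated signals (voltage, current, water flow rates) and more robust statistics, but the alerting structure is the same: baseline, deviation, alert.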
Meta says it has been using reinforcement learning to reduce the amount of air pumped into its data centers for cooling. At a high level, reinforcement learning is a class of AI techniques in which a system learns to solve a problem through trial and error, receiving rewards for decisions that improve an objective. Most data centers use outdoor air and evaporative cooling systems, so optimizing airflow is a top priority.
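Meta has not disclosed how its models are built, so purely as a sketch of the underlying technique: the toy Q-learning loop below learns a fan-speed policy that trades cooling energy against temperature in a made-up simulator. The states, actions, reward shape and environment dynamics are all illustrative assumptions, not Meta's system.

```python
# A toy Q-learning loop for airflow control. The states (discretized inlet
# temperature), actions (fan speeds), reward and environment dynamics are
# all illustrative assumptions, not Meta's system.
import random

FAN_SPEEDS = [0.4, 0.6, 0.8, 1.0]    # actions: fraction of maximum airflow
TEMPS = list(range(18, 31))          # states: inlet temperature in Celsius

def step(temp, speed):
    """Toy environment: more airflow cools better but costs more energy."""
    heat_in = random.randint(1, 3)           # variable server heat load
    cooling = round(speed * 4)               # cooling effect of the fans
    next_temp = min(max(temp + heat_in - cooling, TEMPS[0]), TEMPS[-1])
    energy_cost = speed ** 3                 # fan power grows roughly cubically
    overheat_penalty = 5.0 if next_temp > 27 else 0.0
    return next_temp, -(energy_cost + overheat_penalty)

Q = {(t, a): 0.0 for t in TEMPS for a in range(len(FAN_SPEEDS))}
alpha, gamma, epsilon = 0.1, 0.9, 0.1    # learning rate, discount, exploration

temp = 24
for _ in range(50_000):
    if random.random() < epsilon:        # occasionally explore a random action
        action = random.randrange(len(FAN_SPEEDS))
    else:                                # otherwise exploit the best-known one
        action = max(range(len(FAN_SPEEDS)), key=lambda a: Q[(temp, a)])
    next_temp, reward = step(temp, FAN_SPEEDS[action])
    best_next = max(Q[(next_temp, a)] for a in range(len(FAN_SPEEDS)))
    Q[(temp, action)] += alpha * (reward + gamma * best_next - Q[(temp, action)])
    temp = next_temp

for t in (22, 26, 30):
    best = FAN_SPEEDS[max(range(len(FAN_SPEEDS)), key=lambda a: Q[(t, a)])]
    print(f"At {t} C the learned policy runs fans at {best:.0%} of max airflow")
```

In production, the same trial-and-error structure would typically use a physics simulation or the live data hall as the environment, and a neural network rather than a lookup table as the policy.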
A smaller environmental footprint is an added benefit of these energy-regulating AI systems. According to a report by the Environmental Investigation Agency, data centers consumed approximately 1% of global electricity demand and contributed 0.3% of all carbon dioxide emissions in 2020. A typical data center uses 3 million to 5 million gallons of water per day, equivalent to the water consumption of a city of 30,000 to 50,000 people.
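That equivalence implies roughly 100 gallons per person per day, in line with common estimates of US residential water use, as a quick back-of-the-envelope check shows:

```python
# Back-of-the-envelope check of the water-use equivalence quoted above.
for gallons, people in [(3_000_000, 30_000), (5_000_000, 50_000)]:
    print(f"{gallons:,} gal/day over {people:,} people "
          f"= {gallons / people:.0f} gal per person per day")
```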
Microsoft has previously said it plans to run all of its data centers on 100% renewable energy by 2025. Meta claims to have achieved this feat in 2020.