


The future of data centers: the convergence of artificial intelligence and liquid cooling
The rapid rise of generative artificial intelligence (AI) highlights the breakneck pace at which businesses are adopting AI. According to a recent Accenture report, 98% of business leaders say AI will play an important role in their strategies over the next three to five years, and McKinsey analysts find that nearly 65% of enterprises plan to increase their AI investments over the next three years.
NVIDIA, AMD, and Intel are launching new chips designed for generative AI and high-performance computing (HPC), and this momentum has only just begun; public cloud providers and emerging chip companies are competing as well. IDC analysts predict that global spending on AI software, hardware, and services will reach $300 billion, up from the $154 billion expected this year.
However, scaling AI still presents challenges, the most important of which involve the data center infrastructure required to support these workloads.
Data centers are running hotter and hotter
GPUs, the chips most commonly used for AI and machine learning, accelerate the computation behind AI applications. NVIDIA's H100 GPU, for example, packs 80 billion transistors, generates a great deal of heat, and requires efficient cooling. Traditionally, a single data center rack drawing 10 kilowatts was considered high density, and air cooling remained an effective way to cool such servers. The Uptime Institute found that few data centers yet have racks exceeding 30 kilowatts, but extreme densities are emerging: the commoditization of high-performance computing and the rise of generative AI are driving up power demands and overtaxing traditional air cooling methods.
NVIDIA's latest GPU, for example, has a maximum power consumption 160% higher than the previous generation's. Rack configurations can easily exceed 40 kW, which is difficult to manage with traditional air cooling. Today's data centers must continue to evolve to manage these increased heat loads effectively.
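To put such loads in perspective, a back-of-the-envelope estimate shows how much airflow a 40 kW rack would demand from air cooling alone. This is a rough sketch with assumed values (a 10 K allowable air temperature rise), not a facility design calculation.

```python
# Rough estimate of the airflow required to remove a rack's heat with air alone.
# Q = m_dot * cp * dT  ->  m_dot = Q / (cp * dT)

RACK_LOAD_W = 40_000   # illustrative 40 kW rack (assumption)
CP_AIR = 1005.0        # specific heat of air, J/(kg*K)
RHO_AIR = 1.2          # air density, kg/m^3 (approx., sea level)
DELTA_T = 10.0         # allowed inlet-to-outlet air temperature rise, K (assumption)

mass_flow = RACK_LOAD_W / (CP_AIR * DELTA_T)   # kg/s of air
volume_flow = mass_flow / RHO_AIR              # m^3/s
cfm = volume_flow * 2118.88                    # cubic feet per minute

print(f"mass flow:   {mass_flow:.2f} kg/s")
print(f"volume flow: {volume_flow:.2f} m^3/s (~{cfm:.0f} CFM)")
```

On these assumptions the rack needs roughly 7,000 CFM of air, far beyond what typical rack fans and raised-floor plenums comfortably deliver, which is why densities in the 40 kW range push operators toward liquid.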
Cooling Technologies Are Increasingly Important
Fortunately, a variety of liquid cooling technologies can meet this challenge. Rear-door heat exchangers and direct-to-chip cooling are becoming increasingly popular, and several types of immersion cooling, which essentially involves submerging IT components in a tank of liquid coolant, are emerging. Although immersion cooling is still in its early adoption stages, analysts predict the technology will become mainstream within the next four years, with the market growing from US$251 million in 2021 to more than US$1.6 billion in 2027. This will significantly affect data center infrastructure needs, and business leaders must know whether their data center operators are willing to make the necessary near-term investments to support this shift.
Advantages and Disadvantages of Liquid Cooling
Liquids move heat as much as 1,000 times more efficiently than air and require less supporting infrastructure. Air cooling systems depend on complex refrigeration equipment, including chillers, air handlers, humidity control and filtration systems, and redundant backups that ensure servers do not lose cooling during a power outage.
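The exact multiplier depends on which thermal property is compared. As a rough illustration, the sketch below uses textbook room-temperature values for water and air to compare volumetric heat capacity, that is, how much heat a given volume of coolant can carry per degree of temperature rise.

```python
# Compare how much heat a cubic meter of water vs. air absorbs per kelvin.
# Volumetric heat capacity = density * specific heat.

RHO_WATER, CP_WATER = 1000.0, 4186.0   # kg/m^3, J/(kg*K)
RHO_AIR, CP_AIR = 1.2, 1005.0          # kg/m^3, J/(kg*K), room temperature

vhc_water = RHO_WATER * CP_WATER       # ~4.19 MJ/(m^3*K)
vhc_air = RHO_AIR * CP_AIR             # ~1.2 kJ/(m^3*K)

print(f"water: {vhc_water:.3g} J/(m^3*K)")
print(f"air:   {vhc_air:.3g} J/(m^3*K)")
print(f"ratio: {vhc_water / vhc_air:.0f}x")   # on the order of thousands
```

By this measure the advantage is several thousandfold; other bases of comparison, such as thermal conductivity or practical system-level efficiency, yield smaller multipliers, which is why round figures like 1,000x are commonly quoted.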
In contrast, liquid cooling systems are relatively simple, but implementing them in existing data center infrastructure can present significant challenges, starting with upfront investment and complexity. Setting up a liquid cooling system can be complicated and may require specialized maintenance. Server designs may need to change, an immersion approach may void OEM warranties, and coolant leaks can cause equipment damage and downtime. Data center operators must also account for new regulations and environmental standards that apply to liquid cooling systems.
That said, liquid and immersion cooling systems do not require as much backup capacity or the special floor and aisle containment strategies that air cooling demands, and the overall impact on energy consumption and cost can be significant. A recent study found that implementing liquid cooling can reduce facility power by nearly 20% and total data center power by more than 10%. Total Usage Effectiveness (TUE), a newer metric designed to compare the efficiency of liquid and air cooling in high-performance computing environments, shows liquid cooling improving energy efficiency by more than 15%.
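The arithmetic behind these percentages can be sketched with the familiar PUE metric (Power Usage Effectiveness: total facility power divided by IT power). The baseline figures below are hypothetical, chosen only to show how a roughly 20% cut in facility overhead flows through to total power.

```python
# Hypothetical illustration of how a liquid-cooling retrofit changes PUE.
# PUE = total facility power / IT power.

IT_POWER_KW = 1000.0       # IT load (assumption)
OVERHEAD_KW_AIR = 500.0    # cooling + other facility overhead, air-cooled (assumption)

pue_air = (IT_POWER_KW + OVERHEAD_KW_AIR) / IT_POWER_KW

# The study cited above reports facility (overhead) power falling ~20%.
OVERHEAD_KW_LIQUID = OVERHEAD_KW_AIR * 0.80
pue_liquid = (IT_POWER_KW + OVERHEAD_KW_LIQUID) / IT_POWER_KW

total_saving = 1 - (IT_POWER_KW + OVERHEAD_KW_LIQUID) / (IT_POWER_KW + OVERHEAD_KW_AIR)

print(f"PUE (air):    {pue_air:.2f}")     # 1.50
print(f"PUE (liquid): {pue_liquid:.2f}")  # 1.40
print(f"total power saved: {total_saving:.1%}")
```

Note that PUE treats server-internal fan power as IT load, so it understates liquid cooling's benefit; TUE is generally defined as ITUE x PUE, where ITUE also accounts for power consumed inside the server before it reaches the compute silicon, which is why liquid cooling's advantage looks larger under TUE.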
Transitioning to liquid cooling brings other sustainability benefits as well. Liquid cooling systems consume less water than air cooling systems, retrofitted data centers can shrink their physical and carbon footprints, and thermal reuse strategies can supply recovered heat to surrounding businesses and communities. The possibilities are exciting and could prove as transformative as generative AI itself.
What to Know Now
For most enterprises, transitioning to an on-premises data center may be too complex and expensive. On the other hand, much of today's public cloud infrastructure was not built to run large-scale AI applications, and the rising cost of hosting high-volume workloads in the cloud is prompting many organizations to look for other options.
Given these challenges and opportunities, colocation data center providers with infrastructure experience handling myriad customer use cases may provide the best solution for many enterprises. Leaders in this space can provide expertise and support to guide organizations through their transformation. We have also developed key relationships with a number of hardware OEMs and liquid cooling suppliers that will drive data center growth, providing diverse options to meet our customers' unique needs.
Organizations now need to know whether their data center operators are already planning for this shift and, perhaps more importantly, whether they have the physical capacity and technology needed to enable next-generation data centers. Data centers already face the complex challenge of moving workloads to the servers best suited to their requirements; as the demands of AI and HPC workloads continue to grow, those obstacles will be compounded by the additional challenge of adding fundamentally different cooling systems.
Data center operators that are investing in these strategies today will be well positioned to help their customers address these challenges proactively. Artificial intelligence is changing everything, including data centers. Now is the time to start the conversation.