


Bridging the Gap: Transforming the Data Center for the Artificial Intelligence Era
Modern data centers of every size need to rethink power management and backup strategies; doing so is a vital part of any innovation strategy.
The artificial intelligence era will upend the data center status quo. Enterprises of every type are actively exploring how to use generative AI, and that requires more advanced, more secure, and more efficient data center facilities.
Hyperscalers are the ideal adopters of the modern data center. They have the resources and capabilities to pursue new opportunities and to build new infrastructure with the most advanced technologies.
Still, other businesses do not have to settle for less. Smaller data centers can be transformed for the AI era by incorporating the right technology to make the most of their real estate. That requires careful consideration of the computing infrastructure that powers AI applications, new approaches to rack configuration, cooling technologies, and data storage.
It also means thinking strategically about backup power systems so that brownfield retrofits end up with a balanced power strategy. Every data center needs backup power, but existing power equipment is likely taking up space without adding a penny of revenue. Newer technologies such as nickel-zinc (NiZn) batteries deliver higher-density backup power, potentially increasing backup capacity while freeing valuable floor space for revenue-generating equipment.
Centralized or Distributed Backup Power Supply
To understand the scale of the change, consider the numbers: McKinsey predicts that data center demand will grow by about 10% annually through 2030, at which point demand from the U.S. market alone will reach 35 GW.
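As a quick sanity check on that projection, the compound-growth arithmetic is simple. The sketch below takes only the 10% growth rate and the 35 GW end point from the figures above; the roughly 17 GW 2022 baseline is an illustrative assumption, not a number from this article.

```python
# A minimal compound-growth sketch. The 10% annual growth rate and the 35 GW
# 2030 figure come from the article; the ~17 GW 2022 baseline is an assumed,
# illustrative starting point.

def project_demand(baseline_gw: float, annual_growth: float, years: int) -> float:
    """Compound a baseline demand figure forward at a fixed annual growth rate."""
    return baseline_gw * (1 + annual_growth) ** years

demand_2030 = project_demand(baseline_gw=17.0, annual_growth=0.10, years=8)
print(f"Projected U.S. demand in 2030: {demand_2030:.1f} GW")  # roughly 36 GW
```

Under those assumptions the projection lands in the mid-30s of gigawatts, consistent with the 35 GW figure cited above.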
Already, customer demand is outstripping the capacity data centers can deliver. For large enterprises building new facilities or upgrading existing ones, increasing density is one answer: more computing power per square foot. So it is no surprise that even the major cloud providers are starting to scrutinize how much space backup power systems consume.
Data centers are normally equipped with centralized uninterruptible power supply (UPS) backup systems. At large scale, however, operators are increasingly turning to distributed backup, such as server-rack battery backup units (BBUs).
Nonprofit organizations such as the Open Compute Project are pushing new standards built around distributed backup power. While this approach offers several advantages at hyperscale, it is rarely the best choice for colocation facilities or enterprises: a colocation provider must accommodate a wide variety of tenant configurations, which makes distributed backup impractical, and for enterprise-class workloads a decentralized approach may simply be overkill.
Some servers also carry internal backup power so they can shut down gracefully in the event of an outage.
These backup systems may or may not complement one another. The key is finding the right mix so that power-hungry AI workloads can keep running; a rough sizing sketch follows below. Many modern retrofits rely on modular infrastructure, which lets existing facilities add the equipment they need iteratively and within limited space.
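To make "the right mix" concrete, here is a minimal, hypothetical sketch that checks whether each backup tier can carry an assumed critical load long enough to bridge a generator start-up window. Every number in it (the load, the tier capacities, the start-up time) is an illustrative assumption, not a figure from this article.

```python
from dataclasses import dataclass

# Hypothetical sizing check: can each backup tier carry the critical load
# long enough to cover an assumed generator start-up window? All figures
# below are illustrative assumptions.

@dataclass
class BackupTier:
    name: str
    power_kw: float    # maximum power the tier can deliver
    energy_kwh: float  # usable stored energy

    def runtime_minutes(self, load_kw: float) -> float:
        """Runtime at a given load, limited by both power and energy."""
        if load_kw > self.power_kw:
            return 0.0  # this tier cannot carry the load on its own
        return self.energy_kwh / load_kw * 60.0

critical_load_kw = 800.0   # assumed AI cluster critical load
generator_start_min = 2.0  # assumed time for generators to accept load

tiers = [
    BackupTier("centralized UPS", power_kw=1000.0, energy_kwh=100.0),
    BackupTier("rack-level BBUs", power_kw=900.0, energy_kwh=45.0),
]

for tier in tiers:
    runtime = tier.runtime_minutes(critical_load_kw)
    verdict = "covers" if runtime >= generator_start_min else "does NOT cover"
    print(f"{tier.name}: {runtime:.1f} min at {critical_load_kw:.0f} kW "
          f"({verdict} a {generator_start_min:.0f}-minute generator start)")
```

A real facility would also have to model transfer switching, redundancy, and concurrent maintenance; the point here is only that runtime is bounded by both power and energy, which is why the mix of tiers matters.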
Stay away from lead acid
Unfortunately, the lead-acid batteries that have powered data centers for decades are inefficient and take up valuable space. They also have a limited operating temperature range and require more space for cooling technology.
Lead-acid batteries are relatively cheap up front, but more modern battery technology is worth the investment. Lithium-ion batteries have been in the data center market for less than a decade, yet they have already captured a sizable share of new construction. They are more efficient, take up less floor space, and do not need to be replaced as frequently as lead-acid batteries.
Nickel-zinc battery chemistry is not volatile in the way lead-acid and lithium-ion chemistries are. It is not subject to thermal runaway and operates over a wider temperature range than competing battery chemistries. Lithium-ion batteries offer high energy density, whereas nickel-zinc batteries offer high power density, meaning they can discharge power at a higher rate. In a backup scenario, where the only goal is to ride through five to 15 minutes or less, you want a compact battery that can discharge a lot of power quickly.
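The following sketch illustrates why power density, not energy density, tends to set the footprint for such short bridge times. The cabinet ratings and load are hypothetical round numbers chosen only to show the calculation; they are not specifications of any particular product.

```python
import math

# Hypothetical comparison: how many battery cabinets a short UPS bridge needs
# when the cabinet is power-dense versus energy-dense. All figures are
# illustrative assumptions, not vendor specifications.

def cabinets_needed(load_kw: float, runtime_min: float,
                    cabinet_power_kw: float, cabinet_energy_kwh: float) -> int:
    """Cabinets required so that both the power and the energy demands are met."""
    energy_needed_kwh = load_kw * runtime_min / 60.0
    by_power = math.ceil(load_kw / cabinet_power_kw)
    by_energy = math.ceil(energy_needed_kwh / cabinet_energy_kwh)
    return max(by_power, by_energy)

load_kw, runtime_min = 800.0, 5.0  # assumed critical load and bridge time

# A cabinet tuned for power delivery vs. one tuned for stored energy.
power_dense = cabinets_needed(load_kw, runtime_min,
                              cabinet_power_kw=250.0, cabinet_energy_kwh=25.0)
energy_dense = cabinets_needed(load_kw, runtime_min,
                               cabinet_power_kw=100.0, cabinet_energy_kwh=60.0)

print(f"Power-dense cabinets needed:  {power_dense}")   # 4 (limited by power)
print(f"Energy-dense cabinets needed: {energy_dense}")  # 8 (limited by power)
```

For a five-minute bridge the energy requirement is modest, so the cabinet count, and therefore the floor space, is dictated almost entirely by how fast each cabinet can discharge.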
Compatibility with legacy equipment
While hyperscalers can start from scratch, enterprises cannot ignore the equipment already in their data centers. Before lithium-ion arrived, virtually every data center ran on lead-acid batteries.
Because nickel-zinc batteries can use the same UPS charging system, data center operators can retrofit them into existing UPS equipment as a direct replacement.
Lithium-ion batteries, by contrast, require additional protection, so it can be easier to replace lead-acid batteries with nickel-zinc than to buy new lithium-ion systems. Lithium's volatile chemistry adds cost for ventilation, high-capacity fire suppression, enhanced fire ratings for indoor installation, and other safety measures that nickel-zinc does not require.
The bottom line is that all businesses, regardless of size, need to modernize their data center strategies to keep up with the promise of artificial intelligence. The opportunity to simply build a new data center won't always exist, but the right retrofit strategy will give businesses the impetus for change they need.