
How artificial intelligence is changing data center design

Oct 10, 2023
AI data center

With global spending on AI systems set to double between 2023 and 2026, it’s clear that data center capacity will increase rapidly to meet demand.

However, a surprising number of data center operators have hit the brakes on new projects and slowed investment over the past year, with vacant capacity in London falling by 6.3% in 2022-23.

What’s behind this counter-intuitive trend? To explain it, we need to understand some of the issues surrounding artificial intelligence computing and the infrastructure that supports it.

How artificial intelligence is changing data center infrastructure

Data centers have traditionally been built around CPU-based power provisioning to handle conventional computing workloads. AI computing, however, requires GPU-driven racks, which consume more power, dissipate more heat, and take up more space than CPUs of equivalent capacity. In practice, this means AI computing typically requires additional power connections or alternative cooling systems. Because this infrastructure is embedded in the fabric of the data center itself, replacing it later is extremely expensive, if not economically impossible.
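To make the contrast concrete, here is a back-of-envelope sketch of rack power density and the resulting cooling load. All figures (servers per rack, per-server draw, cooling overhead) are illustrative assumptions, not vendor specifications.

```python
# Back-of-envelope comparison of rack power density.
# All numbers are illustrative assumptions, not vendor specifications.

def rack_power_kw(servers_per_rack: int, kw_per_server: float) -> float:
    """Total IT power drawn by one rack, in kilowatts."""
    return servers_per_rack * kw_per_server

def cooling_load_kw(it_load_kw: float, overhead_factor: float = 0.3) -> float:
    """Rough cooling requirement: nearly all IT power becomes heat,
    plus an assumed 30% overhead for fans, pumps, and losses."""
    return it_load_kw * (1 + overhead_factor)

# Hypothetical traditional rack: 20 CPU servers at ~0.4 kW each.
traditional = rack_power_kw(servers_per_rack=20, kw_per_server=0.4)
# Hypothetical AI rack: 4 GPU servers at ~10 kW each.
ai_rack = rack_power_kw(servers_per_rack=4, kw_per_server=10.0)

print(f"Traditional rack: {traditional:.0f} kW IT, {cooling_load_kw(traditional):.1f} kW cooling")
print(f"AI rack:          {ai_rack:.0f} kW IT, {cooling_load_kw(ai_rack):.1f} kW cooling")
```

Even with these rough assumptions, the AI rack draws several times the power of the traditional one, which is why extra power feeds and alternative cooling are usually designed in from the start rather than retrofitted.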

In practice, operators must commit in advance to how much space in a new data center is dedicated to AI versus traditional computing.

Get this balance wrong by overcommitting to AI, and operators can be left with permanently underutilized, unprofitable capacity.

This problem is exacerbated by the fact that the AI market is still in its infancy; Gartner places it in a hype cycle where expectations are at their peak. As a result, many operators are choosing to hesitate in the design phase rather than commit prematurely to a proportion of AI computing in new data center projects.

Take a holistic approach during the design phase

At the same time, operators are acutely aware that delaying investment risks losing market share and competitive advantage. Threading this needle is a tall order given that many of the fundamentals of data center infrastructure are being rewritten in real time.

To balance first-mover advantage against these risks, operators need to design their data centers for maximum efficiency and flexibility in the era of AI computing. This requires a new, holistic approach to design.

1. Involve more stakeholders

However operators decide to split artificial intelligence from traditional computing, data center sites with AI computing capabilities will be far more complex than traditional facilities. Higher complexity means more points of failure, especially since AI computing places greater demands on infrastructure than traditional computing.

So, to ensure uptime over the life of a site and reduce the risk of costly problems, teams need to be more thorough in the data center planning phase.

In particular, the design phase should draw on the wider team's expertise from the very start of the project. In addition to power and cooling specialists, designers should work early with operations, cabling, and security teams to understand potential sources of bottlenecks or failures.

2. Integrating Artificial Intelligence into Data Center Operations

Since operators now have AI computing on site, they should take the opportunity to leverage AI to improve their own operational efficiency. Artificial intelligence has a long history of adoption in data centers, where the technology can execute workflows with extremely high precision and consistency. For example, artificial intelligence can help with:

  • Temperature and humidity monitoring
  • Safety system operations
  • Power usage monitoring and distribution
  • Hardware fault detection and prediction

By proactively using this technology at every stage of the data center lifecycle, operators can significantly improve operational efficiency and robustness. Artificial intelligence is well suited to the new challenges posed by the novel, complex layouts of these next-generation facilities, for example through fault detection and predictive maintenance.
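As a minimal sketch of the fault-detection idea above, the example below flags anomalous rack inlet temperatures with a simple rolling z-score. The window size, threshold, and readings are all hypothetical; production monitoring would use far richer models and telemetry.

```python
# Minimal sketch of temperature anomaly detection for predictive maintenance.
# Window size, threshold, and readings are hypothetical examples.
from statistics import mean, stdev

def find_anomalies(readings, window=5, z_threshold=2.0):
    """Return indices where a reading deviates sharply from the recent window."""
    anomalies = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            anomalies.append(i)
    return anomalies

# Hourly inlet temperatures (°C) for one rack; the spike at index 7 could
# indicate a failing fan or blocked airflow worth inspecting before it
# causes downtime.
temps = [24.1, 24.3, 24.0, 24.2, 24.4, 24.1, 24.3, 31.8, 24.2]
print(find_anomalies(temps))  # -> [7]
```

The same pattern generalizes from temperature to power draw, link errors, or fan speeds: detect the deviation early and schedule maintenance before a component fails outright.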
3. Invest in quality materials

Artificial intelligence puts a greater load on data centers at peak times, such as during training runs or when serving enterprise-grade models in production. At these times, AI computing will often far exceed the traditionally expected limits on power consumption, cooling requirements, and data throughput.

At the most basic level, this puts the underlying materials of the data center under much more stress. If those materials and components are not of high quality, they are more susceptible to failure. And because AI computing dramatically increases the number of components and connections in a site, cheaper, lower-quality materials that work well in traditional sites can bring a data center running AI computing to a halt.
To avoid the false economy of short-term savings, operators should not purchase inferior materials such as substandard cables. These materials are prone to failure and require frequent replacement. More seriously, the failure of substandard materials and components often causes site shutdowns or slowdowns, hurting profitability.
Solving Infrastructure Challenges

Although the infrastructure requirements of AI computing may be the main reason operators are delaying investment, delay is not a viable strategy in the long run.

As the market matures and uncertainty recedes, companies will refine their split between traditional computing and artificial intelligence computing, moving ever closer to their "sweet spot".

Against this backdrop, operators should ensure that their site operations have every possible advantage so that they can succeed as they learn and grow.

This means designing holistically from the beginning, leveraging AI itself to discover new efficiencies in their sites, and investing in quality materials that can meet the greater demands of AI computing.

