AI chips are out of stock globally!
Google’s CEO likened the AI revolution to humanity’s use of fire, but now the digital fire that fuels the industry—AI chips—is hard to come by.
The new generation of advanced chips that power AI workloads is almost entirely manufactured by NVIDIA. As ChatGPT's popularity has exploded, market demand for NVIDIA graphics processing units (GPUs) has far outstripped supply.
"Because there is a shortage, it comes down to who you know," said Sharon Zhou, co-founder and CEO of Lamini, a startup that helps companies build AI models such as chatbots. "It's like toilet paper during the pandemic."
The shortage has limited the computing power that cloud service providers such as Amazon and Microsoft can offer customers such as OpenAI, the creator of ChatGPT.
Even the world's most well-connected tech entrepreneurs are scrambling to secure supply. At the Congressional hearing on AI on May 16, OpenAI CEO Sam Altman said that because of computing bottlenecks, it would actually be better if fewer people used ChatGPT.
On May 23, Tesla CEO Elon Musk said at the Wall Street Journal CEO Council Summit: "GPUs are currently harder to obtain than drugs." Even so, Musk has fared far better than most.
Startups that were seeking Oracle's computing power earlier this year were suddenly told that a buyer had snapped up much of Oracle's spare server capacity, the Wall Street Journal reported. According to people familiar with the matter, the startups were informed that the buyer was Musk, who is developing a competing product under the name X.AI to challenge OpenAI's market position.
Without advanced graphics chips, large language models run far more slowly, a view widely shared among startup founders. NVIDIA's advanced GPUs offer excellent parallel computing capabilities, which are critical for AI workloads.
UBS analysts estimate that an early version of ChatGPT required approximately 10,000 NVIDIA GPUs, while Musk estimates that an updated version needs 3 to 5 times that number of advanced processors.
Nvidia recently said it was expanding supply to meet growing demand.
Nvidia CEO Jensen Huang said on Sunday that the company has increased production of the H100, its new flagship chip for generative artificial intelligence.
AI startups and investors are doing whatever they can to work around the chip shortage. Some investors are combing their networks for spare computing power, while others are placing orders for high-capacity processors and servers to share among the AI startups they back. Some startups are shrinking their AI models to improve efficiency; other founders are seeking introductions to salespeople at Amazon and Microsoft.
Sharon Zhou of Lamini said her company has the chips it needs. Lamini was co-founded with a former NVIDIA engineer, but she, like many other founders, declined to say how the chips were obtained.
Many AI founders expect the shortage of AI chips to last at least until next year.
Founders and investors say that even customers of established AI chip providers must wait weeks before they can use the hardware. As the CEO of one AI startup put it:
"Even if you have paid in advance, it does not mean the GPUs will be delivered the next day or even the next week. You can only wait."
The CEO of Supermicro, one of the world's largest server makers, said the company's backlog of graphics-chip systems has reached an all-time high and that it is rushing to expand production capacity.
The secondary market is on fire, and Nvidia's stock is soaring
The shortage has ignited a secondary market for AI chips, parts of which involve large cryptocurrency companies. These companies bought chips for mining during the crypto boom, but no longer need them now that the digital currency market has slumped.
Demand for Nvidia's products has driven the company's stock up about 167% this year. AI chip prices vary: some retailers sell Nvidia's advanced AI chips for about $33,000, though they can fetch even more on the secondary market because of the high demand.
Kanjun Qiu, CEO of artificial intelligence research firm General Intelligent, has been buying advanced graphics chips for her servers since last year, which has allowed her to weather the current shortage. A venture capitalist recently messaged her asking whether she had excess capacity that could be leased to other startups. Qiu has not yet decided whether to part with her chips.