
Who says Apple is falling behind? AI was not mentioned at WWDC, but large models were introduced in a low-key manner

PHPz
Release: 2023-06-06 22:20:46

At WWDC, Apple did not say a word about "artificial intelligence (AI)," nor did it mention other terms currently popular in the tech world, such as "ChatGPT."

What Apple did do was simply mention "machine learning (ML)" seven times.

Even when introducing Vision Pro, the AR headset it has spent seven years preparing, Apple only said that it "uses an advanced encoder-decoder neural network."


This is completely different from the "high-profile" approach that major Silicon Valley companies such as Microsoft and Google have taken in the current wave of large models (what might be called "AI hype").

Could it be that, as some experts and media outlets have claimed, Apple has fallen behind in the AI race? Or is it still waiting and watching? Actually, neither.

Although Apple didn't talk about (let alone tout) large AI models at WWDC, it did introduce some new AI-based features, such as improved iPhone autocorrect: pressing the space bar can now complete a word or even an entire sentence.

The feature is based on an ML system that uses a Transformer language model, making autocorrect more accurate than ever; the Transformer is also one of the key technologies underpinning ChatGPT.

Apple says it will even learn how each user texts and types in order to get better over time.


According to reports, the new autocorrect is powered by on-device machine learning, and Apple has been continuously improving these models over the years. With the power of Apple silicon, the iPhone can run this model every time the user taps a key.
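Apple has not published details of its keyboard model, but the basic idea of Transformer-based completion can be sketched with an openly available language model. The snippet below is purely illustrative: it uses the open GPT-2 model from Hugging Face's transformers library as a stand-in for whatever Apple actually runs on-device, and simply asks the model for the most likely next words given the text typed so far.

```python
# Illustrative only: GPT-2 stands in for Apple's (unpublished) on-device keyboard model.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "I'll be there in five"           # text the user has typed so far
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits        # shape: (1, seq_len, vocab_size)

# Take the distribution over the *next* token and show the top suggestions,
# roughly what a predictive keyboard surfaces above the space bar.
next_token_logits = logits[0, -1]
top = torch.topk(next_token_logits, k=5)
for token_id in top.indices:
    print(repr(tokenizer.decode(token_id)))
```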

"For those moments when you just want to type a ducking word, the keyboard learns on its own," said Craig Federighi, Apple's senior vice president of software engineering.


Another example is Apple's improvement to AirPods Pro, which "automatically turns off noise cancellation when the headphones detect that the user is in a conversation." Apple doesn't bill this as a machine learning feature, but it is a hard problem to solve, and the solution is based on AI models.
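Apple has not explained how this conversation detection works internally. The toy sketch below only illustrates the control logic of the feature; the crude RMS-energy check is a placeholder for the learned speech detector that presumably does the real work.

```python
# Toy illustration of "pause noise cancellation when the wearer starts talking".
# The RMS-energy threshold is a naive stand-in for Apple's (unpublished) ML speech detector.
import numpy as np

SAMPLE_RATE = 16_000
FRAME_LEN = SAMPLE_RATE // 50          # 20 ms of audio per frame

def wearer_is_speaking(frame: np.ndarray, threshold: float = 0.02) -> bool:
    """Naive voice-activity check: frame RMS energy above a fixed threshold."""
    return float(np.sqrt(np.mean(frame ** 2))) > threshold

def anc_should_be_active(frame: np.ndarray) -> bool:
    """Keep noise cancellation on only while no speech is detected."""
    return not wearer_is_speaking(frame)

# Simulated frames: quiet background noise, then louder speech-like audio.
quiet = np.random.randn(FRAME_LEN) * 0.005
loud = np.random.randn(FRAME_LEN) * 0.1
print(anc_should_be_active(quiet))   # True  -> keep ANC on
print(anc_should_be_active(loud))    # False -> pause ANC for the conversation
```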

In addition, new features such as identifying the fields to be filled in within a PDF, or recognizing your pet (and then grouping all of that pet's photos into one folder), are also based on Apple's neural network research in these areas.
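Apple has not said which networks power these features. As a rough illustration of the pet-grouping idea, the sketch below uses an off-the-shelf ImageNet classifier from torchvision to flag photos containing cats or dogs and copy them into a separate album; the folder names are hypothetical and the stock model is only a stand-in for Apple's own.

```python
# Illustrative only: a stock ImageNet classifier stands in for Apple's pet-recognition model.
import shutil
from pathlib import Path

import torch
from PIL import Image
from torchvision import models

weights = models.ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights).eval()
preprocess = weights.transforms()

PHOTOS = Path("photos")           # hypothetical input folder
PET_ALBUM = Path("photos/pets")   # hypothetical output folder
PET_ALBUM.mkdir(parents=True, exist_ok=True)

for path in PHOTOS.glob("*.jpg"):
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        pred = model(img).argmax(dim=1).item()
    # In ImageNet, indices 151-268 are dog breeds and 281-285 are domestic cats.
    if 151 <= pred <= 268 or 281 <= pred <= 285:
        shutil.copy(path, PET_ALBUM / path.name)
```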

At WWDC, Apple did not talk about its specific AI models, their training data, or possible future improvements; it simply mentioned that "these features are supported by cool technology."

Unlike its competitors, which use server clusters, supercomputers and terabytes of data to build larger models, Apple wants to build AI models on its devices.

Features like the new autocorrect are built on this idea and run directly on the iPhone, whereas models like ChatGPT require hundreds of expensive GPUs to train.

The advantage is that running AI on-device sidesteps many of the data privacy issues that cloud-based AI faces. When a model can run on the phone itself, Apple needs to collect far less data to make it work.
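Apple's tooling for this on-device approach is Core ML: a model is trained wherever convenient, converted, and then executed locally on the device, so raw user data never has to leave the phone. The sketch below shows the general shape of that conversion workflow with coremltools, using a tiny placeholder PyTorch model rather than anything Apple actually ships.

```python
# Sketch of the on-device workflow: convert a (placeholder) PyTorch model to Core ML
# so it can run locally on Apple silicon instead of on a remote server.
import torch
import coremltools as ct

class TinyClassifier(torch.nn.Module):
    """Stand-in model; Apple's real on-device networks are not public."""
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(16, 32),
            torch.nn.ReLU(),
            torch.nn.Linear(32, 2),
        )

    def forward(self, x):
        return self.net(x)

model = TinyClassifier().eval()
example = torch.rand(1, 16)
traced = torch.jit.trace(model, example)

# Convert to a Core ML package that an iOS/macOS app can bundle and run offline.
mlmodel = ct.convert(traced, inputs=[ct.TensorType(shape=example.shape)],
                     convert_to="mlprogram")
mlmodel.save("TinyClassifier.mlpackage")
```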

It is worth noting that Apple also announced the latest member of the M2 chip family, the M2 Ultra. Built on a second-generation 5nm process, it has up to a 24-core CPU, a 76-core GPU, and a 32-core Neural Engine capable of 31.6 trillion operations per second.


Apple says this capability may come in handy when training "large Transformer models":

"The M2 Ultra supports up to 192GB of unified memory, 50% more than the M1 Ultra, which enables it to complete tasks other chips cannot. For example, in a single system it can train huge ML workloads, such as large Transformer models that even the most powerful discrete GPU cannot handle because it runs out of memory."

The advent of M2 Ultra has excited some artificial intelligence experts.

“Whether by accident or by design, Apple silicon's unified memory architecture means that high-end Macs are now truly amazing machines for running large AI models and conducting AI research,” Perry E. Metzger said on Twitter. “There really aren't many other systems at this price point that offer 192GB of GPU-accessible memory.”


More memory means that larger, more capable AI models can fit entirely on the machine, potentially giving many people the opportunity to train AI on their personal computers.
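A rough back-of-the-envelope calculation shows why 192GB matters for training. Assuming standard mixed-precision training with the Adam optimizer (fp16 weights and gradients plus fp32 master weights and two optimizer moments, roughly 16 bytes per parameter before counting activations), the sketch below estimates how many parameters fit in a given memory budget.

```python
# Back-of-the-envelope memory estimate for mixed-precision Adam training.
# Rule of thumb: ~16 bytes per parameter (fp16 weights + grads, fp32 master
# weights + two Adam moments), ignoring activations and framework overhead.
BYTES_PER_PARAM = 16

def max_trainable_params(memory_gb: float) -> float:
    """Roughly how many parameters fit in `memory_gb` of accelerator-visible memory."""
    return memory_gb * 1e9 / BYTES_PER_PARAM

for label, mem_gb in [("M2 Ultra unified memory", 192),
                      ("M1 Ultra unified memory", 128),
                      ("Typical 24GB discrete GPU", 24)]:
    print(f"{label}: ~{max_trainable_params(mem_gb) / 1e9:.1f}B parameters")
```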

Although there are no published benchmarks comparing the M2 Ultra with NVIDIA's A100 (let alone the H100), for now at least, Apple has publicly entered the field of generative AI training hardware.
