
Using Apple Vision Pro to control robots from a distance, NVIDIA: It's not difficult to "integrate man and machine"

WBOY
Release: 2024-08-01 03:16:33
Jensen Huang said: "The next wave of AI is robots, and one of the most exciting developments is humanoid robots." Today, Project GR00T has taken another important step.

Yesterday, NVIDIA founder Jensen Huang discussed the company's general-purpose foundation model for humanoid robots, "Project GR00T", in his SIGGRAPH 2024 keynote. The model has received a series of functional updates.

Yuke Zhu, Assistant Professor at the University of Texas at Austin and NVIDIA Senior Research Scientist, posted a video demonstrating how NVIDIA integrates RoboCasa, a large-scale simulation training framework for general-purpose household robots, and the MimicGen system into the NVIDIA Omniverse platform and the Isaac robot development platform.


The work covers NVIDIA's three computing platforms, including NVIDIA AI, Omniverse, and Jetson Thor, and leverages them to simplify and accelerate developer workflows. With these platforms working in concert, we can expect to enter an era of humanoid robots driven by physical AI.

The biggest highlight is that developers can use Apple Vision Pro to remotely control humanoid robots to perform tasks.

Meanwhile, another NVIDIA senior research scientist, Jim Fan, said the updates to Project GR00T are exciting: NVIDIA uses a systematic approach to scaling robotics data to solve the toughest challenges in robotics.

The idea is simple: humans collect demonstration data on real robots, and NVIDIA expands that data a thousandfold or more in simulation. With GPU-accelerated simulation, computing power can now be traded for the time-consuming, labor-intensive, and costly work of human data collection.

Fan said that not long ago he believed teleoperation was fundamentally unscalable, because in the physical world each robot is limited to 24 hours of data per day. NVIDIA's new synthetic data pipeline for GR00T breaks this limitation in the digital world.
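The "24 hours per robot per day" limit and the way simulation breaks it can be made concrete with some back-of-envelope arithmetic. The sketch below is illustrative only; the environment count and speed-up factor are assumptions, not NVIDIA figures.

```python
# A real robot can record at most 24 hours of demonstrations per day.
# GPU simulation runs many environments in parallel, each potentially
# faster than real time, so experience scales with compute instead.

REAL_HOURS_PER_DAY = 24  # hard physical limit for one robot

def simulated_hours_per_day(n_envs: int, realtime_factor: float) -> float:
    """Hours of experience one GPU setup can generate per wall-clock day."""
    return 24 * n_envs * realtime_factor

# Assumed example: 1,000 parallel environments, each 10x faster than real time.
sim_hours = simulated_hours_per_day(n_envs=1000, realtime_factor=10.0)
multiplier = sim_hours / REAL_HOURS_PER_DAY
print(f"{sim_hours:,.0f} simulated hours/day = {multiplier:,.0f}x one real robot")
```

Under these assumed numbers, a single wall-clock day yields the equivalent of 10,000 robot-days of data, which is the sense in which compute replaces human collection time.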


Regarding NVIDIA's latest progress in the field of humanoid robots, some netizens commented that Apple Vision Pro has found its best use case yet.

NVIDIA is beginning to lead the next wave: physical AI.

NVIDIA also detailed the technical process of accelerating humanoid robots in a blog. The full content is as follows:

To accelerate the development of humanoid robots worldwide, NVIDIA announced a set of services, models and computing platforms for the world's leading robot manufacturers, AI model developers and software manufacturers to develop, train and build the next generation of humanoid robots.
This suite includes new NVIDIA NIM microservices and frameworks for robot simulation and learning, the NVIDIA OSMO orchestration service for running multi-stage robotics workloads, and AI- and simulation-enabled teleoperation workflows that let developers train robots with small amounts of human demonstration data.

Jensen Huang said: "The next wave of AI is robots, and one of the most exciting developments is humanoid robots. We are making the entire NVIDIA robotics stack open and accessible to humanoid robot developers and companies around the world, allowing them to use the platforms, acceleration libraries, and AI models that best fit their needs."


Accelerate development with NVIDIA NIM and OSMO

NIM microservices are pre-built containers powered by NVIDIA inference software that let developers cut deployment time from weeks to minutes.

Two new AI microservices will allow robotics experts to enhance generative physics AI simulation workflows in NVIDIA Isaac Sim.

The MimicGen NIM microservice generates synthetic motion data from teleoperation recordings captured with spatial computing devices such as the Apple Vision Pro. The Robocasa NIM microservice generates robot tasks and simulation environments in OpenUSD.

The cloud-native managed service NVIDIA OSMO is now available, allowing users to orchestrate and scale complex robotics development workflows across distributed computing resources, whether on premises or in the cloud. OSMO greatly simplifies robot training and simulation workflows, shortening deployment and development cycles from months to under a week.
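To illustrate what "orchestrating a multi-stage workload" means in general terms, here is a minimal conceptual sketch. It is emphatically not the OSMO API (which the article does not detail); stage names and task labels are invented, and the fan-out uses Python's standard `concurrent.futures` purely as a stand-in for distributed compute.

```python
# Conceptual sketch only, NOT NVIDIA OSMO. It models a multi-stage
# robotics workflow (simulate -> generate data -> train): stages run
# in order, while the tasks inside each stage fan out across workers.
from concurrent.futures import ThreadPoolExecutor

def run_stage(name: str, tasks: list[str]) -> list[str]:
    """Run one stage's tasks in parallel and collect their results."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda t: f"{name}:{t}:done", tasks))

# Hypothetical pipeline; in a real orchestrator each stage would be
# scheduled onto whatever on-prem or cloud resources are available.
pipeline = [
    ("simulate", ["env-0", "env-1", "env-2"]),
    ("generate_data", ["batch-0", "batch-1"]),
    ("train", ["gr00t-run"]),
]
results = [run_stage(name, tasks) for name, tasks in pipeline]
print(results[-1])  # ['train:gr00t-run:done']
```

The value of a managed orchestrator is precisely that this scheduling bookkeeping, which here is three lines of toy code, is handled for the developer at data-center scale.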

Advanced data capture workflows for humanoid robot developers

Training the foundation models behind humanoid robots requires vast amounts of data. One way to obtain human demonstration data is teleoperation, but collecting it this way is increasingly expensive and time-consuming.

With the NVIDIA AI and Omniverse teleoperation reference workflow demonstrated at the SIGGRAPH computer graphics conference, researchers and AI developers can generate large amounts of synthetic motion and perception data from a very small number of remotely captured human demonstrations.


First, developers use Apple Vision Pro to capture a handful of teleoperated demonstrations. They then replay the recordings in NVIDIA Isaac Sim and use the MimicGen NIM microservice to generate synthetic datasets from them.

Developers train the Project GR00T humanoid foundation model on both real and synthetic data, saving significant time and cost. They then use the Robocasa NIM microservice in Isaac Lab, a robot learning framework, to generate experiences for retraining the robot model. Throughout the workflow, NVIDIA OSMO seamlessly distributes computing tasks across the available resources, saving developers weeks of administrative work.
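The four-step workflow described above can be sketched as a data-flow skeleton. Every function name below is a placeholder standing in for an NVIDIA tool (Vision Pro capture, MimicGen, GR00T training, Robocasa), not a real API; the augmentation factor is an assumption matching the "thousandfold" figure mentioned earlier.

```python
# Hypothetical sketch of the capture -> augment -> train -> retrain
# workflow. All names are placeholders, not NVIDIA APIs.

def capture_demos(n: int) -> list[str]:
    """Step 1: a handful of Vision Pro teleoperation recordings."""
    return [f"demo-{i}" for i in range(n)]

def mimicgen_augment(demos: list[str], factor: int) -> list[str]:
    """Step 2: expand each recording into many synthetic variants."""
    return [f"{d}/synthetic-{k}" for d in demos for k in range(factor)]

def train_gr00t(real: list[str], synthetic: list[str]) -> dict:
    """Step 3: train the foundation model on real + synthetic data."""
    return {"model": "gr00t", "samples": len(real) + len(synthetic)}

def robocasa_retrain(model: dict, n_tasks: int) -> dict:
    """Step 4: generated tasks/environments supply retraining experience."""
    return {**model, "retrain_tasks": n_tasks}

demos = capture_demos(5)
synthetic = mimicgen_augment(demos, factor=1000)   # 5 demos -> 5,000 variants
model = robocasa_retrain(train_gr00t(demos, synthetic), n_tasks=100)
print(model["samples"], model["retrain_tasks"])    # 5005 100
```

The point of the skeleton is the ratio: five human recordings feed a training set three orders of magnitude larger, which is why the article calls the human-demonstration requirement "small."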

Expand access to NVIDIA humanoid robot developer technology

NVIDIA offers three computing platforms to simplify humanoid robot development: the NVIDIA AI supercomputer for training models; NVIDIA Isaac Sim, built on Omniverse, where robots can learn and refine skills in simulated worlds; and the NVIDIA Jetson Thor humanoid robot computer for running the models. Developers can access and use all or part of the platforms based on their specific needs.

Through the new NVIDIA Humanoid Developer Program, developers can gain early access to new products and the latest versions of NVIDIA Isaac Sim, NVIDIA Isaac Lab, Jetson Thor, and the Project GR00T general-purpose humanoid foundation models.

1x, Boston Dynamics, ByteDance, Field AI, Figure, Fourier, Galbot, LimX Dynamics, Mentee, Neura Robotics, RobotEra and Skild AI are the first companies to join the early access program.

Developers can now join the NVIDIA Humanoid Developer Program to gain access to NVIDIA OSMO and Isaac Lab, and will soon gain access to NVIDIA NIM microservices.

Blog link:
https://nvidianews.nvidia.com/news/nvidia-accelerates-worldwide-humanoid-robotics-development


Source: jiqizhixin.com