
This robot can 'drive itself' without a battery, and its runtime is effectively unlimited

Oct 02, 2023, 08:21 AM
Tags: robot, autonomous driving, battery life

A new kind of "car" has arrived: it drives itself without a battery installed, and it even harvests its own energy as it goes, so there is no range anxiety to speak of (tongue firmly in cheek).


This little robot, called MilliMobile, comes from the University of Washington. It is powered by light and radio waves.

Although it is only about the size of a fingernail and weighs roughly as much as a raisin, it can easily carry a payload three times its own weight. And it runs not only on concrete but also moves freely over "rural dirt roads".


△ The first battery-free autonomous self-driving robot. Image source: University of Washington

Small robots like this typically carry sensors for industrial tasks such as detecting gas leaks or tracking warehouse inventory. But they face a basic problem: disposable batteries not only limit the robot's lifespan, they are also far from environmentally friendly.

Researchers have been pursuing alternatives, such as strapping sensors directly onto insects.


△ Source: University of Washington

But the University of Washington researchers evidently found those earlier approaches too hard to control. Their new idea is to drive the robot with "intermittent motion".

Put simply, one side of the approach is shrinking the robot's size and weight so that it can operate on extremely little power (under 57 microwatts).

The other side is that the researchers fitted MilliMobile with thin-film capacitors to store the energy harvested from sunlight or radio waves. Once the stored energy crosses a threshold, it drives the motors in a short pulse that nudges the robot forward.
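The paper's actual control circuitry is not reproduced here, but the charge-then-pulse idea is easy to sketch. Below is a minimal, hypothetical simulation of that duty cycle; the harvesting power, capacitance, and voltage thresholds are placeholder values chosen for illustration, not figures from the paper.

```python
# Minimal simulation of "intermittent motion": a capacitor harvests a trickle of
# power, and the motors fire a short pulse each time a voltage threshold is reached.
# All numbers below are illustrative placeholders, not values from the paper.

HARVEST_POWER_W = 50e-6      # assumed harvested power (tens of microwatts)
CAPACITANCE_F = 47e-6        # assumed thin-film capacitor size
V_ON, V_OFF = 3.0, 2.2       # assumed comparator thresholds around each pulse
PULSE_ENERGY_J = 0.5 * CAPACITANCE_F * (V_ON**2 - V_OFF**2)  # energy spent per pulse
DT = 0.01                    # simulation time step (s)

def simulate(seconds: float) -> int:
    """Return how many motion pulses fire within the given time window."""
    e_on = 0.5 * CAPACITANCE_F * V_ON**2   # stored energy needed to trigger a pulse
    energy = 0.0                           # energy currently stored in the capacitor (J)
    pulses = 0
    t = 0.0
    while t < seconds:
        energy += HARVEST_POWER_W * DT     # trickle charge from light / radio waves
        if energy >= e_on:                 # threshold reached: fire the motors
            energy -= PULSE_ENERGY_J       # the pulse drains the capacitor partway
            pulses += 1
        t += DT
    return pulses

if __name__ == "__main__":
    print(f"Pulses in one hour: {simulate(3600)}")
```

In this toy model, a few tens of microwatts of harvested power is enough to fire a motor pulse every couple of seconds, which is the intuition behind the robot's slow but steady crawl.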


At this point you may be wondering: that's it? Can this robot really get anywhere?

The researchers put it to the test: even on an overcast day, MilliMobile covered 10 meters in an hour.

That is hardly fast, but the researchers argue that a robot able to keep operating at this pace brings new capabilities to places where deploying sensors to collect data has traditionally been difficult.

As mentioned above, MilliMobile is small but fully equipped. Onboard it carries:

  • 4 photodiodes that sense light intensity in four directions, letting the robot find light sources to charge from on its own
  • Temperature and humidity sensor
  • Accelerometer
  • Magnetic sensor
  • Gas sensor
  • Miniature camera
  • Wireless communication chip

With this sensing suite, MilliMobile can read the terrain on its own and drive autonomously.

It can also turn itself toward a light source to recharge.
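With four directional photodiodes, light seeking essentially reduces to comparing readings and pivoting toward the brightest direction. Here is a minimal sketch of that idea; the readings dictionary and the (left, right) wheel-speed interface are assumptions made for illustration, not MilliMobile's actual firmware.

```python
# Hypothetical light-seeking step for a robot with four directional photodiodes.
# The readings dict and the (left, right) wheel-speed output are illustrative
# stand-ins for hardware access, not MilliMobile's actual firmware interface.

from typing import Dict, Tuple

DIRECTIONS = ("front", "back", "left", "right")

def light_seeking_step(readings: Dict[str, float]) -> Tuple[float, float]:
    """Map one light reading per direction to (left, right) wheel speeds in [-1, 1]
    that turn the robot toward the brightest direction."""
    brightest = max(DIRECTIONS, key=lambda d: readings[d])
    if brightest == "front":
        return (1.0, 1.0)    # already facing the light: drive straight toward it
    if brightest == "left":
        return (-0.5, 0.5)   # pivot left
    if brightest == "right":
        return (0.5, -0.5)   # pivot right
    return (0.5, -0.5)       # brightest is behind: keep pivoting until it comes around

# Example: the right-hand photodiode reads highest, so the robot pivots right.
print(light_seeking_step({"front": 0.2, "back": 0.1, "left": 0.3, "right": 0.9}))
```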


And by sampling widely across space, it can build a more detailed picture of its environment.


Through software-level optimization of its synchronized transmission protocol, it can also send data over a range of up to 200 meters. In short, MilliMobile has achieved autonomy in power, control, and communication.

One technology site's verdict: this work has the flavor of science fiction brought into reality. What do you think?



Reference links:

  • [1] https://www.washington.edu/news/2023/09/27/millimobile-battery-free-autonomous-self-driving-robot-solar/
  • [2] Paper: https://homes.cs.washington.edu/~vsiyer/Papers/millimobile-compressed.pdf
