Table of Contents
Opportunities for the development of autonomous driving
Stronger perception capabilities, effectively alleviating urban congestion
Vehicle detection capabilities are limited and greatly affected by weather
The commercialization of L3 level autonomous driving is accelerating

Large-scale commercial use is imminent, and autonomous driving has a “big future”

Apr 09, 2023, 11:11 AM
Tags: Internet of Things, autonomous driving, sensors

Autonomous driving has become one of the key development directions of the automotive industry and a technological topic the whole of society follows closely. While mainstream car manufacturers explore autonomous driving technology, the industry's development has also drawn the attention of scholars.


On the evening of June 17, the well-known economist Ren Zeping's speech, themed "Ignite Hope - Looking for New Opportunities for China's Economy," was broadcast on Beijing Satellite TV. The speech covered the new economy, new infrastructure, new opportunities, and other economic topics of broad public concern. In the program, Ren Zeping argued that autonomous driving "has a great future."

Opportunities for the development of autonomous driving

Driven by the "double carbon" goal, the electric vehicle industry is developing very rapidly. As of the end of 2021, the number of new energy vehicles in the country reached 7.84 million. The electrification of the automotive industry has become the soil for the growth of autonomous driving technology. In addition, the implementation of policies also provides direction for the development of the autonomous driving industry.

Stronger perception capabilities, effectively alleviating urban congestion

Autonomous driving has clear advantages over traditional driving in several respects. First, it offers stronger perception and shorter reaction times than a human driver. It can also connect to the Internet of Everything and enhance Internet of Vehicles safety services. And because there is no fatigue driving or similar human behavior, the safety margin on long-distance trips is higher.

Second, it can save labor costs. With the help of autonomous driving technology, people are freed from arduous driving tasks, and the time freed up can be used to create more social value.

Third, it can alleviate urban traffic congestion. Traffic congestion has become a problem in the development of many cities; besides the growing number of urban vehicles, it is also related to drivers' improper driving behavior. At a technical level, autonomous driving can effectively reduce jams caused by cutting in, stalling, and other factors, and it can automatically plan the best route based on current road conditions to avoid worsening congestion, as sketched below.
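To make the route-planning point concrete, here is a minimal sketch in Python of congestion-aware shortest-path search using Dijkstra's algorithm. The road graph, travel times, and congestion factors are toy assumptions for illustration only; this is not the algorithm of any particular production navigation stack.

```python
# Minimal sketch of congestion-aware route planning (illustrative only).
# Edge weights combine a base travel time with a hypothetical congestion factor
# reported for current road conditions.
import heapq

def best_route(graph, start, goal):
    """graph: {node: [(neighbor, base_minutes, congestion_factor), ...]}
    Returns (total_cost, path) using Dijkstra's algorithm."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, base, congestion in graph.get(node, []):
            if neighbor not in visited:
                # Congested edges cost proportionally more time.
                heapq.heappush(queue, (cost + base * congestion, neighbor, path + [neighbor]))
    return float("inf"), []

# Example: the direct road A->C is congested, so the planner detours via B.
roads = {
    "A": [("B", 4, 1.0), ("C", 5, 3.0)],
    "B": [("C", 4, 1.0)],
}
print(best_route(roads, "A", "C"))  # (8.0, ['A', 'B', 'C'])
```

The same idea scales up in practice by re-running the search whenever live traffic data changes the congestion factors.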

Vehicle detection capabilities are limited and greatly affected by weather

At present, autonomous driving technology is not yet mature. Kobi Marenko, CEO of Israeli sensor startup Arbe Robotics, has noted that radar resolution and field of view limit a vehicle's detection ability, and that sensor performance is greatly affected by rain and fog. In fact, today's autonomous driving relies heavily on the sensing capabilities of sensors. According to statistics, a smart car generally carries anywhere from dozens to hundreds of sensors; together they form the smart car's perception network and provide the technical foundation for autonomous driving.
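As a rough illustration of how readings from multiple sensors can be combined into one estimate, and how weather can shift the balance between them, here is a minimal confidence-weighted fusion sketch in Python. The `Reading` type, `fuse_distance` helper, and the specific confidence values are hypothetical assumptions for this example, not any production perception stack.

```python
# Minimal sketch of confidence-weighted sensor fusion (illustrative assumption).
# Each sensor reports a distance estimate plus a confidence; bad weather is
# assumed to lower camera confidence far more than radar confidence.
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str        # e.g. "radar", "camera"
    distance_m: float  # estimated distance to the object
    confidence: float  # 0.0 .. 1.0

def fuse_distance(readings):
    """Weighted average of distance estimates, weighted by confidence."""
    total_weight = sum(r.confidence for r in readings)
    if total_weight == 0:
        raise ValueError("no usable sensor readings")
    return sum(r.distance_m * r.confidence for r in readings) / total_weight

# Clear weather: camera and radar agree closely and both are trusted.
clear = [Reading("radar", 52.0, 0.8), Reading("camera", 50.0, 0.9)]
# Heavy rain: the camera's confidence (hypothetically) drops, so radar dominates.
rain = [Reading("radar", 52.0, 0.8), Reading("camera", 45.0, 0.2)]

print(round(fuse_distance(clear), 1))  # 50.9
print(round(fuse_distance(rain), 1))   # 50.6
```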

As the technology develops, standards related to autonomous driving are also being refined, and clarifying responsibility in autonomous driving is a key focus.

In the "Grading of Automobile Driving Automation" implemented in March 2022, autonomous driving levels are divided into five levels, which clarifies the driving responsibilities that drivers should bear at each level. Driving work requires the driver and the driving automation system to work together. The driver should also bear the responsibility for emergencies and intervene in driving when necessary to ensure vehicle safety.

The commercialization of L3 level autonomous driving is accelerating

Since 2020, autonomous taxis have been put into trial operation in multiple intelligent connected-vehicle demonstration zones such as Beijing and Shanghai, attracting the attention of many consumers.

At present, self-driving taxis from SAIC, Baidu, Didi, T3 Travel, Pony.ai, WeRide, and many other companies have begun pilot commercial operations. It is worth noting that the driverless technology in these robotaxis is still at the L3 level, but some companies are already exploring higher-level autonomous driving. On January 20, 2022, Pony.ai disclosed for the first time the exterior design, sensor suite, and computing platform of its sixth-generation autonomous driving software and hardware system, designed for automotive-grade mass production at L4. Road testing will begin in China this year, and the system is expected to enter daily operation in self-driving mobility services in the first half of 2023.

It is foreseeable that, with the continued advancement of sensor and Internet of Things technology, the rollout of L4 autonomous driving, and ongoing improvements in network and information security, the application scope of autonomous driving will expand further.
