At Tesla's Autonomy Day, CEO Elon Musk blasted the technical limitations of lidar as a self-driving sensor, calling it "an extremely expensive and useless technology." The remarks stunned the industry and forced a rethink of the standing of the three main autonomous-driving sensors: millimeter-wave radar, vision systems (cameras), and lidar.
Autonomous driving requires multi-sensor fusion
Safe operation of an autonomous vehicle involves three essential steps. First comes the perception stage, which gathers information about the external environment and the vehicle's own operating state. Next is the decision-making stage, which issues driving instructions, relying mainly on algorithms running on various control chips and in the cloud. Finally, the execution stage turns the front-end decisions into vehicle actions. The perception stage is thus the first guarantee of safe driving, serving as the car's "eyes and ears."
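The three stages above can be sketched as a minimal pipeline. All names and thresholds here are illustrative assumptions, not taken from any real autonomous-driving stack:

```python
from dataclasses import dataclass

# Illustrative sketch of the perception -> decision -> execution pipeline.
# The 2-second time-to-collision threshold is an assumption for the example.

@dataclass
class Perception:
    obstacle_distance_m: float   # fused estimate of distance to the nearest obstacle
    ego_speed_mps: float         # the vehicle's own operating state

def decide(p: Perception) -> str:
    """Decision stage: turn perceived state into a driving instruction."""
    # Brake if the time to reach the obstacle falls under 2 seconds.
    if p.ego_speed_mps > 0 and p.obstacle_distance_m / p.ego_speed_mps < 2.0:
        return "brake"
    return "cruise"

def execute(command: str) -> str:
    """Execution stage: map the upstream decision onto an actuator action."""
    actions = {"brake": "apply_brakes", "cruise": "hold_throttle"}
    return actions[command]

state = Perception(obstacle_distance_m=15.0, ego_speed_mps=10.0)
print(execute(decide(state)))  # 15 m / 10 m/s = 1.5 s < 2 s, so "apply_brakes"
```

The point of the sketch is the separation of concerns: perception produces state, decision produces a command, and execution touches the actuators, exactly the division the paragraph describes.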
Because road environments and driving conditions are complex, no single-function sensor can capture all the vehicle-relevant information that changes in real time while driving. Yi Jihui, vice president of global marketing and application engineering at ON Semiconductor's Intelligent Sensing Group, said that sensor fusion (combining vision systems, millimeter-wave radar, and lidar) and deeper sensor perception are the future trends for autonomous-vehicle sensing systems.
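One common way to combine readings from several sensors is inverse-variance weighting: the more precise a sensor is, the more weight its measurement gets. The sketch below fuses three hypothetical range estimates; the noise figures are made-up assumptions, not real sensor specifications:

```python
# Toy illustration of sensor fusion via inverse-variance weighting.
# Each measurement is a (value, variance) pair; lower variance -> more weight.

def fuse(measurements):
    """Return the inverse-variance-weighted mean of (value, variance) pairs."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    return sum(w * v for w, (v, _) in zip(weights, measurements)) / total

readings = [
    (20.5, 4.0),   # camera: coarse depth estimate from vision (assumed noise)
    (19.8, 1.0),   # millimeter-wave radar: good range accuracy (assumed noise)
    (20.0, 0.25),  # lidar: best range accuracy, so it dominates the fusion
]
print(round(fuse(readings), 2))  # -> 19.99
```

The fused value sits closest to the lidar reading because lidar was given the smallest variance, which mirrors why lidar is so valued for ranging in the first place.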
Can lidar be replaced?
To cut the cost of the lidar component, some automakers, led by Tesla, have simply blacklisted lidar and pursued other technical routes. According to Tesla's official website, a standard Tesla carries eight cameras, one 77 GHz millimeter-wave radar, and twelve ultrasonic sensors. Lu Wenliang, general manager of CCID Consulting's Automotive Industry Research Center, noted that although lidar is absent from Tesla's configuration list, Tesla must make its other sensors perform better, which in turn demands more powerful back-end sensor chips and processing chips.
Seen in this light, Musk's "lidar is for fools" argument did not arrive at this moment by accident; sensor strategy is indeed being rethought. Jin Hao, general manager of the Automotive Radar Division of Beijing Science and Technology Lei Ke Electronic Information Technology Co., Ltd., told China Electronics News that bringing down the cost of lidar will take time. Within research institutions, lidar will remain the primary environment-sensing sensor in the short term; for commercially oriented companies, however, choosing alternative technology routes that bypass lidar will become a trend.
Jin Hao added that if lidar costs can fall below the thousand-yuan level in the future, its chances of becoming the core sensor for driverless vehicles will rise greatly. Reducing that cost, however, remains the biggest challenge. Tesla appears to have abandoned lidar completely: it collects data through a multi-camera setup, trains neural networks with a simulator that reproduces real-world environments, and uses machine vision to give the vehicle "cognition" of road and traffic conditions. How reliable this approach is remains hard to judge.
Improving sensor performance is the only path forward for autonomous driving
Tesla's pride in its autonomous-driving system rests largely on strong vision hardware. Its vision suite reportedly comprises eight fisheye, standard, and telephoto cameras: a reversing camera at the rear, a forward-facing trinocular camera, and two side-view cameras on each side, whose side-front and side-rear fields of view overlap to eliminate blind spots. This is basically enough to support Tesla's L3-level functions such as lane changing, merging, and highway driving.
Guo Yuansheng, deputy director of the Central Science and Technology Committee of the Jiu San Society and vice chairman of the China Sensors and Internet of Things Industry Alliance, said that cameras still have considerable room for improvement in dynamic range and near-infrared sensitivity, and that each vehicle will need more and better cameras. Jin Wei said that multi-eye stereo cameras are the development trend for future vision systems, expected to combine the advantages of lidar and cameras: the high-density range point cloud of lidar, which directly extracts obstacles and measures distance accurately, together with the camera's visual-recognition and machine-learning capabilities.
Second, more signal-processing algorithms and new radar technologies will be brought into millimeter-wave radar development, providing a more sensitive system for detecting obstacles. Guo Yuansheng said that the millimeter-wave radar modules used in automobiles in recent years contain multiple chips built on different processes, a relatively clumsy and costly architecture. In pursuit of smaller size and lower cost, countries are working on integrated multi-function components and standardized common radar chipsets.
Yi Jihui said that SPAD (single-photon avalanche diode) technology will be combined with millimeter-wave radar and is expected to increase radar sensitivity many times over. Building on this, Jin Hao said the future direction for millimeter-wave radar is point-cloud radar: obtaining high-resolution point-cloud imaging capability at low cost, gradually replacing lidar and opening up more application scenarios.
Finally, if lidar's precision detection, high resolution, and other capabilities are further enhanced, and even combined with the functions of other sensors, lidar can be expected to take on more complex and demanding autonomous-driving environments. Yi Jihui said that image-sensing technology can help advance lidar. Researchers at the University of Shanghai for Science and Technology have reportedly developed a new laser-based sensing system capable of imaging objects 45 kilometers (28 miles) away through smoggy urban air. The technique uses a single-photon detector combined with a unique algorithm that "weaves" extremely sparse data points together into ultra-high-resolution images. This new lidar vision technology pushes well past conventional diffraction-limited resolution and is expected to open new ground for high-resolution, fast, low-power 3D optical imaging over long distances.
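The core idea behind single-photon lidar can be illustrated with a toy simulation: even when each laser pulse returns only a few photons, accumulating photon arrival-time histograms over many pulses lets the true round-trip time, and hence the target distance, stand out from background noise. All numbers below are illustrative assumptions, not figures from the cited work:

```python
# Toy single-photon lidar: recover a target range from sparse photon returns
# by histogramming arrival times over many pulses. Detection probabilities,
# timing jitter, and noise rates are invented for the example.
import random

C = 3e8                      # speed of light, m/s
TRUE_DISTANCE = 1000.0       # assumed target range in metres
BIN_NS = 10                  # histogram bin width in nanoseconds

random.seed(0)               # deterministic run for reproducibility
hist = [0] * 1000
for _ in range(5000):        # many pulses, each with sparse returns
    if random.random() < 0.1:          # a signal photon is detected
        t_ns = 2 * TRUE_DISTANCE / C * 1e9 + random.gauss(0, 3)  # jittered time
        hist[int(t_ns / BIN_NS)] += 1
    if random.random() < 0.3:          # a background/noise photon arrives
        hist[random.randrange(1000)] += 1

peak_bin = hist.index(max(hist))       # the bin where signal accumulated
est_distance = (peak_bin + 0.5) * BIN_NS * 1e-9 * C / 2
print(f"estimated range = {est_distance:.0f} m")
```

Each individual pulse is dominated by noise, but the signal photons always pile into the same time bin while noise spreads uniformly, so the peak of the histogram recovers the range, which is the statistical trick that makes kilometre-scale single-photon ranging workable.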