When confronted with unconventional scenarios, software and humans react in different ways, which can lead to unpredictable vehicle behavior. In the example above, software relying on conventional camera data would misidentify the bike pictured on the back of the car as a real rider.
Researchers at Cognata recognized this blind spot and developed software for testing autonomous driving algorithms. Automakers will be able to keep running such "edge scenarios" through the software to probe how their systems handle them, until those scenarios are handled correctly before the cars reach the road.
Many self-driving cars employ several types of sensors to identify objects that are hard to distinguish. "Lidar cannot detect glass, radar detects metal, and the camera can misjudge images ... the sensors need to compensate for one another," explained Danny Atsmon, CEO of Cognata. Whether in simulation or on real roads, the software can handle more complex scenarios as it gradually learns which data source to trust under such edge conditions.
A driver was killed when Tesla's Autopilot failed to identify a trailer against a bright sky. Tesla uses only radar, cameras, and ultrasonic sensors to feed its autonomous driving system, and the accident brought that design under scrutiny. Some critics argue that lidar is indispensable among autonomous-driving sensors, because in low-light and glare conditions lidar can provide more detailed data than radar or ultrasonic sensors. But Atsmon also pointed out that even lidar is not a complete answer: it cannot, for example, read a traffic light.
For self-driving car manufacturers, the safest approach is to install an array of sensors and build a redundant detection system.
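The redundancy idea above can be sketched in a few lines: if each sensor type reports independently, a fused decision can override a single sensor's mistake (such as the camera misreading a bike decal as a rider). This is a minimal illustrative sketch, not any real AV stack's API; the names, the confidence-weighted vote, and the 0.5 threshold are all assumptions made for the example.

```python
# Hypothetical sketch of a redundant detection system: fuse boolean
# obstacle reports from camera, radar, and lidar with a confidence-
# weighted majority vote. All names and numbers are illustrative.
from dataclasses import dataclass

@dataclass
class SensorReport:
    sensor: str            # "camera", "radar", or "lidar"
    obstacle_detected: bool
    confidence: float      # sensor's self-reported confidence, 0.0-1.0

def fuse_reports(reports, threshold=0.5):
    """Declare an obstacle only when the confidence-weighted fraction
    of positive reports exceeds the threshold."""
    total = sum(r.confidence for r in reports)
    if total == 0:
        return False
    positive = sum(r.confidence for r in reports if r.obstacle_detected)
    return positive / total > threshold

# The camera produces a false positive (e.g. a bike image on a car),
# but radar and lidar disagree, so the fused decision overrides it.
reports = [
    SensorReport("camera", True, 0.6),
    SensorReport("radar", False, 0.9),
    SensorReport("lidar", False, 0.9),
]
print(fuse_reports(reports))  # -> False: redundancy overrides the camera
```

The same logic also works in the other direction: a single sensor that misses an obstacle (as in the trailer accident) can be outvoted by the others, which is the practical argument for carrying overlapping sensor types.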