An interesting read on the state of AI systems and self-driving cars.
Quote:
Recently a driverless shuttle was involved in a crash because it could not understand the intent of a human driver of a tractor trailer ahead of it who very slowly backed up for more maneuvering room, expecting the shuttle to also back up (NTSB 2019). The tractor trailer driver did not know the shuttle had no driver (nor did it have the ability to operate in reverse).
The driver had an expectation built over years of experience that the other vehicle would give way and be able to reverse, but the shuttle had no rule set to reference. This scenario seems simple for human drivers who understand the need to negotiate to resolve uncertainty, but such abstract principles and the development of alternative action plans, even simple ones, are outside the realm of ML-enabled systems, at least for the foreseeable future.
Such ambiguous situations happen regularly in the driving domain and often with much more dramatic and deadly consequences. There have been several incidents where Tesla drivers have been killed while driving on Autopilot, an automated driving assist feature, which failed to see objects directly in cars’ paths, and a pedestrian has been killed by an Uber self-driving car while undergoing human-supervised testing (Crowe 2016, Griggs and Wakabayashi 2018, Lohr 2016). In all these cases, the skill-based reasoning automated systems that relied on bottom-up processing failed, and deaths occurred because the inattentive drivers did not realize these cars still needed their top-down reasoning and judgment.
https://hal.pratt.duke.edu/sites/hal...compressed.pdf
FSD 9 has been released; this version has radar removed and relies entirely on vision. You can see in the above video at ~16:10, first where it looks confused, and second where it doesn't see the concrete pylons at all and almost hits them. Radar would have detected these and avoided them, but they appear invisible to the cameras.
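To make the redundancy point concrete, here's a minimal toy sketch (my own illustration, not Tesla's actual perception stack; real systems fuse probabilistic tracks, not booleans) of why dropping a second sensor modality matters. With OR-fusion, a missed obstacle requires both sensors to fail on the same object; vision-only turns every camera miss into an unhandled obstacle.

```python
# Toy sketch of sensor redundancy (illustrative only, not any real FSD code).

def should_brake(camera_sees: bool, radar_sees: bool) -> bool:
    """OR-fusion: brake if EITHER sensor reports an obstacle in the path.
    A miss now requires BOTH sensors to fail on the same object."""
    return camera_sees or radar_sees

def should_brake_vision_only(camera_sees: bool) -> bool:
    """Vision-only: a single camera miss is an unhandled obstacle."""
    return camera_sees

# The pylon scenario as a toy case: low-contrast concrete at night is a
# plausible camera miss, while (assuming the poster is right about radar)
# it would still produce a radar return.
camera_sees, radar_sees = False, True
print(should_brake(camera_sees, radar_sees))   # True  -> brakes
print(should_brake_vision_only(camera_sees))   # False -> the near-miss in the video
```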
It's also hilarious that these clowns are out testing it, saying something like "this is the perfect situation to test, late at night with drunk people all over not following the rules." No, that's how you run someone over. It's kind of amazing that any Joe Blow can beta test a clearly unfinished safety product late at night in unpredictable situations, with random bystanders as the test subjects, who get no say in it themselves.