Autonomous cars, we are often told, have a great safety record: they do not make mistakes. That frequently heard assertion is hardly comforting after watching the video below.
Volvo explained the accident by stating that the autonomous system involved was not designed to avoid pedestrians (pedestrian detection is a more expensive option than the one installed in the car in the video). In other words, the fault lies with the people, not the car.
Volvo’s explanation that the people did not understand the nature of the automatic avoidance system seems plausible, but it raises a concern. Autonomous cars (which we have argued are really Unmanned Ground Vehicles) will not arrive all at once. Self-driving subsystems will be adopted piecemeal over the years. Will future passengers and drivers understand which capabilities of their cars are and are not autonomous? Should we expect more videos like this? What responsibility does the car company bear for educating its customers?
This incident reminds me of airplane calamities. Have you ever noticed how often airplane disasters are blamed on “pilot error”? I have long suspected that airlines fault pilots for big crashes because acknowledging mechanical failure (which could be caused by inadequate maintenance or overuse) might leave them vulnerable to greater legal liability.
As I wrote above, Volvo’s explanation is believable, but its resemblance to the overuse of “pilot error” is unsettling.