Yesterday the world changed. Elaine Herzberg became the first person killed by an autonomous vehicle travelling in autonomous mode. We all knew this day might arrive, but hoped it wouldn't.
Understandably, there are several investigations underway to determine how this could occur and, ultimately, who's at fault. Here's what we know so far.
The death occurred in Tempe, Arizona, when the woman was struck by a Volvo SUV from Uber's self-driving fleet. At the time of the incident the car was in autonomous mode, with the required human operator behind the wheel and no passengers present. The woman was crossing the road away from a marked crossing, with her bike and shopping bags, when she was struck by the car. She later died of her injuries.
The woman reportedly walked out from a center median abruptly into a lane of traffic.
Tempe police chief Sylvia Moir said:
“The driver said it was like a flash, the person walked out in front of them.
His first alert to the collision was the sound of the collision.”
Traveling at 38 mph in a 35 mph zone on Sunday night, the Uber self-driving car made no attempt to brake, according to the Police Department’s preliminary investigation.
Since then the police department has backed away from those claims, moving to a more neutral position: it will wait for the results of the full investigation.
The incident raises so many questions.
It is tempting to imagine that all self-driving vehicles work in the same way, but in reality that's not true. Each company working on this technology goes about the challenge in different ways, using different hardware sensors and different software, written by different people. Each company makes a judgement, based on its confidence in that combination of technology and talent, about how well its vehicles can navigate the world.
So now for the questions. Why didn't Uber's sensors detect the woman crossing ahead and apply the brakes? If the technology failed, why didn't the human behind the wheel intervene? Did the woman play a part in causing the accident? How do you enable a car to drive past people standing at the side of the road without imagining they could step out at any moment and braking constantly?
There are so many questions.
In the future, self-driving or autonomous cars won’t require a human driver at all, so the technology has to be able to understand the environment around it and respond accordingly. People will always do weird things and the cars need to accommodate that. Humans will cross the road in places where there aren’t marked crossings and be carrying or pushing objects that may look like something else.
For now, Uber has halted the operation of its entire self-driving fleet, which was operating in Phoenix, Pittsburgh, San Francisco and Toronto. You do that because the same software is likely running on every vehicle, and if it happened once, there's a chance it could happen again.
What we shouldn't do is direct anger about this incident at all self-driving cars, which, as I mentioned, are all built differently despite the common goal. What should happen is that Uber releases as much information as possible, as fast as possible, so the industry can understand the cause, or likely causes, of the incident.
I love the idea of driverless cars and the opportunities they will afford society going forward. There can't be cowboys, though; this needs to be done safely, and the benchmark for the software is much, much higher than the one we set for human drivers.
This is one story where the facts are incredibly important, and only time and the results of a full investigation by Uber and the authorities will determine where fault ultimately lies.