First Self-Driving Car Fatality
GordonMcComb
Posts: 3,366
It took Uber to screw it up for everyone. One of their test self-driving cars, which had a human safety driver riding in it but not driving at the time, killed a pedestrian earlier today.
http://money.cnn.com/2018/03/19/technology/uber-autonomous-car-fatal-crash/index.html
I have always said it takes just one bad incident to stop or greatly curtail new technology such as self-driving vehicles. The public's perception of technology is always tenuous. The "Buckminster Fuller effect" is when an emerging technology suffers a public relations fiasco and irreparably harms the public trust. In Fuller's case it was the Dymaxion car, which was involved in a fatal accident largely unrelated to its technology. The design of the car, and many elements of it, were just abandoned because of the bad publicity.
We all knew self-driving wasn't perfect, but it seems to me hubris played a large role in this. What's the point of having a human safety driver during the tests if the human isn't going to intercede? IMO, Uber needs to go back to square one, take all of their cars off the road for the foreseeable future, and let other developers of autonomous vehicles proceed with the caution that has so far kept this sort of accident from occurring.
Comments
Uber has previously grounded its vehicles while investigating a crash. In 2017, Uber briefly pulled its vehicles from roads after an Uber self-driving vehicle in Tempe landed on its side.
While I believe in the future of self-driven cars, MUCH more testing needs to be done, and their first rollout must be in controlled environments -- self-driven trams, buses on special bus lanes, inter-campus transportation, and so on. Even golf carts can be an effective test bed for autonomous vehicles, and far safer for the public.
I'll second this. It's the classic "walk before run" risk mitigation.
What they're saying now is that the woman was crossing the road walking her bicycle some 60 yards from an intersection -- IOW jaywalking. Human drivers know to watch for this sort of thing and might have slowed down. But (armchair engineering here), did the Uber vehicle have an effective AI for that scenario? Police are reporting that evidence collected so far doesn't show "significant signs" that it slowed down.
Situations like someone suddenly crossing into traffic happen every day. Often they are hit, but aware drivers may swerve and brake to avoid a collision. You have to wonder what the Uber car was programmed for. I guess we'll find that out over time.
That seems typical in a lot of pedestrian incidents. Maybe we need self walking legs next.
Here's the thing: there's an average of slightly more than 1 fatality per 100 million driven miles. So far driverless cars have racked up maybe 10 million actual road miles, which is just a tenth of what's needed to even begin assessing the chances of fatal accidents.
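A back-of-envelope check of those numbers. The roughly 1.18 fatalities per 100 million vehicle miles figure is the commonly cited US human-driver rate; the 10 million driverless miles is my own rough estimate:

```python
# Rough comparison of the human-driver fatality rate against estimated
# autonomous road mileage. Both inputs are illustrative estimates, not
# official tallies.
human_rate_per_mile = 1.18 / 100_000_000   # fatalities per mile, human drivers
driverless_miles = 10_000_000              # estimated autonomous miles so far

# Expected fatalities if driverless cars merely matched the human rate
expected = human_rate_per_mile * driverless_miles
print(f"Expected fatalities at the human-driver rate: {expected:.3f}")

# Miles driven, on average, per fatality at the human-driver rate
miles_per_fatality = 1 / human_rate_per_mile
print(f"Average miles per fatality (human drivers): {miles_per_fatality:,.0f}")
```

In other words, at human-driver rates you'd expect only about 0.1 fatalities over the entire autonomous fleet's mileage to date, which is why a single death this early says so little statistically and so much publicly.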
All it takes is ONE fatality in the early years to set everything back by years, maybe even decades. This is why I think it's wholly inappropriate to rush the technology onto open roads -- it's not what the developers think (remember Google still thinks it's okay to put a camera on your eyeglasses to record everything you see), it's the trust the public puts into it. It's all about the public buying into the concept, not the developers selling it.
-Phil
Not necessarily. Liability isn't a 0% or 100% thing. She may be at more than 51% fault, but that doesn't mean Uber won't be sued in civil court. Because of the negative publicity the most likely outcome is a confidential settlement. No trial attorney will let Uber slide on the fact that it only has about 2 million road miles under its belt. There's a strong argument that not enough testing was performed before the car was placed in public use. This is not a scenario Uber or anyone in the industry wants.
-Phil
LOL. After years of downtown driving dealing with headphone zombies, texting twits, and crazed cyclists, I am warming to the idea. Perhaps a more appropriate response would be to enforce existing laws and enhance them as needed. A boost to employment and government income along with increased safety. What more could we ask for?
Or, double down on autonomous vehicles. If there are more autonomous vehicles on the road, things will resolve themselves one way or the other! :-D
-Phil
Then again, there are the Three Laws of Robotics to contend with.
Ah, but that only works if the robot knows its action (or inaction) can affect the outcome. As mentioned earlier, even robots can't circumvent the laws of physics. So, now, if a robot must take into account the possibility of violating one of the laws in the future for an action taken in the past, then the robot will never take any action at all. Unless "taking no action" is itself considered an action, in which case, the robot will go insane. Not too surprisingly, Asimov delved into this scenario as well (the short story "Liar!" comes to mind).
From The Hitchhiker's Guide to the Galaxy, chapter 25: "Deep Thought paused for a moment's reflection. 'Tricky,' he said finally."
How many pedestrians just walk out without looking? I have seen tons, mostly the younger ones. They make sure they don't look at you when you slam on the brakes. Many have earplugs and phones out. And who do they blame if they get hit?
BTW, a number of cars have had auto braking for a while now.
A number of manufacturers have announced that all their current models are fully upgradable to self-driving. Lane departure warning, blind spot warning, auto braking, etc. are now often standard in reasonably priced models. I happen to know Volvo is very well advanced, and VW has had lots of extras for some time too. The Mazda CX-9 has all of these too.
I think some of these "entitled" oblivious kids are hoping to get hit by a car and get a cash settlement.
Sure beats working.
"you mean to say people and cars moved within feet of each other and there were no barriers!!!- unbelieveable!- sure were primitive back then".
Dave
None of these vehicles should be classified as self-driving. There should be a name for what we have now.
SSD (Semi Self Driving)?
A person does not 'jump out' in front of the car, they are walking across multiple lanes of traffic.
Dashcam video: It is nighttime, and the car is probably doing 45 mph on a very, very well lit road. Moron is walking a bike across multiple lanes of traffic. Despite it being a major road with cars roaring at them, headlights blazing, he is looking in the opposite direction.
Because it is a well lit street and not a dark 70 mph freeway, any one of us would have easily seen the person and slowed down or stopped. Perhaps this is why the pedestrian felt entitled to be acting like this in the first place.
Maybe the video SORTA looks 'dark' because of the low res video you saw.
-Phil