Fatal Tesla Autopilot Crash
I only heard about this today from this article fragment: https://www.roboticsbusinessreview.com/tesla-crash-aftermath-will-data-show/
Should be interesting to see what comes of it.
Comments
The early and well-publicized Dymaxion accidents, some of which were not caused by the Dymaxion itself, forever tainted the concept of the car, blunted research into similar designs, and, some have said, even soured the public on Fuller's other designs.
The world -- and legal landscape -- is not ready for production versions of self-driving cars. Too much has to be ironed out. For example, if a self-driving car kills a pedestrian, who pays? There is no legal precedent as guidance, and there's even the possibility the driver's insurance would not pay claims based on a potential fault from the vehicle itself. Talk about a legal mess.
For the foreseeable future, the technology should be limited to adapting the research for use in added safety features, and for limited-exposure applications, such as self-driving carts and vehicles not driven on public roads (except for test vehicles, like Google's).
All you have to do is join RBR for the rest of the article. A conspicuous bargain at just $795!
https://www.roboticsbusinessreview.com/join/
Maybe @Heater can get his boss to buy a subscription and then share the article with us here?
That said, it's probably not a good idea to install a driver support system that makes every road feel like a never-ending straight road, with nothing for the driver to do except sit there with hands on the wheel. Nothing is as dangerous as a boring road.
Anyway, you just can't expect the general public to do anything right. They may very well use this to demonize autonomous cars 5 years from now.
We just have to hold our breath and hope it doesn't become a partisan issue eventually.
Some excerpts:
Embedded in that question is whether self-driving cars would have the controls needed to let a passenger become a driver. Seems like we should never eliminate the option of hitting the brakes, kind of like we can do to override cruise control.
Some experts say otherwise. Patrick Lin, who looks at such things at Cal Poly, told IEEE (“the world’s largest technical professional organization dedicated to advancing technology for the benefit of humanity”), “Humans aren’t hardwired to sit and monitor a system for long periods of time and then quickly react properly when an emergency happens.”
And Google’s Chris Urmson, director of that firm’s self-driving car program, told the U.S. Senate in March that Google is working toward taking humans out of the driving process, because they “can’t always be trusted to dip in and out of the task of driving when the car is encouraging them to sit back and relax.”
In addition to legal questions, there are ethical ones, pointedly pointed out in a Science magazine article: “Autonomous vehicles should reduce traffic accidents, but they will sometimes have to choose between two evils, such as running over pedestrians or sacrificing themselves and their passenger to save the pedestrians.”
And my favorite:
Here’s another potential problem, raised by Ben Wear, our esteemed transportation writer. Two self-driving cars arrive simultaneously at the supermarket. There’s only one parking spot. Hmm. Would the vehicles be programmed to make obscene gestures toward each other?
We have insurance so that when we're in an accident, even when it's not our fault, all or most of the financial burden and lawsuits fall to the insurance companies. A car-owner's insurance may reasonably reject a claim by another party if they believe the fault is a design flaw in the car. That leaves the car owner to face all of the burden, possibly having to shell out medical, property, and legal fees, and in turn, being forced to sue the car maker in response, where there is no guarantee of success. Resolution can take years.
There are no specific laws that govern liability for driverless cars, nor legal precedent to guide the courts. It can take years to develop a body of law that protects consumers; during this time, owners of driverless vehicles ride at their own risk. All the while, you can bet that the carmakers will reject attempts to make them the responsible party in the event of a crash. Insurance is a game of spreading out the risk; here, it would all be concentrated on one party. From a financial standpoint, that will not work.
http://nbcnews.to/2ifE3Mh
EDIT: This should be an aftermarket addition to any vehicle. If the tech works, I would be happy with just a warning light. I'm not interested in the vehicle doing all the work.