Fatal Tesla Autopilot Crash — Parallax Forums


erco Posts: 20,255
I only heard about this today from this article fragment: https://www.roboticsbusinessreview.com/tesla-crash-aftermath-will-data-show/

Should be interesting to see what comes of it.

Comments

  • Can't read the whole article :/
  • I fear "the Buckminster Fuller effect." It's when a good idea is demonstrated too early for the technology to keep up. This was the case with Fuller's Dymaxion, a very interesting concept that the technology of the day couldn't cope with. Today, computers could compensate for the interesting engineering challenges. There was nothing like that in the 30s.

    The early and well publicized Dymaxion accidents, some of which were not caused by the Dymaxion itself, forever tainted the concepts of the car, blunted research into similar designs, and some have said even soured the public regarding Fuller's other designs.

    The world -- and legal landscape -- is not ready for production versions of self-driving cars. Too much has to be ironed out. For example, if a self-driving car kills a pedestrian, who pays? There is no legal precedent to serve as guidance, and there's even the possibility that the driver's insurance would not pay claims, on the basis that the fault lies with the vehicle itself. Talk about a legal mess.

    For the foreseeable future, the technology should be limited to adapting the research for use in added safety features, and for limited-exposure applications, such as self-driving carts and vehicles not driven on public roads (except for test vehicles, like Google's).

  • erco Posts: 20,255
    DavidZemon wrote: »
    Can't read the whole article :/

    All you have to do is join RBR for the rest of the article. A conspicuous bargain at just $795! :)

    https://www.roboticsbusinessreview.com/join/

  • erco wrote: »
    DavidZemon wrote: »
    Can't read the whole article :/

    All you have to do is join RBR for the rest of the article. A conspicuous bargain at just $795! :)

    https://www.roboticsbusinessreview.com/join/

    Maybe @Heater can get his boss to buy a subscription and then share the article with us here?
  • The public just isn't smart enough to realize this is only a partial autopilot, and that the accident was caused by the semi, which was driven by a person.
  • Tor Posts: 2,010
    And according to what's written about the particular driver, he had in the past posted videos of himself using the car as if it were an autonomous car, letting it drive itself. It has even been questioned whether he was watching a DVD at the time. That autopilot system still expects the driver to keep both hands on the steering wheel, according to Tesla.
    That said, it's probably not a good idea to install a driver-support system that makes every road feel like a never-ending straight road, with nothing for the driver to do except sit there with hands on the wheel. Nothing is as dangerous as a boring road.
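
    Tor's point about hands-on-wheel enforcement can be pictured as a simple escalation timer. A hypothetical sketch only -- the stages and thresholds below are invented for illustration, not Tesla's actual parameters:

```python
# Hypothetical hands-off-wheel escalation timer (thresholds invented).
WARN_AFTER_S = 15       # visual warning after this many hands-off seconds
ALERT_AFTER_S = 30      # audible alert
DISENGAGE_AFTER_S = 60  # system disengages and slows the car

def monitor_state(hands_off_seconds: float) -> str:
    """Return the escalation stage for a continuous hands-off duration."""
    if hands_off_seconds < WARN_AFTER_S:
        return "ok"
    if hands_off_seconds < ALERT_AFTER_S:
        return "visual_warning"
    if hands_off_seconds < DISENGAGE_AFTER_S:
        return "audible_alert"
    return "disengage"
```

    The irony Tor describes still applies: a timer like this keeps hands on the wheel, but it can't keep a bored driver's attention on the road.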
  • Yeah, it's pretty obvious people might be nervous at first, but eventually they just stop paying attention once they see how well the autopilot works.

    Anyway you just can't expect the general public to do anything right. They may very well use this to demonize autonomous cars 5 years from now.

    We just have to hold our breath and hope it doesn't become a partisan issue eventually.
  • There was an interesting article in today's Austin paper discussing this, and the 'driverless' car. The article is at Driverless cars and the end of the driver’s license?

    Some excerpts:
    Embedded in that question is whether self-driving cars would have controls needed to let a passenger become a driver. Seems like we should never eliminate the option of hitting the brakes, much as we can to override cruise control.
    Some experts say otherwise. Patrick Lin, who looks at such things at Cal Poly, told IEEE (“the world’s largest technical professional organization dedicated to advancing technology for the benefit of humanity”), “Humans aren’t hardwired to sit and monitor a system for long periods of time and then quickly react properly when an emergency happens.”
    And Google’s Chris Urmson, director of that firm’s self-driving car program, told the U.S. Senate in March that Google is working toward taking humans out of the driving process, because they “can’t always be trusted to dip in and out of the task of driving when the car is encouraging them to sit back and relax.”

    In addition to legal questions, there are ethical ones, pointedly pointed out in a Science magazine article: “Autonomous vehicles should reduce traffic accidents, but they will sometimes have to choose between two evils, such as running over pedestrians or sacrificing themselves and their passenger to save the pedestrians.”


    And my favorite:
    Here’s another potential problem, raised by Ben Wear, our esteemed transportation writer. Two self-driving cars arrive simultaneously at the supermarket. There’s only one parking spot. Hmm. Would the vehicles be programmed to make obscene gestures toward each other?
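
    The "two evils" choice quoted from Science is often framed as picking the action with the lowest expected harm. A toy sketch, with invented options and harm scores -- no real vehicle planner works from a table like this:

```python
# Toy expected-harm comparison, purely illustrative.
# The options and harm scores below are invented for the example.

def choose_action(options: dict) -> str:
    """Pick the option with the lowest expected-harm score."""
    return min(options, key=options.get)

scenario = {
    "brake_in_lane": 0.7,  # may not stop in time
    "swerve_left": 0.9,    # oncoming traffic
    "swerve_right": 0.4,   # empty shoulder
}
```

    The hard part, of course, isn't the min() -- it's deciding who gets to assign those numbers, which is exactly the ethical question the article raises.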
  • Or what about the pedestrian problem? Here there's a law that gives them the right of way over turning cars. They ignore it.
  • On pedestrians, or anyone hit by a driverless car: the real issue is the insurance safety net.

    We have insurance so that when we're in an accident, even when it's not our fault, all or most of the financial burden and lawsuits fall to the insurance companies. A car-owner's insurance may reasonably reject a claim by another party if they believe the fault is a design flaw in the car. That leaves the car owner to face all of the burden, possibly having to shell out medical, property, and legal fees, and in turn, being forced to sue the car maker in response, where there is no guarantee of success. Resolution can take years.

    There are no specific laws that govern the liability share of driverless cars, nor legal precedent to guide courts. It can take years to develop a body of laws that protect consumers; during this time, owners of driverless vehicles ride at their own risk. All the while, you can bet that the carmakers will reject attempts to make them the responsible party in the event of a crash. Insurance is a game of spreading out the risk; here, it would all be concentrated. From a financial standpoint, that will not work.
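
    The risk-spreading point can be shown with a little arithmetic. All numbers below are invented for illustration:

```python
# Toy illustration of risk pooling (numbers invented).
# Spread across many drivers, each pays a small premium; concentrated
# on one manufacturer, the same expected loss lands on a single party.

drivers = 1_000_000
crash_probability = 0.001   # per driver per year
average_claim = 50_000.0    # dollars

expected_total_loss = drivers * crash_probability * average_claim
premium_per_driver = expected_total_loss / drivers  # risk spread out

# If liability shifts entirely to the carmaker, that whole expected loss
# sits on one balance sheet instead of a million small premiums.
```

    A million drivers each paying a modest premium is a viable business; one carmaker absorbing the same total exposure for every crash is not, which is why they would fight that liability model.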
  • In this day, and with roadways as they are, letting a machine do the driving is nature's way of thinning the shallow end of the gene pool.
  • MikeDYur Posts: 2,176
    edited 2016-12-29 02:43
    Possibly a fatal crash avoided; too bad for the vehicles in front.


    http://nbcnews.to/2ifE3Mh


    EDIT: This should be an aftermarket addition to any vehicle. If the tech works, I would be happy with just a warning light. I'm not interested in the vehicle doing all the work.