Report: Tesla is bleeding talent from its Autopilot division

Discussion in 'Motorized Vehicles' started by hmscott, Aug 25, 2017.

  1. killkenny1

    killkenny1 Too weird to live, too rare to die.

    So wait, the dude knew there were problems with Autopilot, but still continued to use it?
    Okay...
     
  2. hmscott

    hmscott Notebook Nobel Laureate

    Elon Musk says Tesla's autopilot system will "never be perfect"
    Published on Apr 13, 2018
    Federal investigators forced Tesla out of a probe of a deadly crash in California last month involving the electric carmaker's autopilot system. The driver died after crashing into a highway barrier. "CBS This Morning" co-host Gayle King experiences the autopilot system in the Tesla Model 3 and asks CEO Elon Musk about the safety concerns surrounding the technology.
     
  3. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    It seems like there's a bit of a double standard in public perception of Autopilot: we seem to accept an average of more than 100 people killed daily in human-caused accidents as normal, but if a machine ever fails at all, it's headline news. That makes automated vehicles seem far more dangerous than they probably are.
     
  4. hmscott

    hmscott Notebook Nobel Laureate

    It's a bit more nuanced than you make it appear.

    I am not letting a software system I know can't handle the exceptions a human can take over driving a high-speed vehicle, among other cars and pedestrians, unless I know for a fact that all the exceptions have been covered, and the risks of death that remain are known and clearly stated.

    Right now the software is so immature it can carry an unreported fatal bug for months, maybe years, consistently steering drivers toward a crash into a barrier (the same barrier in the same location). Since the human, for the most part apparently, pulls the wheel back in time to keep the software from killing them, and doesn't report the bug, the software can kill someone without fail if left unattended in Autopilot.

    AFAIK that bug is still in the Tesla Autopilot and another person could get killed at or near the same spot, and similar spots across the country.

    There is no way that software is going to be ready to earn the name Autopilot for many, many years.

    Until then, humans should drive and stay in control of their vehicles, and stop killing people with this silly idea that the car is going to drive them while they push their noses back into their phone screens and go back to ignoring reality.

    It's really a sad joke on us all, and I wish someone in power would wise up, disable these things completely, and take them off the road.
     
  5. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Common sense rhetoric may keep me from driving with my nose in a phone, tired, distracted, drunk, or otherwise impaired, but it does nothing to stop others from hitting me when they don't take that advice. Maybe past experience with people who refuse to listen to reason about such things has colored my opinion some, but if someone impaired does manage to get behind the wheel, and I can trust their car even a little more than I can trust them not to hit me, then I'll take my chances with their car driving over hoping they take responsibility and don't drive, because we both know that is not going to happen.

    I'm all for more work on the technology before more widespread adoption, but there's a general undertone to the whole thing (public opinion, reporting, etc.) suggesting that the actual problems people have with it are something besides the risk. It's hard to articulate, and it's not limited to car technology, but the way it's approached still bothers me.
     
  6. hmscott

    hmscott Notebook Nobel Laureate

    For those few situations, call a cab; you don't need to put your life in the hands of poorly conceived software.

    As it stands now, that impaired person would have died the same way as the guy with his hands off the wheel, because he couldn't react in time to pull away from the barrier.

    After some thought, you'll come to the same conclusion, perhaps while you update Windows 10. ;)

    Software is fallible because people are fallible. If the software relies on a human constantly keeping their hands on the wheel, and the human needs to pay the same level of attention to "assisting the AI" as they would to driving themselves, just to be able to take control from the software in time, then what the hell good is the software? Turn it off and drive yourself. :)

    It's a boondoggle to get money and sell magic cars.

    No more, and probably a lot less.

    And that "magic" is what fools people into thinking Autopilot does what the name implies, and it doesn't.

    Recall the software, rethink the idea, and make nice electric cars.
     
  7. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    I didn't say it exactly in my previous post, but to apply my first sentence to yours: I already would call a cab. Telling me to do something I already do doesn't help, and telling others has no effect on whether they actually do it, since cabs have existed as long as cars have and impaired driving is still a problem. Clearly the reasons people drive impaired are unrelated to whether it's a good idea, and unrelated to whether there are other options, or after some thought they would come to the same conclusion and not do it. I can operate my own vehicle and plan to continue to do so, because I don't need a car to drive for me, but I will keep taking an outcome-oriented view of the tech: either automated cars have a lower fatality rate per mile or they don't. While that is being determined, making each death as high-profile as possible helps no one.
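    The "fatality rate per mile" comparison above can be sketched numerically. The figures below are illustrative placeholders: the ~37,000 deaths over ~3.2 trillion vehicle-miles is in the ballpark of published US annual totals, and the automated-fleet numbers are made up for the example.

```python
# Sketch of the per-mile fatality comparison from the post above.
# All figures are illustrative assumptions, not real crash data.

def deaths_per_100m_miles(deaths: float, miles: float) -> float:
    """Normalize a death count to deaths per 100 million miles driven."""
    return deaths / miles * 100_000_000

# Roughly NHTSA-scale human-driving totals (placeholder values):
# ~37,000 US road deaths over ~3.2 trillion vehicle-miles per year.
human_rate = deaths_per_100m_miles(37_000, 3_200_000_000_000)

# A hypothetical automated fleet: 3 deaths over 300 million miles.
auto_rate = deaths_per_100m_miles(3, 300_000_000)

print(f"human: {human_rate:.2f} deaths / 100M miles")  # → 1.16
print(f"auto:  {auto_rate:.2f} deaths / 100M miles")   # → 1.00
print("automated fleet lower per mile:", auto_rate < human_rate)
```

    The point of normalizing per mile, rather than counting headlines, is that the two fleets drive vastly different distances; raw incident counts alone say nothing about relative risk.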
     
  8. hmscott

    hmscott Notebook Nobel Laureate

    Sure it does. What do you want to do, hide the deaths? Not report them? Not make people aware of the software's "blind" spots, so they repeat the same mistakes and die too?

    Of course you have to make it known, and make clear how it happened (debugged to the root cause and contributing factors); how else will we all learn to be safe driving these cars?

    Well, I know how to be safe driving them. Don't turn on the automation.

    People need to wake up from the idea that this software will drive them safely from place to place without their attention being as focused on keeping the drive safe as if they were driving themselves.

    That's the only way to be safe: keep your hands on the wheel and be ready to react against the direction the automation takes, along with all the other surprises that come up. Don't wait for the automation to "fix" the problem situation you see coming; take action yourself immediately, ahead of the automation, and save yourself.
     
  9. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    When I say don't take it to one extreme I am not saying to take it to another extreme. Find some other way than making each incident a spectacle.

    Again, driving responsibly is something I already do, and while I'm saving myself from myself, it does nothing to save me from others, or them from themselves or each other. Human error and impairment are problems we have had since the first fatal car accident to think about, and we still have not managed to solve them by telling people to drive safer. I'm not saying to stop doing that and just throw all responsibility for safety onto automation, but something more is required.
     
  10. hmscott

    hmscott Notebook Nobel Laureate

    You are still missing the point that no matter how good it gets, it's still going to fail, and it's still not going to be perfect.

    So the human driver still needs to keep hands on the wheel and pay strict attention: not reading a book or playing a game on their phone, but eyes on the road, ready to take physical control at any time.

    So what's the point? If I can't rely on it 100%, and I have to react not only to road-hazard exceptions but also to "imaginary exceptions created by the failed software", for a total of 2x+ problems, why put the software between me and my safety?

    Anyway, let's see how it progresses. Hopefully someone will point to the 1% of failures instead of the 99% of "oh wow, it's driving itself", so none of us has to experience "oh my god, I'm gonna...", again.
     
    Last edited: Apr 13, 2018