Tesla knew Autopilot caused death, but didn’t fix it: Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • dub
    17 points · 2 years ago

    A times B times C equals X… I am Jack’s something something something

    • @tool@lemmy.world
      17 points · 2 years ago

      A times B times C equals X… I am Jack’s something something something

      Narrator: A new car built by my company leaves somewhere traveling at 60 mph. The rear differential locks up. The car crashes and burns with everyone trapped inside. Now, should we initiate a recall? Take the number of vehicles in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X. If X is less than the cost of a recall, we don’t do one.

      Woman on Plane: Are there a lot of these kinds of accidents?

      Narrator: You wouldn’t believe.

      Woman on Plane: Which car company do you work for?

      Narrator: A major one.
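      The narrator’s recall math is a plain expected-cost comparison. A minimal sketch in Python (all names and figures here are hypothetical, not from any real case):

      ```python
      # Fight Club recall formula: X = A * B * C, compared against the recall cost.
      def should_recall(vehicles_in_field: int, failure_rate: float,
                        avg_settlement: float, recall_cost: float) -> bool:
          """Recall only if the expected payout X = A*B*C exceeds the recall cost."""
          expected_payout = vehicles_in_field * failure_rate * avg_settlement
          return expected_payout >= recall_cost

      # Hypothetical numbers: 1M cars, 0.01% failure rate, $2M average settlement.
      # X = 1_000_000 * 0.0001 * 2_000_000 = $200M expected payout
      print(should_recall(1_000_000, 0.0001, 2_000_000, 300_000_000))  # False -> no recall
      ```

      The grim point of the quote is exactly this inequality: nothing in the formula prices in the deaths themselves, only the settlements.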

      • @droans@lemmy.world
        8 points · 2 years ago

        When you’re selling a million cars, it’s guaranteed that some of them will have a missed defect, no matter how good your QC is.

        That’s why you have agencies like the NHTSA. You need someone who can decide at what point the issue is a major defect that constitutes a recall.

  • harold
    12 points · edited 2 years ago

    but I’m saving the planet and making sure Elon gets a cut of my money

  • @Nogami@lemmy.world
    5 points · edited 2 years ago

    Calling it Autopilot was always a marketing decision. It’s a driver assistance feature, nothing more. When used “as intended”, it works great. I drove for 14 hours during a road trip using AP and arrived not dead tired and still alert. That’s awesome, and would never have happened in a conventional car.

    I have the “FSD” beta right now. It has potential, but I still always keep a hand on the wheel and am in control of my car.

    At the end of the day, if the car makes a poor choice because of the automation, I’m still responsible as the driver, and I don’t want an accident, injury, or death on my conscience.

  • @fne8w2ah@lemmy.world
    4 points · 2 years ago

    Yet Phoney Stark keeps on whinging about the risks of AI, but at the same time slags off humans who actually know their stuff, especially regarding safety.

  • golamas1999
    2 points · 2 years ago

    Full Self Driving is such a scam of a name. The feature is level 2 advanced cruise control.

  • @Lucidlethargy@sh.itjust.works
    1 point · 2 years ago

    Didn’t, or couldn’t? Tesla uses a vastly inferior technology to run their “automated” driving protocols. It’s a hardware problem first and foremost.

    It’s like trying to drive a car with a 720p resolution camera mounted on the license plate holder versus a 4k monitor on the top of the car. That’s not a perfect analogy, but it’s close enough for those not aware of how cheap these cars and their tech really are.

  • @gamer@lemm.ee
    1 point · 2 years ago

    I remember reading about the ethical question about the hypothetical self driving car that loses control and can choose to either turn left and kill a child, turn right and kill a crowd of old people, or do nothing and hit a wall, killing the driver. It’s a question that doesn’t have a right answer, but it must be answered by anybody implementing a self driving car.

    I non-sarcastically feel like Tesla would implement this system by trying to see which option kills the fewest paying Xitter subscribers.

  • @_stranger_@lemmy.world
    0 points · edited 2 years ago

    There’s like three comments in here talking about the technology; everyone else is arguing about names, as if people are magically absolved of personal responsibility when they believe advertising over common sense.

    • @chakan2@lemmy.world
      0 points · 2 years ago

      Because the tech has inarguably failed. It’s all about the lawyers and how long they can extend Tesla’s irresponsibility.

  • @chakan2@lemmy.world
    0 points · 2 years ago

    It’s time to give up the Tesla FSD dream. I loved the idea of it when it came out, and believed it would get better over time. FSD simply hasn’t. Worse, Musk has either fired or lost all the engineering talent Tesla had. FSD is only going to get worse from here, and it’s time to put a stop to it.

  • PatFusty
    -1 points · 2 years ago

    The driver was also not even paying attention to the road, so the blame should be on him, not the car. People need to learn that Tesla’s version of autopilot has a specific use case, and regular streets are not it.

  • @BruinBears@lemmy.world
    -14 points · 2 years ago

    I do agree the name and Tesla’s general advertising of driver assists are a bit misleading.

    But this is really on the driver for not paying attention.

    • @Doug7070@lemmy.world
      28 points · 2 years ago

      “A bit misleading” is, I think, a bit of a misleading way to describe their marketing. It’s literally called Autopilot, and their marketing material has very aggressively pitched it as a ‘full self driving’ feature since the beginning, even without mentioning Musk’s own constant and ridiculous hyperbole when advertising it. It’s software that should never have been tested outside of vehicles run by company employees under controlled conditions, but Tesla chose to push it to the public as a paid feature and significantly downplay the fact that it is a poorly tested, unreliable beta, specifically to profit from the data generated by its widespread use, not to mention the price they charge for it as if it were a normal, ready to use consumer feature. Everything about their deployment of the system has been reckless, careless, and actively disdainful of their customers’ safety.

      • @anlumo@feddit.de
        -10 points · 2 years ago

        Everybody who has a bit of an idea what an autopilot in a plane actually does is not misled. Do people really think that commercial airline pilots just hit the “autopilot” button in their cockpit after disengaging the boarding ramp and then lean back until the boarding ramp at the destination is attached?

        • @El_illuminacho@lemmy.world
          8 points · 2 years ago

          Why do you think companies need to print warnings like “Caution: Contents are hot” on paper coffee cups? People are stupid.

          • @anlumo@feddit.de
            -4 points · 2 years ago

            Those labels are there because people made a quick buck suing the companies when they messed up, not to protect the stupid customers.

            If the courts applied a reasonable level of common sense, those labels wouldn’t exist.