Tesla knew Autopilot caused death, but didn’t fix it: Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • @gamer@lemm.ee
    8 · 2 years ago

    I remember reading about the ethical dilemma of the hypothetical self-driving car that loses control and can choose to either turn left and kill a child, turn right and kill a crowd of old people, or do nothing and hit a wall, killing the driver. It’s a question with no right answer, but anyone implementing a self-driving car has to answer it.

    I non-sarcastically feel like Tesla would implement this system by trying to see which option kills the fewest paying Xitter subscribers.

    • Ocelot
      5 · 2 years ago

      Meanwhile, hundreds of people are killed in auto accidents every single day in the US. Even if a self-driving car were 1000x safer than a human driver, there would still be accidents as long as other humans are sharing the same road.

      • @Oderus@lemmy.world
        5 · 2 years ago

        When a human is found to be at fault, you can punish them.

        With automated driving, who is there to punish? The company? Great. They pay a small fine and keep making millions while your loved one is gone and you get no justice.

        • @CmdrShepard@lemmy.one
          4 · 2 years ago

          People generally aren’t punished for an accident unless they acted intentionally or negligently. The better and more prevalent these systems get, the fewer the families with lost loved ones. Are you really arguing that this is a bad thing because it isn’t absolutely perfect and you can’t take vengeance on it?

    • Liz
      7 · 2 years ago

      At the very least, they would prioritize the driver, because the driver is likely to buy another Tesla in the future if they do.

    • @CmdrShepard@lemmy.one
      3 · 2 years ago

      I think the whole premise is flawed, because the car would have had to suffer numerous failures before ever reaching a point where it needs to make this decision. This applies to humans too, since we have free will. A computer does not.