In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also striking a child-sized mannequin. The Tesla was also fooled by simulated rain and fog.

  • orcrist@lemm.ee · 22 hours ago

    They’re talking about avoiding legal liability, not about actually doing the right thing. And of course you can see how it would help them avoid legal liability. The lawyers will walk into court and honestly say that at the time of the accident the human driver was in control of the vehicle.

    And then that creates a discussion about how much time the human driver has to have in order to actually solve the problem, or gray areas about who exactly controls what and when, and it complicates the situation enough that maybe Tesla can pay less money for the deaths they are obviously responsible for.

    • jimbolauski@lemm.ee · 18 hours ago

      > They’re talking about avoiding legal liability, not about actually doing the right thing. And of course you can see how it would help them avoid legal liability. The lawyers will walk into court and honestly say that at the time of the accident the human driver was in control of the vehicle.

      The plaintiff’s lawyers would say the autopilot was engaged, made the decision to run into the wall, and turned off 0.1 seconds before impact. Liability is not going to disappear when there were 4.9 seconds of making dangerous decisions and peacing out in the last 0.1.

      • FuglyDuck@lemmy.world · 8 hours ago

        > The plaintiff’s lawyers would say the autopilot was engaged, made the decision to run into the wall, and turned off 0.1 seconds before impact. Liability is not going to disappear when there were 4.9 seconds of making dangerous decisions and peacing out in the last 0.1.

        These strategies aren’t about actually winning the argument; they’re about making it excessively expensive to have the argument in the first place. Every motion requires a response by the counterparty, which requires billable time from the counterparty’s lawyers, and delays the trial. It’s just another variation on “defend, depose, deny”.

      • michaelmrose@lemmy.world · 14 hours ago

        They can also claim with a straight face, in public, in ads, and so on, that Autopilot has a low crash rate, when that figure is artificially lowered, without it technically being a lie.

      • FauxLiving@lemmy.world · 11 hours ago

        Defense lawyers can make a lot of hay with details like that. Nothing that gets the lawsuit dismissed, but turning the question from “Tesla drove me into a wall” into “how much is each party responsible” can help reduce settlement amounts (these things rarely go to trial).