kibwen 21 hours ago

No, rather, if the manufacturer of the self-driving software doesn't take full legal liability for actions taken by the car, then it's not autonomous. This is the one and only criterion for a self-driving vehicle.

  • Sohcahtoa82 21 hours ago

    Sounds like we're in agreement then.

    Right now, Tesla skirts legal liability by saying that the driver needs to watch the road and be ready to take control, and it enforces this with measures like detecting your hands on the wheel and tracking your gaze. If a car driving on FSD crashes, Tesla will say it's the driver's fault for not monitoring the drive.

    • FireBeyond 18 hours ago

      Hell, they'll even hold a press conference touting (selective data from) telemetry to say "The vehicle had been warning him to pay attention prior to the accident!"

      And then four months later when the actual accident investigation comes out, you'll find out that yes, it had. Once. Eighteen minutes prior to the accident.

      And then, to add insult to injury, for a very long time Tesla would fight you for access to your own telemetry data if, for example, you needed it in a lawsuit against someone else over an accident.

kjkjadksj 17 hours ago

That is for the lawyers, not indicative of capability.

  • JumpCrisscross 15 hours ago

    I’ve taken naps in my Waymos. One can’t in a Tesla. That is a difference in capability.