rurp a day ago

The company selling the car is adamant that none of their cars are fully autonomous in any legal or regulatory context. Any accident caused by the car is 100% the fault of the driver. But the company markets their cars as fully autonomous. That's pretty much the definition of pretending to be autonomous.

nutjob2 a day ago

It's a Level 2 system; it can't be operated unattended. Your friends are risking their lives, as several people (now dead) have found out.

  • kjkjadksj 18 hours ago

    I think we are at the point where the data suggests they bear more risk when they drive the Tesla themselves. See the Bloomberg report on accidents per mile.

  • bananalychee a day ago

    Wikipedia lists two fatal crashes involving Tesla FSD and one involving Waymo.

    • codeka 17 hours ago

      > one involving Waymo

      Are you referring to the one where a Waymo, along with several other cars, was stopped at a traffic light when another car (incidentally, a Tesla) barreled into the stopped traffic at 90 MPH, killing several people?

      Because I am not aware of any other fatal accidents where a Waymo was even slightly involved. I think it's, at best, misleading to refer to that in the same sentence as FSD-involved fatalities where FSD was the direct cause.

    • jedberg a day ago

      The key difference is that the Teslas killed their passengers, while in the Waymo case the fatality was outside the car (and it wasn't the Waymo's fault; it was hit by another car).

      • Animats a day ago

        Yes. [1] That incident got considerable publicity in the San Francisco media, but not because of the Waymo. [2][3]

        Someone was driving a Tesla on I-280 into SF. They'd previously been involved in a hit-and-run accident on the freeway. They exited I-280 at the 6th St. off ramp, which is a long straightaway. They entered surface streets at 98 MPH in a 25 MPH zone, ran through a red light, and reached the next intersection, where traffic was stopped at a red light. The Tesla Model Y plowed into a lane of stopped cars, killing one person and one dog, injuring seven others, and demolishing at least six vehicles. One of the vehicles waiting was a Waymo, which had no one on board at the time.

        The driver of the Tesla claims their brakes failed. "Police on Monday booked Zheng on one count of felony vehicular manslaughter, reckless driving causing injury, felony vandalism and speeding."[2]

        [1] https://www.nbcbayarea.com/investigations/waymo-multi-car-wr...

        [2] https://www.sfchronicle.com/sf/article/crash-tesla-waymo-inj...

        [3] https://www.youtube.com/watch?v=ULalTHBQ3rI

      • bananalychee a day ago

        The question should be less about who was at fault and more about whether a human driver would have reacted better in that situation and avoided the fatality. I'm not sure why the fatality occurring inside or outside the car changes the calculus, but in any case only one of the two documented Tesla FSD-related fatalities killed the driver. Judging by the incident statistics for Tesla's Autopilot going back over half a decade, I'm pretty sure it's significantly safer than the average human driver and continues to improve; and the point of comparison in the original post was human driving, not Waymo. I have no doubt that Waymo, with its constrained operating areas and parameters, is safer in aggregate than Tesla's general-purpose FSD system.

      • bananalychee a day ago

        Only one of the two, and it's not nearly enough data to draw a conclusion one way or another in any case.

    • nostrademons a day ago

      Wikipedia lists at least 28 fatal crashes involving Tesla FSD:

      https://en.wikipedia.org/wiki/List_of_Tesla_Autopilot_crashe...

      • bananalychee a day ago

        FSD is not Autopilot, despite the two names being conflated these days. But even if you want to count all 28, raw counts of fatal incidents can't be compared without accounting for the difference in scale (see the sketch after this sub-thread). That's not to justify taking your eyes off the road when enabling FSD on a Tesla, but the OP didn't suggest doing that anyway.

        • Fricken a day ago

          If Waymo were operating at 1000 times the scale then I suppose their total fatalities would be somewhere in the ballpark of 0 x 1000.
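
        To make the scale point concrete, here is a minimal sketch of the per-mile normalization both sides are gesturing at. All crash counts and mileage figures below are hypothetical placeholders, not real Tesla or Waymo statistics.

            # Normalize fatal-crash counts by exposure (miles driven)
            # before comparing. All numbers are hypothetical placeholders,
            # not real fleet data.

            def fatal_rate_per_100m_miles(fatal_crashes: int, total_miles: float) -> float:
                """Fatal crashes per 100 million miles, a standard traffic-safety unit."""
                return fatal_crashes / total_miles * 1e8

            # A large fleet: 28 fatal crashes over a hypothetical 3 billion miles.
            big = fatal_rate_per_100m_miles(28, 3_000_000_000)

            # A small fleet: 0 fatal crashes over a hypothetical 50 million miles.
            small = fatal_rate_per_100m_miles(0, 50_000_000)

            print(f"large fleet: {big:.2f} fatal crashes per 100M miles")   # ~0.93
            print(f"small fleet: {small:.2f} fatal crashes per 100M miles") # 0.00

            # The raw counts (28 vs. 0) aren't comparable on their own: the
            # rates depend entirely on the denominators, and with 60x fewer
            # miles the small fleet's observed zero carries far less
            # statistical weight.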

    • JumpCrisscross 16 hours ago

      There is no world in which New York lets Teslas drive autonomously in the next decade. Had they not been grandfathered in under California's rules, I doubt the politics there would have allowed it either.

  • boppo1 a day ago

    Sources? I haven't heard of any deaths except total idiots sleeping at 80 MPH.

    • runako a day ago

      If the car needs any occupant to be awake, it is not an autonomous vehicle.

      Some of the best marketing ever has gone into convincing people that the word "autonomous" does not mean what we all know it means.

    • Dylan16807 a day ago

      Are you trying to draw a distinction between sleeping versus looking away from the road and not paying attention to it? I expect both situations to have similar results with similar levels of danger in a Tesla, and the latter is the bare minimum for autonomous/unattended.

    • afavour a day ago

      You don't need to cite accidents when you can simply state the fact that the system is not approved for unattended use.

dazc a day ago

It's just pretending to do that, seemingly?

Sohcahtoa82 a day ago

If I can't use the center console to pick a song on Spotify without the car yelling at me to watch the road, it's not autonomous.

  • kibwen a day ago

    No, rather: if the manufacturer of the self-driving software doesn't take full legal liability for actions taken by the car, then it's not autonomous. That is the one and only criterion for a self-driving vehicle.

    • Sohcahtoa82 a day ago

      Sounds like we're in agreement then.

      Right now, Tesla skirts legal liability by saying that the driver needs to watch the road and be ready to take control, and then uses measures like detecting your hand on the wheel and tracking your gaze to make sure you're watching the road (a hypothetical sketch of that kind of escalation logic follows this sub-thread). If a car driving on FSD crashes, Tesla will say it's the driver's fault for not monitoring the drive.

      • FireBeyond 20 hours ago

        Hell, they'll even hold a press conference touting (selective data from) telemetry to say "The vehicle had been warning him to pay attention prior to the accident!"

        And then four months later when the actual accident investigation comes out, you'll find out that yes, it had. Once. Eighteen minutes prior to the accident.

        And then, to add insult to injury, for a very long time Tesla would fight you for access to your own telemetry data if, for example, you needed it in a lawsuit against someone else over an accident.
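
      As an illustration of the monitoring described above, here is a hypothetical sketch of how a driver-attention monitor might escalate from warnings to disengagement. This is an assumption-laden toy, not Tesla's actual implementation; the thresholds, the strike counter, and the AttentionMonitor class itself are all invented for illustration.

          # Hypothetical escalation logic for a driver-attention monitor.
          # Not Tesla's actual implementation; all thresholds are invented.
          from dataclasses import dataclass
          from typing import Optional

          @dataclass
          class AttentionMonitor:
              eyes_off_limit_s: float = 3.0    # tolerated gaze-away time
              strikes_to_disengage: int = 3    # warnings before forced handover
              eyes_off_since: Optional[float] = None
              strikes: int = 0

              def update(self, eyes_on_road: bool, hands_on_wheel: bool, now: float) -> str:
                  # Any sign of attention resets the gaze-away timer.
                  if eyes_on_road or hands_on_wheel:
                      self.eyes_off_since = None
                      return "ok"
                  if self.eyes_off_since is None:
                      self.eyes_off_since = now
                  if now - self.eyes_off_since > self.eyes_off_limit_s:
                      self.strikes += 1
                      self.eyes_off_since = now  # restart the timer after each warning
                      if self.strikes >= self.strikes_to_disengage:
                          return "disengage"     # force the driver to take over
                      return "warn"              # chime / flash the dashboard
                  return "ok"

      The point of the sketch is the liability structure it encodes: every "warn" is a timestamped record that the driver was told to pay attention, which is exactly the telemetry a manufacturer can later cite.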

  • kjkjadksj 18 hours ago

    That is for the lawyers; it's not indicative of capability.

    • JumpCrisscross 16 hours ago

      I’ve taken a nap in my Waymos. One can’t in a Tesla. That is a difference in capability.