Comment by kjkjadksj a day ago
Why does Tesla pretend not to be autonomous? My friends with Tesla FSD use it fully autonomously. It even finds a spot and parks for them.
It's a Level 2 system; it can't be operated unattended. Your friends are risking their lives, as several people (now dead) have found out.
Wikipedia lists two fatal crashes involving Tesla FSD and one involving Waymo.
> one involving Waymo
Are you referring to the one where a Waymo, and several other cars, were stopped at a traffic light, when another car (incidentally, a Tesla) barreled into the traffic stack at 90 MPH, killing several people?
Because I am not aware of any other fatal accidents where a Waymo was even slightly involved. I think it's, at best, misleading to refer to that in the same sentence as FSD-involved fatalities where FSD was the direct cause.
Yes. [1] That incident got considerable publicity in the San Francisco media. But not because of the Waymo.[2][3]
Someone was driving a Tesla on I-280 into SF. They'd previously been involved in a hit-and-run accident on the freeway. They exited I-280 at the 6th St. off ramp, which is a long straightaway. They entered surface streets at 98 MPH in a 25 MPH zone, ran through a red light, and reached the next intersection, where traffic was stopped at a red light. The Tesla Model Y plowed into a lane of stopped cars, killing one person and one dog, injuring seven others, and demolishing at least six vehicles. One of the vehicles waiting was a Waymo, which had no one on board at the time.
The driver of the Tesla claims their brakes failed. "Police on Monday booked Zheng on one count of felony vehicular manslaughter, reckless driving causing injury, felony vandalism and speeding."[2]
[1] https://www.nbcbayarea.com/investigations/waymo-multi-car-wr...
[2] https://www.sfchronicle.com/sf/article/crash-tesla-waymo-inj...
The question should be less about who was at fault and more about whether a human driver would have reacted better in that situation and avoided the fatality. I'm not sure why you think it changes the calculus whether the fatality occurred inside or outside the car, but in that case, only one of the two documented Tesla FSD-related fatalities killed the driver. Judging by the incident statistics for Tesla's Autopilot going back over half a decade, I'm pretty sure it's significantly safer than the average human driver and continues to improve, and the point of comparison in the original post was with human driving rather than Waymo. I have no doubt that Waymo, with its constrained operating areas and parameters, is safer in aggregate than Tesla's general-purpose FSD system.
Only one of the two, and it's not nearly enough data to draw a conclusion one way or another in any case.
Wikipedia lists at least 28 fatal crashes involving Tesla FSD:
https://en.wikipedia.org/wiki/List_of_Tesla_Autopilot_crashe...
FSD is not Autopilot, despite the names being conflated today, but even if you want to count all 28, you can't compare raw numbers of fatal incidents without considering the difference in scale. That's not to justify taking your eyes off the road when enabling FSD on a Tesla, but the OP did not suggest doing that anyway.
There is no world in which New York lets Teslas drive autonomously in the next decade. Had they not been grandfathered in in California, I doubt politics there would have allowed it either.
Are you trying to draw a distinction between sleeping versus looking away from the road and not paying attention to it? I expect both situations to have similar results with similar levels of danger in a Tesla, and the latter is the bare minimum for autonomous/unattended.
If I can't use the center console to pick a song on Spotify without the car yelling at me to watch the road, it's not autonomous.
Sounds like we're in agreement then.
Right now, Tesla skirts legal liability by saying that the driver needs to watch the road and be ready to take control, and then uses measures like detecting your hand on the wheel and tracking your gaze to make sure you're watching the road. If a car driving on FSD crashes, Tesla will say it's the driver's fault for not monitoring the drive.
Hell, they'll even hold a press conference touting (selective data from) telemetry to say "The vehicle had been warning him to pay attention prior to the accident!"
And then four months later when the actual accident investigation comes out, you'll find out that yes, it had. Once. Eighteen minutes prior to the accident.
And to add insult to injury, for a very long time Tesla would fight you for access to your own telemetry data if, for example, you needed it in a lawsuit against someone else over an accident.
I've taken a nap in Waymos. One can't do that in a Tesla. That is a difference in capability.
In every single legal or regulatory context, the company selling the car is adamant that none of its cars are fully autonomous, and any accident caused by the car is 100% the fault of the driver. But the company markets its cars as fully autonomous. That's pretty much the definition of pretending to be autonomous.