Sohcahtoa82 21 hours ago

My wife and I took a road trip that included time in SF last year and seeing a Waymo was pretty neat.

To save some money, we stayed in downtown Oakland and took the BART into San Francisco. After getting ice cream at the Ghirardelli Chocolate shop, we headed to Pier 39. My wife has a bad ankle and can't walk very far before needing a break to sit, so even though we could have taken another bus, we decided to take a Waymo for the novelty of it. It felt like being in the future.

I own a Tesla and have had trials of FSD, but being in a car that was ACTUALLY autonomous and didn't merely pretend to be was amazing. For that short ride of 7 city blocks, it was like being in a sci-fi film.

kjkjadksj 21 hours ago

Why does Tesla pretend to be autonomous? My friends with Tesla FSD use it fully autonomously. It even finds a spot and parks for them.

  • rurp 20 hours ago

    The company selling the car is adamant, in every legal and regulatory context, that none of their cars are fully autonomous. Any accident caused by the car is 100% the fault of the driver. But the company markets their cars as fully autonomous. That's pretty much the definition of pretending to be autonomous.

  • nutjob2 21 hours ago

    It's a Level 2 system; it can't be operated unattended. Your friends are risking their lives, as several people (now dead) have found out.

    • kjkjadksj 15 hours ago

      I think we are at the point where the data suggests they bear more risk when they drive the Tesla themselves. See the Bloomberg report on accidents per mile.

    • bananalychee 20 hours ago

      Wikipedia lists two fatal crashes involving Tesla FSD and one involving Waymo.

      • codeka 14 hours ago

        > one involving Waymo

        Are you referring to the one where a Waymo, and several other cars, were stopped at a traffic light, when another car (incidentally, a Tesla) barreled into the traffic stack at 90 MPH, killing several people?

        Because I am not aware of any other fatal accidents where a Waymo was even slightly involved. I think it's, at best, misleading to refer to that in the same sentence as FSD-involved fatalities where FSD was the direct cause.

      • jedberg 20 hours ago

        The key difference is that the Teslas killed their passengers, while the Waymo hit someone outside the car (and it wasn't the Waymo's fault; it was hit by another car).

      • JumpCrisscross 13 hours ago

        There is no world in which New York lets Teslas drive autonomously in the next decade. Had they not been grandfathered in in California, I doubt politics there would have allowed it either.

    • boppo1 20 hours ago

      Sources? Haven't heard of deaths except total idiots sleeping at 80 MPH.

      • runako 20 hours ago

        If the car needs any occupant to be awake, it is not an autonomous vehicle.

        Some of the best marketing ever behind convincing people that the word "autonomous" does not mean what we all know it means.

      • Dylan16807 20 hours ago

        Are you trying to draw a distinction between sleeping versus looking away from the road and not paying attention to it? I expect both situations to have similar results with similar levels of danger in a Tesla, and the latter is the bare minimum for autonomous/unattended.

      • afavour 20 hours ago

        You don't need to cite accidents when you're stating the fact that the system is not approved for unattended use.

  • dazc 21 hours ago

    It's just pretending to do that, seemingly?

  • Sohcahtoa82 20 hours ago

    If I can't use the center console to pick a song on Spotify without the car yelling at me to watch the road, it's not autonomous.

    • kibwen 20 hours ago

      No, rather, if the manufacturer of the self-driving software doesn't take full legal liability for actions taken by the car, then it's not autonomous. That is the one and only criterion for a self-driving vehicle.

      • Sohcahtoa82 20 hours ago

        Sounds like we're in agreement then.

        Right now, Tesla skirts legal liability by saying that the driver needs to watch the road and be ready to take control, and then uses measures like detecting your hand on the wheel and tracking your gaze to make sure you're watching the road. If a car driving on FSD crashes, Tesla will say it's the driver's fault for not monitoring the drive.

        • FireBeyond 16 hours ago

          Hell, they'll even hold a press conference touting (selective data from) telemetry to say "The vehicle had been warning him to pay attention prior to the accident!"

          And then four months later when the actual accident investigation comes out, you'll find out that yes, it had. Once. Eighteen minutes prior to the accident.

          And then to add insult to that, for a very long time, Tesla would fight you for access to your own telemetry data if, for example, it was needed in a lawsuit against someone else for an accident.

    • kjkjadksj 15 hours ago

      That is for the lawyers; it's not indicative of capability.

      • JumpCrisscross 13 hours ago

        I’ve taken a nap in my Waymos. One can’t in a Tesla. That is a difference in capability.