aanet 3 days ago

This is the classic Suddenly Revealed Pedestrian test case, which, afaik, most NCAP programs (EuroNCAP, Japan NCAP, etc.) include in their standard testing protocols.

Having performed this exact test on 3 dozen vehicles (L2/L3/L4) for several AV companies in the Bay Area [1], I would say that Waymo's response, per their blog post [2], has been textbook compliance. (I'm not defending their performance... just their response to the collision.) This test / protocol is hard for any driver (including human-driven vehicles), let alone ADAS/L3/L4 vehicles, for various reasons: pedestrian occlusion, late ped detection, late braking, slick roads, not enough braking, etc.

Having said all that, full collision avoidance would have been the best outcome, which, in this case, it wasn't. Wherever the legal fault may lie -- and there will be big debate here -- Waymo will still have to accept some responsibility, given how aggressively they are rolling out their commercial services.

This only puts more onus on their team to demonstrate a far higher standard of driving than human drivers. Sorry, that's just the way societal acceptance is. We expect more from our robots than from our fellow humans.

[1] Yes, I'm an AV safety expert

[2] https://waymo.com/blog/2026/01/a-commitment-to-transparency-...

(edit: verbiage)

  • amluto 2 days ago

    Waymo’s performance, once the pedestrian was revealed, sounds pretty good. But is 17mph a safe speed at an active school dropoff area? I admit that I don’t think I ever personally pay attention to the speedometer in such a place, but 17mph seems excessive even for an ordinary parking lot.

    I wonder whether Waymo’s model notices that small children are present or likely to be present and that it should leave extra margin for error.

    (My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)

    • izacus 4 hours ago

      I bet most drivers plow through that area at 30mph (since it's a 25mph limit) instead of driving as slow as 17.

      Even people being all indignant on HN.

    • aanet 2 days ago

      > But is 17mph a safe speed at an active school dropoff area?

      Now you're asking interesting questions... Technically, in CA, the speed limit in school zones is 25 mph (which local authorities can lower to 15 mph, as needed). In this case, that would be something the investigation would check, of course. But regardless of that, 17 mph per se is not a very fast speed (my gut check: turning at intersections at > 10-11 mph feels fast, but going straight at 15-20 mph doesn't feel fast; YMMV). But more generally, in the presence of child VRUs (vulnerable road users), it is prudent to drive slowly just because of the randomness factor (children being the most unaware of critters). Did the Waymo see the kids around in the area? If so, how many and where? And how/where were they running/moving? All of that is investigation data...

      My 2c is that Waymo already took all of that into account and concluded that 17 mph was indeed a good speed to move at...

      ...which leads to your observation below:

      > (My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)

      Yes, I have indeed made that same observation. The Waymos of 2 years ago were very cautious; now they seem much more assertive, even a bit aggressive (though that would be tough to define). That is a driving policy decision (cautious vs assertive vs aggressive).

      One could argue whether 17 mph was indeed the "right" decision. My gut feel is that Waymo will argue it was (though they will likely make the driving policy more cautious, especially in the presence of VRUs, and child VRUs in particular).
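      For intuition, here is a rough stopping-distance sketch. All numbers are assumed for illustration (~0.3 s machine reaction time, ~8 m/s^2 hard braking); real values vary with surface, tires, and perception latency:

```python
def stopping_distance_m(v_mph, react_s=0.3, decel_mps2=8.0):
    """Total stopping distance: distance covered during the reaction
    delay, plus braking distance at constant deceleration.
    All parameters are illustrative assumptions, not measured values."""
    v = v_mph * 0.44704  # mph -> m/s
    return v * react_s + v**2 / (2 * decel_mps2)

# School-zone-relevant speeds: note the superlinear growth with speed.
for mph in (15, 17, 25):
    print(mph, "mph ->", round(stopping_distance_m(mph), 1), "m")
```

      Under these assumed numbers, going from 17 to 25 mph roughly doubles the stopping distance, which is the intuition behind the 15 mph school-zone option.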

      • veltas 2 days ago

        > Technically, in CA, the speed limit in school zones are 25 mph

        Legally a speed limit is a 'limit' on speed, not a suggested or safe speed. So it's never a valid legal argument that you were driving under the limit; the standard is that you slow down or give more room in places like a school drop-off while kids are being dropped off or picked up.

  • ssteeper 3 days ago

    In your opinion as an AV safety expert, has Waymo already demonstrated a far higher standard of driving than human drivers in collision avoidance scenarios?

    • aanet 2 days ago

      > In your opinion as an AV safety expert, has Waymo already demonstrated a far higher standard of driving than human drivers in collision avoidance scenarios?

      That's a difficult question to answer, and the devil really is in the details, as you may have guessed. What I can say is that Waymo is, by far, the most prolific publisher of research on AV safety on public roads. (yes, those are my qualifiers...)

      Here's their main stash [1] but notably, three papers compare Waymo's rider-only (i.e. no safety driver) performance vis-a-vis human drivers, at 7.1 million miles [2], 25 million miles [3], and 56 million miles [4]. Waymo has also been a big contributor to various AV safety standards, as one would expect (FWIW, I was also a contributor to 3 of the standards... the process is sausage-making at its finest, tbh).

      I haven't read thru all their papers, but some notable ones talk about the difficulty of comparing AVs vs human drivers [5], and various research on characterising uncertainty / risk of collision, comparing AVs to non-impaired, eyes-on human drivers [6].

      As one may expect, at least one of the challenges is that human-driven collisions are almost always very _lagging indicators_ of safety (i.e. collision happened: lost property, lost limbs, lost lives, etc.)

      So, net-net, Waymo still has a VERY LONG WAY to go (obviously) to demonstrate better than human driving behavior, but they are showing that their AVs are better-than-humans on certain high-risk (potential) collisions.

      As somebody remarked, the last 1% takes 90% of time/effort. That's where we are...

      ---

      [1] https://waymo.com/safety/research

      [2] https://waymo.com/research/comparison-of-waymo-rider-only-cr...

      [3] https://waymo.com/research/do-autonomous-vehicles-outperform...

      [4] https://waymo.com/research/comparison-of-waymo-rider-only-cr...

      [5] https://waymo.com/research/comparative-safety-performance-of...

      [6] https://waymo.com/blog/2022/09/benchmarking-av-safety/

      [edit: reference]

      • make3 2 days ago

        still, how many ppl do they kill per mile compared to humans?

  • mmooss 3 days ago

    In your experience, where do we find a credible source of info? Do we need to wait for the government's investigation to finish?

    > I would say that Waymo's response, per their blog post [2] has been textbook compliance.

    Remember Tesla's blog posts? Of course Waymo knows textbook compliance just like you do, and of course that's what they would claim.

    • aanet 3 days ago

      > In your experience, where do we find a credible source of info? Do we need to wait for the government's investigation to finish?

      Most likely, yes, the NHTSA investigation will be a credible source of info for this case. HOWEVER, Waymo will likely fight tooth-and-nail to keep it from being made public. They will likely cite "proprietary algorithms / design", etc., to protect it from being released publicly. So, net-net, I dunno... Will have to wait and see :shrug.gif:

      But meanwhile, personally I would read reports from experts like Phil Koopman [1] and Missy Cummings [2] to see their take.

      > Remember Tesla's blog posts?

      You, Sir, cite two companies that are diametrically opposite on the safety spectrum, as far as good behavior is concerned. Admittedly, one would have less confidence in Waymo's own public postings about this (and I'd be mighty surprised if they actually made public their investigation data, which would be a welcome and pioneering move).

      On the other hand, the other company you mentioned, the less said the better.

      [1] http://www.koopman.us/

      [2] https://www.gmu.edu/profiles/cummings

      • aanet 3 days ago

        There is already widespread discussion on LinkedIn about this thread... but usefully, here [1] is the NHTSA's Office of Defects Investigation report. Nothing much new there, tbh.

        As I suspected, legal scholars are already calling for "voluntary disclosure" from Waymo re: its annotated videos of the collision [2]. FWIW, my skepticism about Waymo actually releasing it remains...

        [1] https://static.nhtsa.gov/odi/inv/2026/INOA-PE26001-10005.pdf

        [2] https://www.linkedin.com/posts/matthew-wansley-62b5b9126_a-w...

      • dzhiurgis 2 days ago

        > You, Sir, cite two companies that are diametrically opposite on the safety spectrum

        Cringe. Stop it. Simping for google has stopped being cool nearly 2 decades ago.

        • noitpmeder 7 hours ago

          I think it's less simping for Google and more (rightfully) dunking on Tesla

  • okdood64 3 days ago

    > Waymo will still have to accept some responsibility

    Why? This is only true if they weren't supposed to be on the road in the first place. Which is not true.

    • GoatInGrey 3 days ago

      Think of it like dog ownership: if my dog hurts someone, that's on me. Property that causes harm is the owner's responsibility.

      If I program a machine and it goes out into the world and hurts someone who did not voluntarily release my liability, that's on me.

      • derangedHorse 2 days ago

        There are many cases when the owner wouldn’t be liable as well, like if the victim was performing an illegal act like attacking the owner or dog, or trespassing. If a child isn’t following the law or being supervised by a parent, some consequences are inevitable and the driver isn’t instantly liable. For example, if a student jumps in front of a car in an attempted suicide, it would be very hard for a driver to avoid that in certain situations.

    • danpalmer 3 days ago

      In a technical sense, maybe, but it's all going to be about optics. They have a responsibility to handle the situation well even if it's not their fault, and the public will hold them accountable for what they deem the involvement was, which may not be the actual scenario.

      • aanet 3 days ago

        > In a technical sense, maybe, but it's all going to be about optics.

        Indeed, it is, and that is exactly why Waymo will have to accept some responsibility. I can bet that internally Waymo's PR and Legal teams are working overtime to coordinate the details with NHTSA. We, the general public, may or may not know the details at all, if ever. However, Waymo's technical teams (Safety, etc) will also be working overtime to figure out what they could have done better.

        As I mentioned, this is a standard test, and Waymo likely has 1000s of variations of this test in their simulation platforms; they will sweep across all possible parameters to make this test tighter, including the MER (minimum expected response from the AV) and perhaps raise the bar on MER (e.g. brake at max deceleration in some cases, trading off comfort metrics in those cases; etc.) and calculate the effects on local traffic (e.g. "did we endanger the rear vehicles by braking too hard? If so, by how much??" etc). All these are expected actions which the general public will never know (except, perhaps via some technical papers).

        Regardless, the PR effects of this collision do not look good, especially as Waymo is expanding their service to other cities (Miami just announced; London by EOY2026). This PR coverage has the potential to do more damage to the company than the actual physical damage to the poor traumatized kid and his family. THAT is the responsibility only the company will pay for.

        To be sure, my intuition tells me this is not the last such collision. Expect to see some more, by other companies, as they commercialize their own services. It's a matter of statistics.

    • femto 2 days ago

      The performance of a human is inherently limited by biology, and the road rules are written with this in mind. Machines don't have this inherent limitation, so the rules for machines should be much stronger.

      I think there is an argument for incentivising the technology to be pushed to its absolute limits by making the machine 100% liable. It's not to say the accident rate has to be zero in practice, but it has to be so low that any remaining accidents can be economically covered by insurance.

      • no-name-here 2 days ago

        At least in the interim, wouldn’t doing what you propose cause more deaths if robot drivers are less harmful than humans, but the rules require stronger than that? (I can see the point in making rules stronger as better options become available, but by that logic, shouldn't we already be moving towards requiring robots and outlawing human drivers if it's safer?)

    • lmm 3 days ago

      Bringing a vehicle onto the public roads is a privilege not a right. Any harm to pedestrians that results is your responsibility, not anyone else's.

  • ProAm 2 days ago

    Still relies on an actual driver.

    “The event occurred when the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under six mph before contact was made,” a statement from Waymo explains.

    • tape_measure 2 days ago

      "Waymo Driver" is their term for their self driving software.

    • myko 2 days ago

      Though given the situation a human driver would not have been going 17 mph in a school zone during drop-off near double parked vehicles

      • no-name-here 2 days ago

        1. I often see signs in such areas that flash when people exceed the limit. I’d urge you to pull over and see how often humans drive above the limit.

        2. I’d urge you to also pull over and watch how many drivers are not consistently looking at the road: using their phones, looking down at climate/entertainment/vehicle controls, looking at a passenger, etc.

maerF0x0 3 days ago

Meanwhile the news does not report the other ~7,000 children per year injured as pedestrians in traffic crashes in the US.

I think the overall picture is a pretty fantastic outcome -- even a single event is a newsworthy moment _because it's so rare_ .

> The NHTSA’s Office of Defects Investigation is investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”

Meanwhile, in my area of the world, parents are busy, stressed, on their phones, and pressing the accelerator hard because they're time-pressured and feel like that will make up for the 5 minutes late they are on a 15-minute drive... The truth is, this technology is, as far as I can tell, superior to humans in a high number of situations, if only for its lack of emotionality (and inability to text and drive / drink and drive)... but for some reason the world wants to keep nitpicking it.

A story: my grandpa drove for longer than he should have. Yes, him losing his license would have been the optimal case. But pragmatically that didn't happen... him being in and using a Waymo (or Cruise, RIP) car would have been a marginal improvement on the situation.

  • Veserv 3 days ago

    Err, that is not the desirable statistic you seem to think it is. American drivers average ~3 trillion miles per year [1]. That means ~7000 child pedestrian injuries per year [2] would be ~1 per 430 million miles. Waymo has done on the order of 100-200 million miles autonomously. So this would be ~2-4x more injuries than the human average.

    However, the child pedestrian injury rate is only an official estimate (it is possible it may be undercounting relative to highly scrutinized Waymo vehicle-miles) and is a whole-US average (it might not be a comparable operational domain), but absent more precise and better information, we should default to the calculation of 2-4x the rate.

    [1] https://afdc.energy.gov/data/10315

    [2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...
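    The back-of-envelope arithmetic above, as a sketch (figures are this comment's approximations, not official statistics):

```python
# Approximate figures from the comment above: ~3 trillion US
# vehicle-miles/year [1], ~7000 child pedestrian injuries/year [2].
US_MILES_PER_YEAR = 3e12
CHILD_PED_INJURIES_PER_YEAR = 7000

miles_per_injury_human = US_MILES_PER_YEAR / CHILD_PED_INJURIES_PER_YEAR
print(f"human average: 1 injury per {miles_per_injury_human / 1e6:.0f}M miles")

# Waymo: on the order of 100-200M autonomous miles, with 1 such incident.
for waymo_miles in (100e6, 200e6):
    print(f"at {waymo_miles / 1e6:.0f}M miles: "
          f"{miles_per_injury_human / waymo_miles:.1f}x the human average")
```

    This prints ~429M miles per injury for the human baseline, and ratios of ~4.3x and ~2.1x at the two Waymo mileage estimates, matching the ~2-4x figure.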

    • 10000truths 3 days ago

      I suspect that highway miles heavily skew this statistic. There's naturally far fewer pedestrians on highways (lower numerator), people travel longer distances on highways (higher denominator), and Waymo vehicles didn't drive on highways until recently. If you look only at non-highway miles, you'll get a much more accurate comparison.

      • Veserv 3 days ago

        Then you or Waymo can meet the burden of proof and present that more precise and better information. There is little reason to assume against safety at this point in time, except as an intellectual exercise in how more accurate information could be found.

        Until then, it is only prudent to avoid snap judgements, but to increase caution, insist on rigor and transparency, and demand more accurate information.

    • smarnach 3 days ago

      > we should default to the calculation of 2-4x the rate.

      No we should not. We should accept that we don't have any statistically meaningful number at all, since we only have a single incident.

      Let's assume we roll a standard die once and it shows a six. Statistically, we only expect a six in one sixth of the cases. But we already got one on a single roll! Concluding Waymo vehicles hit 2 to 4 times as many children as human drivers is like concluding the die in the example is six times as likely to show a six as a fair die.
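      One way to see how little a single event pins down: if Waymo's true rate exactly matched the human average assumed upthread (~1 per 430M miles), seeing at least one incident would still be fairly likely. A sketch with those assumed numbers (taking ~150M as a point within the 100-200M mileage range):

```python
import math

# Assumed numbers from the thread: human baseline ~1 child-pedestrian
# injury per 430M miles; Waymo ~150M rider-only miles, 1 incident seen.
human_rate = 1 / 430e6   # injuries per mile
waymo_miles = 150e6

# Under a constant rate, the incident count over this exposure is
# approximately Poisson-distributed with mean lam.
lam = human_rate * waymo_miles
p_at_least_one = 1 - math.exp(-lam)
print(f"P(>=1 incident at the human rate) = {p_at_least_one:.2f}")  # ~0.29
```

      So observing one incident is entirely consistent with the human rate: it happens roughly 3 times in 10 even under the null. The data caps how good Waymo can claim to be, but says little about how bad it might be.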

      • akoboldfrying 3 days ago

        More data would certainly be better, but it's not as bad as you suggest -- the large number of miles driven till first incident does tell us something statistically meaningful about the incident rate per mile driven. If we view the data as a large sample of miles driven, each with some observed number of incidents, then what we have is "merely" an extremely skewed distribution. I can confidently say that, if you pick any sane family of distributions to model this, then after fitting just this "single" data point, the model will report that P(MTTF < one hundredth of the observed number of miles driven so far) is negligible. This would hold even if there were zero incidents so far.

        • smarnach 3 days ago

          We get a statistically meaningful result about an upper bound of the incident rate. We get no statistically meaningful lower bound.

      • NewJazz 3 days ago

        Uh, the miles driven is like rolling the die, not hitting kids.

    • Jblx2 3 days ago

      Would this Waymo incident be counted as an injury? Sounds like the victim was relatively unharmed? Presumably there are human-driver incidents like this where a car hits a child at low speeds, with effectively no injuries, but is never recorded as such?

    • maerF0x0 3 days ago

      If that's the case, then that's great info. Thank you for adding :)

  • Spivak 3 days ago

    People's standards for when they're willing to cede control over their lives to a machine, both as the passenger and as the pedestrian in the situation, are higher than for a human.

    And for not totally irrational reasons: a machine follows its programming and does not fear death, and with 100% certainty the machine has bugs which will eventually end up killing someone for a really stupid reason—and nobody wants that to be them. Then there's just the general https://xkcd.com/2030/ problem of people rightfully not trusting technology because we are really bad at it, and our systems are set up in such a way that once you reach a critical mass of money, consequences become other people's problems.

    Washington banned automatic subway train operation for 15 years after one incident that wasn't the computer's fault, and they still make a human sit in the cab. That's the bar. In that light it's hard not to see these cars as playing fast and loose with people's safety by comparison.


    • sebzim4500 3 days ago

      >People's standards for when they're willing to cede control over their lives both as the passenger and the pedestrian in the situation to a machine are higher than a human.

      Are they? It is now clear that Tesla FSD is much worse than a human driver and yet there has been basically no attempt by anyone in government to stop them.

      • deceptionatd 3 days ago

        > basically no attempt by anyone in government to stop them.

        No one in the _US_ government. Note that European governments and China haven't approved it in the first place.

      • fragmede 3 days ago

        FSD is already better than at least one class of drivers. If FSD is engaged and the driver passes out, FSD will pull over to the side of the road and stop. And before we leap to conclusions that it only helps in the case of drunk drivers who shouldn't be driving in the first place (which, they shouldn't be), random strokes and seizures happen to people all the time.

      • tmostak 3 days ago

        Do you have data to back this claim up, specifically with HW4 (most recent hardware) and FSD software releases?

naet 3 days ago

We should all think twice before taking a company PR statement completely at face value and praising them for slowing down faster than their own internal "model" says a human driver would. Companies are heavily interested in protecting their bottom line and in a situation like this probably had 5-10 people carefully craft every single word of the statement for maximum damage control.

Surprised at how many comments here seem eager to praise Waymo based off their PR statement. Sure it sounds great if you read that the Waymo slowed down faster than a human. But would a human truly have hit the child here? Two blocks from a school with tons of kids, crossing guards, double parked cars, etc? The same Waymo that is under investigation for passing school busses illegally? It may have been entirely avoidable for the average human in this situation, but the robotaxi had a blind spot that it couldn't reason around and drove negligently.

Maybe the robotaxi did prevent some harm by braking with superhuman speed. But I am personally unconvinced it was a completely unavoidable freak accident type of situation without seeing more evidence than a blog post by a company with a heavily vested interest in the situation. I have anecdotally seen Waymo in my area drive poorly in various situations, and I'm sure I'm not the only one.

There's the classic "humans are bad drivers" but I don't think that is an excuse to not look critically into robotaxi accidents. A human driver who hit a child next to a school would have a personal responsibility and might face real jail time or at the least be put on trial and investigated. Who at Waymo will face similar consequences or risk for the same outcome?

  • Veedrac 2 days ago

    Have you been around a Waymo as a pedestrian? Used one recently? I have never felt as safe around any car as I do around Waymos.

    It can feel principled to take the critical stance, but ultimately the authorities are going to have complete video of the event, and penalizing Waymo over this out of proportion to the harm done is just going to make the streets less safe. A 6mph crash is best avoided, but it's a scrap, it's one child running into another and knocking them over, it's not _face jail time_.

  • ragazzina 2 days ago

    > Surprised at how many comments here seem eager to praise Waymo based off their PR statement.

    Really? My impression is that, for the most part, HN consistently sides with the companies. I say this in the most neutral way possible.

  • tgsovlerkhgsel 2 days ago

    I think the reason why people are willing to believe this company's PR statement (and would be much more hesitant to believe some others) is that there have so far been relatively few publicized incidents overall, and AFAIK none where Waymo was caught lying/downplaying.

    > Who at Waymo will face similar consequences or risk for the same outcome?

    I'd argue that the general pushback against self-driving cars and immense PR and regulatory attention makes the consequences of accidents much more severe for the company than for a driver. (For comparison: How many kids do you think were hit by human drivers in the past month in the same general area, and how many of them made international news?)

    I highly doubt a non-distracted driver going at/below the speed limit hitting a child that darted into the road would be at any realistic risk of facing jail time, especially in the US.

  • danielmarkbruce 2 days ago

    Do you know anyone who works at Waymo? The cynicism is silly. Just because some people at some companies behave horribly, it doesn't mean all or even most do.

    Look at Waymo's history in the space, meet some of the people working there, then make a decision.

    • padjo 2 days ago

      You don't have to think anyone is behaving horribly to acknowledge that a company's PR department will tend to put out the version of the story that makes them look best.

      • danielmarkbruce 2 days ago

        Everyone knows that. So, there is no point saying it. The insightful thing to say would be "Actually Waymo has a long history of operating in a transparent way, hasn't rushed the technology like other players did (killing people) and perhaps we can take them at face value".

  • tokioyoyo 3 days ago

    It's going to sound batshit insane, what I'm saying: the problem is, if we don't praise company PR, the other side will use this as an excuse to push even harder regulations, not allow them in newer cities, and slow down the adoption rate, while ignoring the fact that this is just a safer method of transport. I wish I were not a bootlicker, but I really want robotaxis to be available everywhere in the world at some point, and such issues should not slow them down IF they're better, and especially not worse, than humans on average.

    • padjo 2 days ago

      You're right, what you're saying is batshit insane.

      • tokioyoyo 2 days ago

        I understand it sounds stupid, but there was huge push back for introducing Uber to the cities I lived in. And obviously this is even bigger change. However, if a private company is willing to foot the bill, go above and beyond to prove its usefulness and safety, I will be repping for it.

      • dyauspitr 2 days ago

        It’s not honestly, it’s the unspoken political battle being constantly fought over all kinds of things.

  • xyst 2 days ago

    One of the few seeing through Waymo PR bullshit.

bhewes 3 days ago

The "a human would do it better" people are hilarious, given how many times I have been hit by human drivers on my bike and watched others get creamed by cars. One time in Boulder, at a flashing crosswalk, a person ran right through it and the biker they creamed got stuck in the roof rack.

  • phainopepla2 3 days ago

    For real, I am convinced these are people who never walk or bike, at least around cities like Santa Monica. I am an everyday urban walker and I have to constantly be on alert not to be hit, even when I'm behaving predictably and with the right of way.

  • Analemma_ 3 days ago

    Yeah I have to wonder if any of the "humans would do it better" people actually have children and have dropped them off in a school zone. Drivers are on their phones rolling through school zones at 25-30 during pickup/dropoff hours all the fucking time.

    • franktankbank 2 days ago

      Humans do it better and worse. Which one should the robo tend towards?

dlg 3 days ago

I was just dropping my kids off at their elementary school in Santa Monica, but not at Grant Elementary where this happened.

While it's third-hand, word on the local parent chat is that the parent dropped their kid off on the opposite side of the street from Grant. Even though there was a crossing guard, the kid ran behind a car and ran right out into the street.

If those rumors are correct, I'd say it's the kid's/family's fault. That said, I think autonomous vehicles should probably go extra slowly near schools, especially during pickup and dropoff.

  • sowbug 3 days ago

    When my kids were school age, I taught them that the purpose of crosswalk lines is to determine who pays for your funeral.

    They got the point.

  • wdr1 2 days ago

    We live very close to Grant. We go through this intersection to walk our kids to their schools & know the crossing guards pretty well.

    This matches exactly what they said.

    That kid is lucky it was a Waymo & not a human driven car.

  • doctorpangloss 3 days ago

    Do you think Waymos should be banned from driving through Santa Monica?

    • dlg 3 days ago

      No. They are by far the safest drivers in Santa Monica. Ideally we get to a point where human drivers are banned.

  • [removed] 3 days ago
    [deleted]
  • trollbridge 3 days ago

    I do not like the phrase "it's the kid's fault" for a kid being hit by a robot-car.

    It is never a 6 year old's fault if they get struck by a robot.

    • blell 3 days ago

      Exactly. It’s his parents’ fault.

    • altairprime 3 days ago

      At some point children are capable of pursuing Darwin Awards. Parents may enable this, but ultimately if one’s child does something stupid contrary to one’s guidance and restrictions, they may end up with a Darwin for it. Two hundred years ago the child mortality rate was half, as in you lost one child per two, and most of those were not the fault of the child or parents. Society for quite some years has been pushing that down, to the point that a near-death involving a neglectful parent and a witless child is apparently (?) newsworthy — but the number of deaths will never reach zero, whether humans or robots or empty plains and blue skies. There will always be a Veruca Salt throwing themselves into the furnace no matter how many safety processes we impose onto roads, cars, drivers, and/or robots.

      If you want to see an end to this nonsensical behavior by parents, pressure your local city into having strict traffic enforcement and ticketing during school hours at every local school, so that the parent networks can’t share news with each other of which school is being ‘harassed’ today. Give license points to vehicles that drop a child across the street, issue parking tickets to double parkers, and boot vehicles whose drivers refuse to move when asked. Demand they do this for the children, to protect them from the robots, if you like.

      But.

      It’ll protect them much more from the humans than from the robots, and after a few thousand tickets are issued to parents behaving badly, you’ll find that the true threat to children’s safety on school roads is children’s parents — just as the schools have known for decades. And that’s not a war you’ll win arguing against robots. (It’s a war you’ll win arguing against child-killing urban roadway design, though!)

    • IAmBroom 3 days ago

      No-fault accidents happen. Accidents can have causes that carry neither legal nor moral blame.

      • scottbez1 2 days ago

        The US commercial aviation industry did not get to its excellent safety record by simply shrugging and accepting a “no-fault accident”.

        There are always systemic factors that can be improved, for example working on street design to separate dangerous cars from children, or transportation policy by shifting transportation to buses, bikes, and walking where the consequences of mistakes are significantly reduced.

        Cars are the #2 killer of children in the US, and it’s largely because of attitudes like this, which ignore the extreme harm caused by preventable “accidents”.

        • SoftTalker 2 days ago

          "No fault" does not mean "no cause" and air crash investigations always focus on causes, not fault. When you understand causes, you can think about how to prevent them happening again.

rsch 3 days ago

A human driver travelling at the same speed would have hit that child at exactly 17 mph, before their brain even registered that the child was there. If that driver had also been driving a large SUV, the child would have been pushed to the ground and run over, so probably a fatality. And also functionally nobody would have given a shit apart from some lame finger-pointing at (probably) the kid’s parents.

And it is not the child’s or their parents’ fault either:

Once you accept elementary school aged children exist, you have to accept they will sometimes run out like this. Children just don’t have the same impulse control as adults. And honestly even for adults stepping out a bit from behind an obstacle in the path of a car is an easy mistake to make. Don’t forget that for children an SUV is well above head height so it isn’t even possible for them to totally avoid stepping out a bit before looking. (And I don’t think stepping out vs. running out changes the outcome a lot)

This is why low speed limits around schools exist.

So I would say the Waymo did pretty well here, it travelled at a speed where it was still able to avoid not only a fatality but also major injury.

  • calibas 2 days ago

    > A human driver travelling at the same speed would have hit that child at exactly 17 mph, before their brain even registered that child was there.

    Not sure where this is coming from, and it's directly contradicted by the article:

    > Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.” The company did not release a specific analysis of this crash.

    • no-name-here 2 days ago

      No, Waymo’s quote supports the grandparent comment - it was about a “fully attentive human driver” - unless you are arguing that human drivers are consistently “fully attentive”?

      • calibas 2 days ago

        Fair enough, so then how fast would a semi-attentive driver stop?

        The comment I originally replied to makes the claim a human's brain wouldn't have even responded fast enough to register the child was there. That's going WAY further than how Waymo is claiming a human would have responded.

        I don't see how that's a more reasonable assumption than a human driver actually being "fully attentive", and I'm not sure Waymo's definition of that term is the same as what you're using.

  • seanmcdirmid 3 days ago

    > And it is not the child’s or their parents’ fault either: Once you accept elementary school aged children exist, you have to accept they will sometimes run out like this. Children just don’t have the same impulse control as adults.

    I get what you are trying to say and I definitely agree in spirit, but I tell my kid (now 9) "it doesn't matter if it isn't your fault, you'll still get hurt or be dead." I spent a lot of time teaching him how to cross the street safely before I let him do it on his own, not to trust cars to do the right thing, not to trust them to see you, not to trust some idiot to not park right next to cross walk in a huge van that cars have no chance of seeing over.

    If only we had a Dutch culture of pedestrian and road safety here.

Zigurd 3 days ago

Vehicle design also plays a role: passenger cars have to meet pedestrian collision standards. Trucks don't. The silly butch grilles on SUVs and pickups are deadly. This is more of an argument for not seeing transportation as a fashion or lifestyle statement. Those truck designs are about vanity and gender affirming care. It's easier to make rational choices when it's a business that's worried about liability making those choices.

aimor 3 days ago

The school speed limit there is 15 mph, and that wasn't enough to prevent an accident.

https://www.yahoo.com/news/articles/child-struck-waymo-near-...

https://maps.app.goo.gl/7PcB2zskuKyYB56W8?g_st=ac

  • JumpCrisscross 3 days ago

    The interesting thing is a 12 mph speed limit would be honored by an autonomous vehicle but probably ignored by humans.

    • toast0 3 days ago

      If the speed limit was 15 mph, and the Waymo vehicle was traveling at 17 mph before braking, why do you believe the Waymo vehicle would honor a 12 mph speed limit? It didn't honor the 15 mph limit.

      • JumpCrisscross 3 days ago

        > If the speed limit was 15 mph, and the Waymo vehicle was traveling at 17 mph before braking, why do you believe the Waymo vehicle would honor a 12 mph speed limit?

          +/- 2 mph is acceptable speedometer and other error. (15 mph doesn’t mean never exceed under any legal interpretation I know.)

        It’s reasonable to say Waymo would reduce speed in a 12 versus 15 in a way most humans would not.

    • nkrisc 3 days ago

      Ignored by some, not all humans. I absolutely drive extra slowly and cautiously when driving past an elementary school during drop off and pick up precisely because kids do dumb stuff like this. Others do too, though not everyone of course, incredibly.

      • saalweachter 2 days ago

        The great thing about doing things like driving the speed limit in school zones is you get to witness other drivers drive even worse, like passing you in a no passing zone in front of the school, because they can't bear to drive slow for three blocks.

  • mmooss 3 days ago

    We are responsible for the consequences of our actions. The speed limit is almost irrelevant; drive slowly enough so you don't hit anyone - especially in a school zone.

    • lmm 3 days ago

      > We are responsible for the consequences of our actions.

      We're not though. Drivers are allowed to kill as many people as they like as long as they're apologetic and weren't drinking; at most they pay a small fine.

      • mmooss 3 days ago

        We're responsible for the consequences of our actions regardless of what anyone else says, including the law.

        Also, where I live that's manslaughter, a serious felony that can put you in jail.

  • [removed] 3 days ago
    [deleted]
  • jsrozner 3 days ago

    So the waymo was speeding! All the dumbasses on here defending waymo when it was going 17 > 15.

    Oh also, that video says "kid ran out from a double parked suv". Can you imagine being dumb enough to drive over the speed limit around a double parked SUV in a school zone?

    • cucumber3732842 3 days ago

      > Can you imagine being dumb enough to drive over the speed limit around a double parked SUV in a school zone?

      Can you imagine being dumb enough to think that exceeding a one size fits all number on a sign by <10% is the main failing here?

      As if 2mph would have fundamentally changed this. Pfft.

      A double parked car, in an area chock-full of street parking (hence the double park), "something" that's a magnet for pedestrians, and probably a bunch of pedestrians should be a "severe caution" situation for any driver who "gets it". You shouldn't need a sign to tell you that this is a particular zone that warrants a particular magic number.

      The proper reaction to a given set of indicators that indicate hazards depends on the situation. If this were easy to put in a formula Waymo would have and we wouldn't be discussing this accident because it wouldn't have happened.

      • fwip 3 days ago

        The default, with good visibility in ideal conditions, should be to not exceed the speed limit.

        In a school zone, when in a situation of low visibility, the car should likely be going significantly below the speed limit.

        So, it's not a case of 17mph vs 15mph, but more like 17mph vs 10mph or 5mph.

      • jsrozner 3 days ago

        That was my point. The Waymo should have been going much slower than 15 around the double-parked car. Potential speeding makes it worse.

        The fact that it’s hard to turn this into a formula is exactly why robot drivers are bad.

        • no-name-here 2 days ago

          Are you comparing robot drivers to the existing alternative? Next time you see one of those blinking speed displays, I’d urge you to pull over and see how fast many human drivers go, and watch for what percent of them aren’t consistently even looking at the road ahead.

Zopieux 3 days ago

Cheers to cities pedestrianizing school streets even in busy capitals (e.g. Paris). Cars have no place near school entrances. Fix your urbanism and public transportation.

Yes, kids in developed countries have the autonomy to go to school by themselves from a very young age, provided the correct mindset and a safe environment. That's a combination of:

* high-trust society: commuting alone or in a small group is the norm, soccer moms a rare exception,

* safe, separated lanes for biking/walking when that's an option.

  • luses 3 days ago

    you're exactly right. the fixation on human vs AV error rates completely misses the point. even if we achieve 'perfect' AVs, mixing heavy machinery with children guarantees conflict. physics dictate cars can't stop instantly. the only solution is removing cars, not better drivers.

    most commenters here are ignoring the structural incentives. the long term threat of waymo isn't safety, it's the enclosure of public infrastructure. these companies are building a permission structure to lobby personal vehicles and public transit off the road.

    transportation demand is inelastic. if we allow a transition where mobility is captured by private platforms, the consumer loses all leverage. the endgame is the american healthcare model: capture the market, kill alternatives, and extract max rent because the user has no choice. we need dense urban cores and mass transit, not a dependency on rent seeking oligopolies

Bukhmanizer 3 days ago

Personally in LA I had a Waymo try to take a right as I was driving straight down the street. It almost T-boned me and then honked at me. I don’t know if there has been a change to the algorithm lately to make them more aggressive but it was pretty jarring to see it mess up that badly

  • pengaru 3 days ago

    In recent weeks I've found myself driving in downtown SF congestion more than usual, and observed Waymos doing totally absurd things on multiple occasions.

    The main saving grace is they all occurred at low enough speeds that the consequences were little more than frustrating/delaying for everyone present - pedestrians and drivers alike, as nobody knew what to expect next.

    They are very far from perfect drivers. And what's especially problematic is the nature of their mistakes seem totally bizarre vs. the kinds of mistakes human drivers make.

    • ghthor 2 days ago

      The unpredictability was jarring to me as a passenger in a Waymo.

  • jayd16 3 days ago

    It honked at you? But local laws dictate that it angrily flashes its high beams at you.

simojo 3 days ago

I'm curious as to what kind of control stack Waymo uses for their vehicles. Obviously their perception stack has to be based off of trained models, but I'm curious if their controllers have any formal guarantees under certain conditions, and if the child walking out was within that formal set of parameters (e.g. velocity, distance to obstacle) or if it violated that, making their control stack switch to some other "panic" controller.

This will continue to be the debate—whether human performance would have exceeded that of the autonomous system.

  • energy123 3 days ago

    From a purely stats pov, in situations where the confusion matrix is very asymmetric in terms of what we care about (false negatives are extra bad), you generally want multiple uncorrelated mechanisms, and simply require that only one flips before deciding to stop. All would have to fail simultaneously to not brake, which becomes vanishingly unlikely (p^n) with multiple mechanisms assuming uncorrelated errors. This is why I love the concept of Lidar and optical together.
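    The p^n intuition above can be sketched numerically (the per-sensor miss rates below are invented purely for illustration, not real figures for any vehicle):

```python
# Hypothetical per-sensor false-negative rates for detecting a pedestrian
# (invented for illustration). Braking triggers if ANY one mechanism flips.
lidar_miss, camera_miss, radar_miss = 0.01, 0.02, 0.05

# Assuming uncorrelated errors, the system fails to brake only if ALL
# mechanisms miss simultaneously: p1 * p2 * p3.
combined_miss = lidar_miss * camera_miss * radar_miss
print(combined_miss)  # about 1e-05, far below any single sensor's miss rate
```

    The independence assumption is doing the heavy lifting here: fog or glare can degrade camera and lidar together, so correlated failure modes are what dominate in practice.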

    • red75prime 2 days ago

      The true self-driving trolley problem: how many rear-end collisions and how much rider annoyance from phantom braking is a manufacturer (or a society) going to tolerate to save one child per N million miles?

      An uncorrelated approach improves sensitivity at the cost of specificity. Early sensor fusion might improve both (maybe at the cost of somewhat lower sensitivity).

  • Dlanv 3 days ago

    With above-average human reflexes, the kid would have been hit at 14mph instead of 6mph.

    About 5x more kinetic energy.
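    The ~5x figure follows from kinetic energy scaling with the square of speed, so the mass term cancels when comparing the two quoted impact speeds:

```python
# KE = 0.5 * m * v^2, so the ratio of impact energies is (v1/v2)^2
# regardless of mass. Speeds are the figures quoted in this thread.
human_impact_mph = 14.0  # Waymo's modeled attentive-human impact speed
waymo_impact_mph = 6.0   # reported actual impact speed

energy_ratio = (human_impact_mph / waymo_impact_mph) ** 2
print(round(energy_ratio, 1))  # 5.4, i.e. roughly 5x the kinetic energy
```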

    • margalabargala 3 days ago

      Yeah, if a human made the same mistakes as the Waymo driving too fast near the school, then they would have hurt the kid much worse than the Waymo did.

      So if we're going to have cars drive irresponsibly fast near schools, it's better that they be piloted by robots.

      But there may be a better solution...

    • samrus 3 days ago

      But would a human be driving at 17 in a school zone during drop off hours? Id argue a human may be slower exactly because of this scenario

      • JumpCrisscross 3 days ago

        > would a human be driving at 17 in a school zone during drop off hours?

        In my experience in California, always and yes.

      • cucumber3732842 3 days ago

        Depends on the school zone. The tech school near me is in a 50 zone and they don't even turn on the "20 when flashing" signs because if you're gonna walk there, you're gonna come in via residential side streets in the back and the school itself is way back off the road. The other school near me is downtown and you wouldn't be able to go 17 even if you wanted to.

    • cucumber3732842 3 days ago

      Kinetic energy is a bad metric. Acceleration is what splats people.

      Jumping out of a plane wearing a parachute vs jumping off a building without one.

      But acceleration is hard to calculate without knowing time or distance (assuming it's even linear), and it doesn't give you that exponent over velocity yielding a big number that's great for heartstring-grabbing and appealing to emotion, hence why nobody ever uses it.

NoGravitas 3 days ago

That sucks, and I love to hate on "self driving" cars. But it wasn't speeding to start with (assuming the speed limit in the school zone was 20 or 25), it braked as much as possible, and the company took over all the things a human driver would have been expected to do in the same situation. Could have been a lot worse, and probably wouldn't have been any better with a human driver (just going to ignore, as no-signal, Waymo's models that say an attentive human driver would have done worse). It's "fine". In this situation, cars period are the problem, not "self driving" cars.

Dlanv 3 days ago

Basically Waymo just prevented a kid's potential death.

Had any other car been there, probably including a Tesla, the poor kid would have been hit with 4-10x more force.

  • dzhiurgis 2 days ago

    > any other car been there, probably including Tesla

    Cheap shots. If this was Tesla there would be live media coverage across every news outlet around the world and congressmen racing to start investigation.

    Look at any thread where Tesla is mentioned and how many waymo simps are mansplaining lidar.

  • Petersipoi 3 days ago

    You just invented a hypothetical situation in your head then drew conclusions from it. In my version, the other car misses the kid entirely.

    • alex1138 3 days ago

      Yeah, but Tesla has a proven bad safety record. Waymo doesn't and the GP comment is alluding to that

      • tmostak 3 days ago

        Evidence (preferably with recent Teslas/HW4)?

Veserv 3 days ago

Absent more precise information, this is a statistical negative mark for Waymo, putting their child pedestrian injury rate at ~2-4x higher than the US human average.

US human drivers average ~3.3 trillion miles per year [1]. US human drivers cause ~7,000 child pedestrian injuries per year [2]. That amounts to an average of 1 child pedestrian injury per ~470 million miles. Waymo has done ~100-200 million fully autonomous miles [3][4]. That means they average 1 child pedestrian injury per ~100-200 million miles. That is an injury rate ~2-4x higher than the human average.

However, the child pedestrian injury rate is only an official estimate (possible undercounting relative to highly scrutinized Waymo miles) and is a whole-US average (the operational domain might not be comparable, though this could easily swing either way), but absent more precise and better information, we should default to the calculated 2-4x higher injury rate; it is up to Waymo to robustly demonstrate otherwise.
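The arithmetic above can be reproduced directly from the cited figures:

```python
# Back-of-envelope rates from the figures in [1]-[4].
human_miles_per_year = 3.3e12    # total annual US vehicle miles [1]
child_injuries_per_year = 7000   # annual child pedestrian injuries [2]

human_miles_per_injury = human_miles_per_year / child_injuries_per_year
print(f"{human_miles_per_injury:,.0f}")  # 471,428,571, i.e. 1 per ~470M miles

# Waymo: 1 incident over ~100-200M fully autonomous miles [3][4].
for waymo_miles in (100e6, 200e6):
    ratio = human_miles_per_injury / waymo_miles
    print(round(ratio, 1))  # implied rate multiple vs. the human average
```

Whether this lands nearer 2x or closer to 5x depends on which end of the mileage range and which injury count you take.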

Furthermore, Waymo has published reasonably robust claims arguing they achieve ~90% crash reduction [5] in total. The most likely new hypotheses in light of this crash are:

A. Their systems are not actually robustly 10x better than human drivers; Waymo's claims are incorrect or non-comparable.

B. There are child-specific risk factors that humans account for that Waymo does not that cause a 20-40x differential risk around children relative to normal Waymo driving.

C. This is a fluke child pedestrian injury. Time will tell. Given their relatively robustly claimed 90% crash reduction, it is likely prudent to allow further operation in general, though possibly not in certain contexts.

[1] https://afdc.energy.gov/data/10315

[2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...

[3] https://www.therobotreport.com/waymo-reaches-100m-fully-auto...

[4] https://waymo.com/blog/2025/12/demonstrably-safe-ai-for-auto...

[5] https://waymo.com/safety/impact/

  • nearbuy 3 days ago

    I don't think this comparison is meaningful given the sample size of 1 and the differences between your datasets. The standard error margins from the small sample size alone are so large that you could not reasonably claim humans are safer (the 95% CI for Waymo is about 1 per 20 million miles to 1 per 8 billion miles). Then there are the dataset differences:

    1. The NHTSA data is based on police-reported crash data, which reports far fewer injuries than the CDC reports based on ED visits. The child in this case appeared mostly unharmed and situations like this would likely not be counted in the NHTSA data.

    2. Waymo taxis operate primarily in densely populated urban environments while human driver milage includes highways and rural roads where you're much less likely to collide with pedestrians per mile driven.

    Waymo's 90% crash reduction claim is at least an apples-to-apples comparison.

  • ufmace 3 days ago

    I don't think I'd want to take much from such a statistical result yet. A sample size of 1 accident just isn't enough information to get a real rate from, not that I want to see more collisions with children. Though this is also muddied by the fact that Waymo will most likely adjust their software to make this less likely, and we won't know exactly how or how many miles each version has. I'd also like to see the data for human incidents over just the temperate suburban areas like Waymo operates in.

  • shawabawa3 3 days ago

    > child pedestrian injury rate at ~2-4x higher than the US human average.

    If this incident had happened with a human driven vehicle would it even have been reported?

    I don't know exactly what a 6mph collision looks like, but I think it's likely the child had nothing more than some bruises, and if a human had done it they would have just said sorry, made sure they were ok, and left.

  • HALtheWise 2 days ago

    Do we even know that the child was injured? All I've seen anyone officially claim is that the Waymo made contact, the kid fell over, then stood up and walked to the side of the road. Assuming the Waymo was still braking hard, 6mph means it was about 1/4s and about 30cm from reaching a full stop, so it could be a very minor incident we're talking about here.

    I'm not aware of any statistics for how often children come into contact with human-driven cars.

    • Jblx2 2 days ago

      Here is what I think we know, in table form:

                       |               | Injuries | Undesired  |
                       | Miles         |    to    | Pedestrian | Feline
                       | Driven        | Children | Contacts   | Fatalities
        +--------------+---------------+----------+------------+------------
        | U.S. Drivers | ~3e12         |  ~7000   |     ?      |     ?
  +--------------+---------------+----------+------------+------------
        | Waymo        | 100e6 - 200e6 |    0*    |     1      |     1
        +--------------+---------------+----------+------------+------------
      
      * for all we can tell, this incident doesn't rise to the level of injury that results in a reporting event that is captured in the 7,000 number.

moktonar 3 days ago

The Waymo driver tech is impressive. That said, an experienced driver might have recognized the pattern where a stopped big vehicle occludes part of the road, leading to such a situation, and might have stopped or slowed down almost to a halt before passing. The Waymo driver reacts faster but is not able to predict such scenarios by filling in the gaps, simulating the world to inform decisions. Chapeau to Waymo anyways.

  • ra7 3 days ago

    There have been many instances of Waymo preventing a collision by predicting pedestrians emerging from occlusion. This isn’t new information at all for them. Some accidents are simply physically impossible to prevent. I don’t know for sure if this one was one of those, but I’m fairly confident it couldn’t have been from prediction failure.

    See past examples:

    https://youtube.com/watch?v=hubWIuuz-e4 — first save is a child emerging from a parked car. Notice how Waymo slows down preemptively before the child starts moving.

    https://www.reddit.com/r/waymo/s/ivQPuExwNW — detects foot movement from under the bus.

    https://www.reddit.com/r/waymo/s/LURJ8isQJ6 — stops for dogs and children running onto the street at night.

    • moktonar 2 days ago

      > detects foot movement ..

      That’s probably how they do it, which is again very clever stuff, chapeau. But they do it like that b/c they can’t really predict the world around them fast enough. It might be possible in the future with AI World Models though

      • ra7 2 days ago

        What do you mean “fast enough”? You can’t predict something that doesn’t exist i.e. not visible to the sensors. A Waymo wouldn’t move at all if it assumed people would always jump out of nowhere.

        Even if you detect “fast enough”, there are physical limits for braking and coming to a stop.

    • ghthor 2 days ago

      This one should have been prevented because the Waymo should have been driving at max 10mph

  • null_deref 3 days ago

    I think this is definitely an improvement to consider, but when comparing, I think the big numbers, i.e. statistics, are the only thing that matters. Some human could detect the pattern and come to a full halt; another human driver could be speeding while texting.

WarmWash 3 days ago

Oddly I cannot decide if this is cause for damnation or celebration

Waymo hits a kid? Ban the tech immediately, obviously it needs more work.

Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.

  • Filligree 3 days ago

    > Waymo hits a kid? Ban the tech immediately, obviously it needs more work.

    > Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.

    These can be true at the same time. Waymo is held to a significantly higher standard than human drivers.

    • micromacrofoot 3 days ago

      > Waymo is held to a significantly higher standard than human drivers.

      They have to be, as a machine can not be held accountable for a decision.

      • pjscott 3 days ago

        Slowing the adoption of much-safer-than-humans robotaxis, for whatever reason, has a price measured in lives. If you think that the principle you've just stated is worth all those additional dead people, okay; but you should at least be aware of the price.

        Failure to acknowledge the existence of tradeoffs tends to lead to people making really lousy trades, in the same way that running around with your eyes closed tends to result in running into walls and tripping over unseen furniture.

      • dragonwriter 3 days ago

        Waymo is not a machine, it is a corporation, and corporations can, in fact, be held accountable for decisions (and, perhaps more to the point, for defects in goods they manufacture, sell, distribute, and/or use to provide services).

      • TeMPOraL 3 days ago

        The promise of self-driving cars being safer than human drivers is also kind of the whole selling point of the technology.

      • JumpCrisscross 3 days ago

        > They have to be, as a machine can not be held accountable for a decision

        This logic applies equally to all cars, which are machines. Waymo has its decision makers one more step removed than human drivers. But it’s not a good axiom to base any theory of liability on.

aucisson_masque 3 days ago

> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post.

The issue is that I don’t trust a private company word. You can’t even trust the president of the USA government nowadays… release the video footage or get lost.

fortran77 3 days ago

I'm a big fan of Waymo and have enjoyed my Waymo rides. And I don't think Waymno did anything "bad" here.

> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”

BUT! As a human driver, I avoid driving near the schools when school's letting out. There's a high school on my way home and kids saunter and jaywalk across the street, and they're all 'too cool' to press the button that turns on the blinking crosswalk. So I go a block out of my way to bypass the whole school area when I'm heading home that way.

Waymos should use the same rationale. If you can avoid going past a school zone when kids are likely to be there, do it!

  • chasd00 3 days ago

    > Waymos should use the same rationale. If you can avoid going past a school zone when kids are likely to be there, do it!

    I can see that, prioritize obstacle predictability over transit time. A school zone at certain times of day is very unpredictable with respect to obstacles but a more car congested area would be easier to navigate but slower. Same goes for residential areas during Halloween.

  • trollbridge 3 days ago

    Waymo will 100% go down a route human drivers avoid because it will have "less traffic".

elzbardico 3 days ago

I don't like even the very idea of self-driving cars, but based on the description of the accident, I think the machine passed this with flying colors.

namuol 3 days ago

The only question I have is whether the speed it was going was situationally appropriate and whether we’d expect a human to be considered reckless under the same circumstances. 17mph sounds pretty slow but it really depends on context.

Archio 3 days ago

It's hard to imagine how any driver could have reacted better in this situation.

The argument that questions "would a human be driving 17mph in a school zone" feels absurd to the point of being potentially disingenuous. I've walked and driven through many school zones before, and human drivers routinely drive above 17mph (in some cases, over the typical 20mph or 25mph legal limit). It feels like in deconstructing some of these incidents, critics imagine a hypothetical scenario in which they are driving a car and it's their only job to avoid a specific accident that they know will happen in advance, rather than facing the reality of what human drivers are actually like on the road.

pmontra 3 days ago

Who is legally responsible in case a Waymo hits a pedestrian? If I hit somebody, it's me in front of a judge. In the case of Waymo?

  • hiddencost 3 days ago

    Are you thinking of civil liability or criminal liability?

    Waymo is liable in a civil sense and pays whatever monetary amount is negotiated or awarded.

    For a criminal case, some kind of willful negligence would have to be shown. That can pierce corporate veils. But as a result Waymo is being extremely careful to follow the law and establish processes which shield their employees from negligence claims.

    • trollbridge 3 days ago

      Waymo is going to make sure they are never criminally liable for anything, and even if they were, a criminal case against a corporation just ends up being a modest fine.

  • jeffbee 3 days ago

    A person who hits a child, or anyone, in America, with no resulting injury, stands a roughly 0% chance of facing a judge in consequence. Part of Waymo's research is to show that even injury accidents are rarely reported to the police.

  • ssl-3 3 days ago

    When I was a kid (age 12, or so), I got hit by a truck while crossing the road on my bike.

    In that particular instance, I was cited myself -- after the fact, at the hospital -- and eventually went before a judge. In that hearing, it was established that I was guilty of failing to yield at an intersection.

    (That was a rather long time ago and I don't remember the nature of the punishment that resulted. It may have been as little as a stern talking-to by the judge.)

metalman 3 days ago

OK, it's like this! If I hit a child in a school district, I lose my license for many years, and if I don't or can't show remorse, it could be longer; I pay fines, etc. Therefore Waymo must have its algorithm terminated, i.e. totally destroyed, all the hardware smashed, and they never get to try again with any derivative of this technology, as there is no reasonable, understandable path toward repentance and rehabilitation; it is literally a monster running over children. Or was it carrying an ICE team? Then never mind.

insane_dreamer 3 days ago

Who is liable when FSD is used? In Waymo's case, they own and operate the vehicle so obviously they are fully liable.

But for a human driver with FSD on, are they liable if FSD fails? My understanding is yes, they are. Tesla doesn't want that liability. And to me this helps explain why FSD adoption is difficult. I don't want to hand control over to a probabilistic system that might fail and leave me at fault. In other words, I trust my own driving more than the FSD (I could be right or wrong, but I think most people will feel the same way).

  • 0xffff2 3 days ago

    I believe Mercedes is the only consumer car manufacturer that is advertising an SAE Level 3 system. My understanding is that L3 is where the manufacturer says you can take your attention off the road while the system is active, so they're assuming liability.

    https://www.mbusa.com/en/owners/manuals/drive-pilot

IAmBroom 3 days ago

The statistically relevant question is: How many human drivers have hit children near elementary schools, since Waymo's last accident?

If Waymo has fewer accidents where a pedestrian is hit than humans do, Waymo is safer. Period.

A lot of people are conjecturing how safe a human is in certain complicated scenarios (pedestrian emerging from behind a bus, driver holds cup of coffee, the sun is in their eyes, blah blah blah). These scenarios are distractions from the actual facts.

Is Waymo statistically safer? (spoiler: yes)

  • gjm11 3 days ago

    This is wrong, although something quite like it is right.

    Imagine that there are only 10 Waymo journeys per year, and every year one of them hits a child near an elementary school, while there are 1000000 non-Waymo journeys per year, and every year two of them hit children near elementary schools. In this scenario Waymo has half as many accidents but is clearly much more dangerous.

    Here in the real world, obviously the figures aren't anywhere near so extreme, but it's still the case that the great majority of cars on the road are not Waymos, so after counting how many human drivers have had similar accidents you need to scale that figure in proportion to the ratio of human to Waymo car-miles.

    (Also, you need to consider the severity of the accidents. That comparison probably favours Waymo; at any rate, they're arguing that it does in this case, that a human driver in the same situation would have hit the child at a much higher and hence more damaging speed.)
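
    To make the normalization concrete, here's a minimal sketch comparing per-journey rates rather than raw counts, using the hypothetical numbers above:

```python
# Raw counts vs. exposure-normalized rates, using the hypothetical
# numbers above: 1 Waymo accident in 10 journeys vs. 2 human-driver
# accidents in 1,000,000 journeys.
def rate_per_journey(accidents, journeys):
    return accidents / journeys

waymo_rate = rate_per_journey(1, 10)          # 0.1 accidents per journey
human_rate = rate_per_journey(2, 1_000_000)   # 0.000002 accidents per journey

# Half the accidents in absolute terms, yet ~50,000x the per-journey risk.
print(waymo_rate / human_rate)  # ~50000
```

    The same arithmetic applies with real figures: counts only become comparable after dividing by journeys (or, better, miles driven).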

t1234s 2 days ago

This is sad, but it unfortunately probably happens more frequently with human drivers when people walk out into traffic, and you never hear about it.

mrcwinn 3 days ago

>To put this in perspective, our peer-reviewed model shows that a fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph. This significant reduction in impact speed and severity is a demonstration of the material safety benefit of the Waymo Driver.

"Our car hits better" is a win, I guess?

Glad the child is okay.

kj4211cash 2 days ago

It's interesting how polarized this comments section is. Lots of people claiming a human driver would definitely have been driving slower. Lots of people claiming statistics show that human drivers do worse in this scenario in aggregate. Of course, neither side presents convincing evidence.

  • no-name-here 2 days ago

    I tried to find some data: In a 20 mph school zone, ~70-80% of drivers drove 26 mph or faster, regardless of whether there were flashing lights, an always-20 zone, or a 20-when-schoolchildren-present zone. [1]

    Although note that ~70-80% of drivers drove 6 or more mph over the speed limit in a school zone, while it seems like the claims in some of these comments are that a human driver would drive at less than 17 mph out of an abundance of caution.

    [1] https://wtsc.wa.gov/wp-content/uploads/dlm_uploads/2023/12/1...

  • padjo 2 days ago

    And a truly disappointing number of people just accepting company PR as a complete account.

ivanstepanovftw a day ago

The car could have avoided the child by swerving to the side and colliding with another car instead.

rewilder12 2 days ago

My read is the Waymo was operating normally in a situation a human would operate with heightened awareness and caution. This is not the win they think it is.

  • wbobeirne 2 days ago

    Some humans would operate with heightened awareness and caution. Some other humans might be drunk or looking at their phone.

mrcwinn 2 days ago

Amazing response to this situation.

Great job, Waymo, for maybe hitting a little kid less hard than your study assumes a human would have! Is that study legit? Who cares, we trust you!

If this had been Tesla, HN would have crashed from all the dunking.

bpodgursky 3 days ago

A human driver would most likely have killed this child. That's what should be on the ledger.

  • toast0 3 days ago

That's pretty hyperbolic. At less than 20 mph, car vs. pedestrian is unlikely to result in death. IIHS says [1] in an article about other things:

    > As far as fatalities were concerned, pedestrians struck at 20 mph had only a 1% chance of dying from their injuries

    Certainly, being struck at 6 mph rather than 17 mph is likely to result in a much better outcome for the pedestrian, and that should not be minimized; although it is valuable to consider the situation (when we have sufficient information) and validate Waymo's suggestion that the average human driver would also have struck the pedestrian, and at greater speed. That may or may not be accurate given the context of a busy school dropoff: many human drivers are extra cautious in that context and may not have reached that speed, and depending on the end-to-end route, some human drivers would have avoided the street with the school altogether based on the time, etc. It certainly seems like a good result for the premise (a child unexpectedly appearing from between large parked vehicles), but maybe there should have been an expectation of exactly that.

    [1] https://www.iihs.org/news/detail/vehicle-height-compounds-da...
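
    One rough way to see why the speed reduction matters: impact energy scales with the square of speed, so 17 mph carries roughly eight times the kinetic energy of 6 mph. A quick sketch:

```python
# Kinetic energy scales with v^2, so the ratio of impact energies
# depends only on the speed ratio (the vehicle's mass cancels out).
def energy_ratio(v_fast_mph, v_slow_mph):
    return (v_fast_mph / v_slow_mph) ** 2

print(round(energy_ratio(17, 6), 1))  # 8.0 -> ~8x the energy at 17 mph vs. 6 mph
print(round(energy_ratio(14, 6), 1))  # 5.4 -> Waymo's modeled 14 mph human hit is ~5x worse
```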

    • thatswrong0 3 days ago

      > To estimate injury risk at different impact speeds, IIHS researchers examined 202 crashes involving pedestrians ages 16 or older

      A child is probably more likely than an adult to die in a collision at the same speed.

    • xnx 3 days ago

      There's a 50/50 chance that a distracted driver wouldn't have slowed at all and run the child over.

    • globular-toast 3 days ago

      How many human drivers do under 20mph, like ever?

      • toast0 3 days ago

        Plenty. Have you ever driven on a freeway at rush hour? Have you driven in a pickup/dropoff line at a school or an airport? You may or may not want to go 100, but when there's a vehicle in front of you going 20mph or less, you're kind of stuck.

  • gortok 3 days ago

    For me, the policy question I want answered: if this were a human driver, we would have a clear person to sue for liability and damages. With a computer, who is ultimately responsible when someone sues for compensation? The company? An officer of the company? This creates a situation where a company can afford to bury litigants in costs, whereas a private driver would lean on their insurance.

    • jobs_throwaway 3 days ago

      So you're worried that instead of facing off against an insurance company, the plaintiff would be facing off against a private company? Doesn't seem like a huge difference to me.

    • entuno 3 days ago

      Is there actually any difference? I'd have thought that the self-driving car would need to be insured to be allowed on the road, so in both cases you're going up against the insurance company rather than the actual owner.

    • bpodgursky 3 days ago

      Personally I'm a lot more interested in kids not dying than in making income for injury lawyers. But that's just me.

      • rationalist 3 days ago

        Your comment implies that they are less interested in kids not dying. Nowhere do they say that.

    • emptybits 3 days ago

      Waymo hits you -> you seek relief from Waymo's insurance company. Waymo's insurance premiums go up. Waymo can weather a LOT of that; business is still good. Thus, a poor financial feedback loop. No real skin in the game.

      John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.

      NOW ... add criminal fault due to driving decision or state of vehicle ... John goes to jail. Waymo? Still making money in the large. I'd like to see more skin in their game.

      • seanmcdirmid 3 days ago

        > John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.

        John probably (at least where I live) does not have insurance. Maybe I could sue him, but he has no assets to speak of (especially if he is living out of his car), so I'm just going to pay a bunch of legal fees for nothing. He doesn't care, because he has no skin in the game. The state doesn't care either: they aren't going to throw him in jail or even take away his license (if he has one), and they aren't going to impound his car.

        Honestly, I'd much rather be hit by a Waymo than John.

      • asystole 3 days ago

        >John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.

        Ah great, so there's a lower chance of that specific John Smith hitting me again in the future!

        • emptybits 3 days ago

          Yes, that is the specific deterrence effect.

          The general deterrence effect we observe in society is that punishment of one person has an effect on others who observe it, making them more cautious and less likely to offend.

  • frankharv 3 days ago

    Would have. Could have. Should have.

    Most humans would be halfway into other lane after seeing kids near the street.

    Apologists see something different than I do.

    Perception.

  • boothby 3 days ago

    No, "the ledger" should record actual facts, and not whatever fictional alternatives we imagine.

    • direwolf20 3 days ago

      Fact: This child's life was saved by the car being driven by a computer program instead of a human.

      • boothby 3 days ago

        No, the fact is that the child sustained minor injuries. And, fact: no human driver made the decision to drive a vehicle in that exact position and velocity. Imagining a human-driven vehicle in the same place is certainly valid, but your imagination is not fact. I imagine that the kid would be better off if no vehicle was there. But that's not a fact, that's an interpretation -- perhaps the kid would have ended up dead under an entirely different tire if they hadn't been hit by the waymo!

      • NoGravitas 3 days ago

        Instead of a human who was driving exactly the same as the Waymo up until the instant the child ran out. Important distinction.

  • axus 3 days ago

    Disagree; most human drivers would notice they are near an elementary school with kids coming and going and a crossing guard present, and would be driving very carefully near blocked sight lines.

    Better reporting would have asked real people the name of the elementary school, so we could see some pictures of the area. The link to NHTSA didn't point to the investigation, but it's under https://www.nhtsa.gov/search-safety-issues

    "NHTSA is aware that the incident occurred within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity; and that the child ran across the street from behind a double parked SUV towards the school and was struck by the Waymo AV. Waymo reported that the child sustained minor injuries."

    • AnotherGoodName 3 days ago

      We're getting into hypotheticals, but I will say that in general I much, much prefer being around Waymos/Zooxes/etc. than humans when riding a bicycle.

      We're impatient, emotional creatures. Sometimes when I'm on a bike, the bike lane merges onto the road for a stretch and there's no choice but to take up a lane. I've had people accelerate behind me and screech their tyres, stopping just short of my back wheel in a threatening manner, then do it repeatedly as I rode the short distance in the lane before the bike lane reopened.

      To say "human drivers would notice they are near an elementary school" completely disregards the fuckwits that are out there on the road today. It disregards human nature; we've all seen people do shit like I describe above. It also disregards that every time I see an automated taxi, it seems to drive on the cautious side already.

      Give me the unemotional, infinitely patient, cautious automatic taxi over humans any day.

ycui1986 2 days ago

If what Waymo wrote is true, this sounds more like the kid's fault, or the guardian's.

cryptoegorophy 3 days ago

Waymo failed to stop and hit a child. A normal person would drive carefully around blind spots. I wonder what the comments would be if a Tesla had hit a child.

  • seanmcdirmid 3 days ago

    > Normal person would drive carefully around blind spots.

    I can't tell if you are being sarcastic here or serious. I guess it depends on your definition of a normal person (obviously not the average one, but an idealized driver, maybe?).

koolba 3 days ago

> Waymo said its robotaxi struck the child at six miles per hour, after braking “hard” from around 17 miles per hour. The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”

As this is based on detection of the child, what happens on Halloween when kids are all over the place and do not necessarily look like kids?

  • sweezyjeezy 3 days ago

    These systems don't discriminate on whether the object is a child. If an object enters the path of the vehicle, the lidar should spot it immediately and the car should brake.

    • tintor 3 days ago

      It is more complicated than that. It depends on the size of the object and many other factors.

      The object could be a paper bag flying in the wind, or leaves falling from a tree.

  • sowbug 3 days ago

    You're right: a quick search shows that pedestrian fatalities are 43% higher on Halloween.

    • Rudybega 3 days ago

      That's probably more a function of more people being in the road than people not understanding what object they're about to hit.

      • sowbug 3 days ago

        Sorry, I was being oblique. Humans kill other humans with cars every day. They kill even more on Halloween. Let's start addressing that problem before worrying whether Waymos might someday decide it's OK to drive through ghosts.

        Autonomous vehicles won't be perfect. They'll surely make different mistakes from the ones humans currently make. People will die who wouldn't have died at the hands of human drivers. But the overall number of mistakes will be smaller.

        Suppose you could wave your magic wand and have a The Purge-style situation where AVs had a perfect safety record 364 days of the year, but for some reason had a tricky bug that caused them to run over tiny Spidermen and princesses on Halloween. The number of fatalities in the US would drop from 40,000 annually to 40. Would you wave that wand?

  • rullelito 3 days ago

    Lidar would pick up a moving object in 3D, so it's unlikely the car would just keep going.

  • Rudybega 3 days ago

    "Oh that obstructing object doesn't look like a child? Gun it, YOLO." Lmao.

    I suspect the cars are trying to avoid running into anything, as that's generally considered bad.

jeffrallen 3 days ago

When is enough, enough? Software devs working on autonomous driving: look in your soul and update your resume.

qwertyuiop_ 3 days ago

Couldn’t be any more callous and clinical. This press release alone makes me not want to use their service.

> “Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene,” Waymo wrote in the post.

  • no-name-here 2 days ago

    The 'next' comment after yours is "Alternate headline: Waymo saves child's life" and the 'prev' comment is "A human driver would most likely have killed this child. That's what should be on the ledger." - would either of those be less 'callous and clinical'?

    Other accident reports I've seen (NTSB, etc) often seem to take a similar approach - is it a bad thing?

    Or what kind of language wouldn't make you 'want not to use their service'?

andsoitis 2 days ago

From the Waymo blog...

> The Waymo Driver braked hard...

By Waymo Driver, they don't mean a human, do they?

  • opinion-is-bad 2 days ago

    No, this is the term they use to refer to their package of sensors, compute, and software.

RomanPushkin 3 days ago

Will post it here:

> In October 2025, a Waymo autonomous robotaxi struck and killed KitKat, a well-known bodega cat at Randa's Market in San Francisco's Mission District, sparking debates over self-driving car safety

It's a child now. All I wanna ask - what should happen, so they stop killing pets and people?

  • GoatInGrey 3 days ago

    The real but contentious answer is to change our street and urban design. There is only so much you can do to make it safe for children and small animals to be struck by a giant metal machine. Reducing the frequency of cars and pedestrians occupying the same space will go further than trying to engineer the equivalent of a pool that is impossible to drown in.

    • NewJazz 3 days ago

      Do you think that a company that operates autonomous vehicles will support legislation that makes it easier and safer to move around on foot without getting hit by a car? Or will they lobby for car-centric urban design, like many many companies before them?

      • fragmede 3 days ago

        Absolutely. Because the next step is to ban human driven cars from those areas, and in that case, who makes boat loads of money?

wackget 3 days ago

I know submissions are not meant to contain modifications to article titles, but would it be so bad to have added "at 6mph" and/or "minor injuries" to the title?

  • voxadam 2 days ago

    I don't disagree with you but unfortunately I needed to keep from editorializing and I was restricted by a strict title length limit.

anon115 3 days ago

hmm idk how i feel about taking one on the freeway anymore.

[removed] 3 days ago
[deleted]
[removed] 3 days ago
[deleted]
xnx 3 days ago

Alternate headline: Waymo saves child's life

  • recursive 3 days ago

    In this timeline, we want our headlines to somehow reflect the contents of the story.

    Saved child from what? From themselves. You can't take full credit for partially solving a problem that you, yourself, created.

tekno45 3 days ago

can we just get waymo tech in busses?

Big vehicles that demand respect and aren't expected to turn on a dime, known stops.

1vuio0pswjnm7 2 days ago

Waymo is a subsidiary of Alphabet Inc., the same parent company as Google LLC.

It was formerly known as the Google Self-Driving Car Project.

henning 3 days ago

Q: Why did the self-driving car cross the road?

A: It thought it saw a child on the other side.

ripped_britches 3 days ago

Wow, this is why I feel comfortable in a Waymo. Accidents are inevitable at some point, and this handling was well-rehearsed and highly ethical. Amazing company.

whynotminot 3 days ago

I’m actually pretty surprised Waymo as a general rule doesn’t completely avoid driving in school zones unless absolutely unavoidable.

Any accident is bad. But accidents involving children are especially bad.

  • dylan604 3 days ago

    That would be one hell of a convoluted route to avoid school zones. I wonder if it would even be possible for a large majority of routes, especially in residential areas.

    • whynotminot 3 days ago

      It might not be possible for a lot of places — I don’t really know.

      But I know when I drive, if it’s a route I’m familiar with, I’ll personally avoid school zones for this very reason: higher risk of catastrophe. But also it’s annoying to have to slow down so much.

      Maybe this personal decision doesn’t really scale to all situations, but I’m surprised Waymo doesn’t attempt this. (Maybe they do and in this specific scenario it just wasn’t feasible)

      • dylan604 3 days ago

        Most people prefer the shortest ride. Circling around school zones would be the opposite of that. Rides are charged based on distance, so maybe this would interest Waymo, but one of the big complaints about taxi drivers was how drivers would "take them for a ride" to increase the fare.

        • whynotminot 3 days ago

          Seems like a solvable problem: make it clear on the app/interior car screens that a school zone is being avoided — I think most riders will understand this.

          You also have to drive much more slowly in a school zone than you do on other routes, so depending on the detour, it may not even be that much longer of a drive.

          At worst, maybe Waymo eats the cost difference involved in choosing a more expensive route. That hits the bottom line, but there's certainly also a business and reputational cost from “child hit by Waymo in school zone” in the headlines.

          Again, this all seems very solvable.

    • trollbridge 3 days ago

      Well, I'm a human and I figure out how to avoid school zones.

alkonaut 3 days ago

And before the argument "self-driving is acceptable so long as the accident risk is lower than with human drivers" comes up, can I please get this out of the way: no, it's not. Self-driving needs to be orders of magnitude safer for us to accept it. If it's merely as safe or slightly safer than humans, we will never accept it, because humans have "skin in the game". If you drive drunk, at least you're likely to be in the accident, or to have personal liability. We accept the risks with humans because those humans accept risk. Self-driving abstracts the legal risk and removes the physical risk.

I'm willing to accept robotaxis, and accidents in robotaxis, but there need to be some solid figures showing they are way _way_ safer than human drivers.

  • jillesvangurp 3 days ago

    I think those figures are already starting to accumulate. Incidents like this are rare enough that they are newsworthy. Almost every minor incident involving Waymo, Tesla's FSD, and similar systems gets a lot of press. This was a major incident with a happy ending; those are quite rare, and the lethal ones even rarer.

    As for more data, there is a chicken-and-egg problem. A phased rollout of Waymo over several years has revealed many potential issues, but is also remarkable for its low number of incidents with fatalities. The benefit of a gradual approach is that it builds confidence over time.

    Tesla has some ways to go here. Though arguably, with many hundreds of thousands of paying users, if it were really unsafe, there would be some numbers on that. Normal statistics in the US are measured in ~17 deaths per 100K drivers per year, 40K+ fatalities overall. FSD for all its faults and failings isn't killing dozens of people per year. Nor is Waymo. It's a bit of an apples-and-oranges comparison, of course. But the bar for safety is pretty low as soon as you include human drivers.

    Liability weighs higher for companies than safety. It's fine to them if people die, as long as they aren't liable. That's why the status quo is tolerated. Normalized for miles driven with and without autonomy, there's very little doubt that autonomous driving is already much safer. We can get more data at the price of more deaths by simply dragging out the testing phase.

    Perfect is the enemy of good here. We can wait another few years (times ~40K deaths) or maybe allow technology to start lowering the amount of traffic deaths. Every year we wait means more deaths. Waiting here literally costs lives.

    • alkonaut 3 days ago

      > ~17 deaths per 100K drivers per year. 40K+ fatalities overall.

      I also think one needs to remember that those are _abysmal_ numbers, so while the current discourse is US-centric (because that's where the companies and their testing are), I don't think it can be representative of the risks of driving in general. Naturally, robotaxis will benefit from better infrastructure outside the US (e.g. better separation of pedestrians), but they'll also have to clear a higher safety bar, e.g. fewer drunk drivers.

      • jillesvangurp 3 days ago

        Also fun to calculate how this compounds over, say, 40 years: you get to about 1 in 150 drivers being involved in some kind of fatal accident. People are really bad at numbers and assessing risk.
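
        The compounding arithmetic, as a quick sketch (assuming the ~17-per-100K annual rate stays constant and years are independent):

```python
# Lifetime risk from an annual rate of ~17 fatalities per 100,000
# drivers, compounded over a 40-year driving lifetime (assumes a
# constant annual rate and independence between years).
p_annual = 17 / 100_000
years = 40

p_lifetime = 1 - (1 - p_annual) ** years
print(f"{p_lifetime:.4f}")                 # ~0.0068
print(f"about 1 in {1 / p_lifetime:.0f}")  # roughly 1 in 150
```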

      • trillic 3 days ago

        It will also never get worse. This is the worst the algorithms will ever be, from this point forward.

  • jonas21 3 days ago

    > I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.

    Do you mean like this?

    https://waymo.com/safety/impact/

  • WarmWash 3 days ago

    If Waymo is to be believed, they hit the kid at 6 mph, and estimated that a human driver at full attention would have hit the kid at 14 mph; the Waymo was traveling at 17 mph. The situation of "kid running out between cars" will likely never be fully solved either, because even with sub-nanosecond reaction time, the car's mass and the tires' traction physically cap how fast a change in velocity can happen.

    I don't think we will ever see the video, as any contact is overall viewed negatively by the general public, but for non-hyperbolic types it would probably be pretty impressive.
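
    A back-of-envelope version of that physics argument (the deceleration and latency figures below are assumptions for illustration, not Waymo's published numbers):

```python
import math

MPH_TO_MS = 0.44704  # meters per second per mph

def impact_speed_mph(v0_mph, gap_m, latency_s=0.1, decel_ms2=8.8):
    """Impact speed if an obstacle appears gap_m ahead of a car doing
    v0_mph, given a detection latency and constant braking deceleration
    (~0.9g here, an assumed good-conditions figure)."""
    v0 = v0_mph * MPH_TO_MS
    braking_gap = gap_m - v0 * latency_s   # distance left once brakes bite
    if braking_gap <= 0:
        return float(v0_mph)               # contact before braking starts
    v_sq = v0 * v0 - 2 * decel_ms2 * braking_gap
    return 0.0 if v_sq <= 0 else math.sqrt(v_sq) / MPH_TO_MS

# From 17 mph, a child emerging ~3.5 m ahead is still struck at ~7 mph
# even with near-instant detection: traction, not reaction time, is
# the binding limit at that range.
print(round(impact_speed_mph(17, 3.5), 1))
```

    Under these assumed numbers, the 17 mph to 6 mph outcome Waymo reports corresponds to a child emerging only a few meters ahead of the vehicle.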

    • recursive 3 days ago

      That doesn't mean it can't be solved. Don't drive faster than you can see. If you're driving 6 feet from a parked car, you can go slow enough to stop assuming a worst case of a sprinter waiting to leap out at every moment.

      • [removed] 3 days ago
        [deleted]
      • crazygringo 3 days ago

        If we adopted that level of risk, we'd have 5mph speed limits on every street with parking. As a society, we've decided that's overly cautious.

    • alkonaut 3 days ago

      Oh I have no problem believing that this particular situation would have been handled better by a human. I just want hard figures saying that (say) this happens 100x more rarely with robotaxis than human drivers.

    • maerF0x0 3 days ago

      > The situation of "kid running out between cars" will likely never be solved

      Nuanced disagree (I agree with your physics), in that an element of the issue is design: kids run out between cars on streets that stack building --> yard --> sidewalk --> parked cars --> driving cars.

      One simple change could be adding a chain link fence / boundary between parked cars and driving cars, increasing the visibility and time.

      • toast0 3 days ago

        How do you add a chain link fence between the parked and driving cars for on-street parking?

    • xnx 3 days ago

      Second-order benefit: More Waymos = fewer parked cars

      • recursive 3 days ago

        In high-parking-contention areas, I think there's enough latent demand for parking that you wouldn't observe fewer parked cars until you reduce demand by a much greater amount.

  • criddell 3 days ago

    Orders of magnitude? Something like 100 people die on the road in the US each day. If self-driving tech could save 10 lives per day, that wouldn't be good enough?

    • alkonaut 3 days ago

      "It depends". If 50 people die and 50 people go to jail, vs. 40 people die and their families are left wondering whether anyone will take responsibility? Then the latter doesn't immediately stand out as an improvement just because fewer died. We can do better, I think. The problem is simply one of responsibility.

      • criddell 3 days ago

        If the current situation was every day 40 people die but blame is rarely assigned, would you recommend a change where an additional 10 people are going to die but someone will be held responsible for those deaths?

      • crazygringo 3 days ago

        People don't usually go to jail. Unless the driver is drunk or there's some other level of provable criminal negligence (or someone actively trying to kill people by e.g. driving into a crowd of protesters they disagree with), it's just chalked up as an accident.

      • renewiltord 3 days ago

        Do they go to jail?

        That is not my experience here in the Bay Area. In fact here is a pretty typical recent example https://www.nbcbayarea.com/news/local/community-members-mour...

        The driver cut in front of a person on an e-bike so fast they couldn't react, and was hit. Then, after being hit, the driver stepped on the accelerator and went over the sidewalk on the other side of the road, killing a 4-year-old. No charges were filed.

        This driver will be back on the street right away.

        • xnx 3 days ago

          Ugh. That is so despicable both of the driver and as a society that we accept this. Ubiquitous Waymo can't come soon enough.

      • zamadatix 3 days ago

        Apart from a minority of car related deaths resulting in jail time, what kind of person wants many more people to die just so they can point at someone to blame for it? At what point are such people the ones to blame for so many deaths themselves?

      • simianwords 3 days ago

        In such situations it’s useful to put yourself in a hypothetical. Rule: you can’t pick who you will be (one of the dead or one of the living); it will be assigned randomly.

        So would you pick situation 1 or 2?

        I would personally pick 1.

  • Archio 3 days ago

    >We accept the risks with humans because those humans accept risk.

    It seems very strange to defend a system that is drastically less safe because when an accident happens, at least a human will be "liable". Does a human suffering consequences (paying a fine? losing their license? going to jail?) make an injury/death more acceptable, if it wouldn't have happened with a Waymo driver in the first place?

    • trollbridge 3 days ago

      I think a very good reason to want to know who's liable is because Google has not exactly shown itself to enthusiastically accept responsibility for harm it causes, and there is no guarantee Waymo will continue to be safe in the future.

      In fact, I could see Google working on a highly complex algorithm to figure out cost savings from reducing safety and balancing that against the cost of spending more on marketing and lobbyists. We will have zero leverage to do anything if Waymo gradually becomes more and more dangerous.

      • fragmede 3 days ago

        > Wherever I'm going, I'll be there to apply the formula. I'll keep the secret intact. It's simple arithmetic. It's a story problem. If a new car built by my company leaves Chicago traveling west at 60 miles per hour, and the rear differential locks up, and the car crashes and burns with everyone trapped inside, does my company initiate a recall?

        > You take the population of vehicles in the field (A) and multiply it by the probable rate of failure (B), then multiply the result by the average cost of an out-of-court settlement (C). A times B times C equals X. This is what it will cost if we don't initiate a recall. If X is greater than the cost of a recall, we recall the cars and no one gets hurt. If X is less than the cost of a recall, then we don't recall.

        -Chuck Palahniuk, Fight Club

    • sowbug 3 days ago

      Even in terms of plain results, I'd say the consequences-based system isn't working so well if it's producing 40,000 US deaths annually.

      • alkonaut 3 days ago

        That’s the fault of poor infrastructure and laws more than anything else. AVs must drive in the same infrastructure (and can somewhat compensate).

  • jtrueb 3 days ago

    Have you been in a self driving car? There are some quite annoying hiccups, but they are already very safe. I would say safer than the average driver. Defensive driving is the norm. I can think of many times where the car has avoided other dangerous drivers or oblivious pedestrians before I realized why it was taking action.

  • JumpCrisscross 3 days ago

    > Self driving needs to be orders of magnitude safer for us to acknowledge it. If they're merely as safe or slightly safer than humans we will never accept it

    It’s already accepted. It’s already here. And Waymo is the safest in the set—we’re accepting objectively less-safe systems, too.

  • [removed] 3 days ago
    [deleted]
  • lokar 3 days ago

    I generally agree the bar is high.

    But, human drivers often face very little accountability. Even drunk and reckless drivers are often let off with a slap on the wrist. Even killing someone results in minimal consequences.

    There is a very strong bias here. Everyone has to drive (in most of America), and people tend to see themselves in the driver. Revoking a license often means someone can’t get to work.

  • cameldrv 3 days ago

    That’s an incentive to reduce risk, but if you empirically show that the AV is even 10x safer, why wouldn’t you chalk that up as a win?

  • xnx 3 days ago

    > Self driving needs to be orders of magnitude safer for us to acknowledge it

    All data indicates that Waymo is ~10x safer so far.

    "90% Fewer serious injury or worse crashes"

    https://waymo.com/safety/impact/

joshribakoff 3 days ago

> The vehicle remained stopped, moved to the side of the road

How do you remain stopped but also move to the side of the road? That's a contradiction. Just like Cruise.

  • callumgare 3 days ago

    My reading of that is that they mean it stopped the progression of the journey, rather than that it made no movement whatsoever.

    • lokar 3 days ago

      I agree, it’s poorly worded but I think that’s what they mean.

      I also assume a human took over (called the police, moved the car, etc) once it hit the kid.

  • BugsJustFindMe 3 days ago

    They mean the vehicle didn't drive away. It moved to the side of the road and then stopped and waited.

jsrozner 3 days ago

So many tech lovers defending waymo.

If you drive a car, you have a responsibility to do it safely. The fact that I am usually better than the bottom 50% of drivers, or that I am better than a drunk driver does not mean that when I hit someone it's less bad. A car is a giant weapon. If you drive the weapon, you need to do it safely. Most people these days are incredibly inconsiderate - probably because there's little economic value in being considerate. The fact that lots of drivers suck doesn't mean that waymo gets a pass.

Waymos have definitely become more aggressive as they've been successful. They drive the speed limit down my local street. I see them and I think wtf that's too fast. It's one thing when there are no cars around. But if you've got cars or people around, the appropriate speed changes. Let's audit waymo. They certainly have an aggressiveness setting. Let's see the data on how it's changing. Let's see how safety buffers have decreased as they've changed the aggressiveness setting.

The real solution? Get rid of cars. Self-driving individually owned vehicles were always the wrong solution. Public transit and shared infra is always the right choice.

  • no-name-here 2 days ago

    > The fact that lots of drivers suck doesn't mean that waymo gets a pass.

    But that fact does mean that we should encourage alternatives that reduce fatalities, and that not doing so results in fatalities that did not need to occur.

    > The real solution? Get rid of cars.

    I also support initiatives to improve public transit, etc. However, I don't think "get rid of cars" is a realistic idea for the general public right now, so let's encourage all of the things that improve things - robot drivers if they kill people less often than humans, public transit, etc. - let's not put off changes that will save lives on the hope that humanity will "get rid of cars" any time soon. Or when do you think humanity will "get rid of cars"?

  • [removed] 3 days ago
    [deleted]