Comment by blibble
> But dollars to doughnuts someone will try something like this on a waymo taxi the minute it hits reddit front page.
and once this video gets posted to reddit, an hour later every waymo in the world will be in a ditch
How does Waymo fix it? They have to be responsive to some signs (official, legitimate ones such as "Lane closed ahead, merge right") so there will always be some injection pathway.
They've mapped the roads and they don't need to drive into a ditch just because there's a new sign. It probably wouldn't be all that hard to come up with criteria for saying "this new sign is suspicious" and flagging it for human review. Also, Waymo cars drive pretty conservatively, and can decide to be even more cautious when something's confusing.
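Roughly the kind of check I mean, as a toy sketch: compare any newly detected sign against the signs already in the prior map, and escalate anything unexpected. The data structures, the distance threshold, and the match rule here are all made up for illustration, not anything from Waymo's actual stack.

```python
import math
from dataclasses import dataclass

@dataclass
class Sign:
    sign_type: str                 # e.g. "lane_closed", "detour"
    lat: float
    lon: float

def distance_m(a: Sign, b: Sign) -> float:
    """Rough equirectangular distance in meters; fine at sign-matching scales."""
    dlat = math.radians(b.lat - a.lat)
    dlon = math.radians(b.lon - a.lon) * math.cos(math.radians(a.lat))
    return 6371000 * math.hypot(dlat, dlon)

def is_suspicious(detected: Sign, mapped_signs: list[Sign],
                  match_radius_m: float = 15.0) -> bool:
    """A newly seen sign is suspicious if the prior map has no matching sign nearby."""
    return not any(
        known.sign_type == detected.sign_type
        and distance_m(known, detected) <= match_radius_m
        for known in mapped_signs
    )

# Usage: a "detour" sign pops up where the map says there is none.
prior_map = [Sign("lane_closed", 37.7749, -122.4194)]
new_sign = Sign("detour", 37.7750, -122.4195)
if is_suspicious(new_sign, prior_map):
    print("flag for human review; keep driving conservatively in the meantime")
```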
Someone could probably do a DOS attack on the human monitors, though, sort of like what happened with that power outage in San Francisco.
Given that Waymos don't actually connect LLMs to the wheels, they are pretty safe.
Even if you fool the sign-recognizing LLM with prompt injection, it'll be the equivalent of a wrong road sign. And Waymo is not going to drive into a wall even if someone places a "detour" sign pointing there.
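To make that concrete, here's a toy sketch of the separation I mean: the sign interpreter only proposes a route change, and the planner rejects the proposal unless the prior map says it leads onto a real, connected lane. The lane graph and function names are invented for the example.

```python
# Toy prior map: lane id -> set of lane ids you can legally transition into.
DRIVABLE_LANES = {
    "lane_1": {"lane_2"},
    "lane_2": {"lane_3"},
    "lane_3": set(),
}

def accept_detour(current_lane: str, suggested_lane: str) -> bool:
    """Follow a sign-suggested detour only if the map says it's a real, reachable lane."""
    return (suggested_lane in DRIVABLE_LANES
            and suggested_lane in DRIVABLE_LANES[current_lane])

# A spoofed "detour" sign pointing at a wall suggests a lane that isn't in the map.
print(accept_detour("lane_1", "lane_2"))   # True: legitimate merge
print(accept_detour("lane_1", "wall"))     # False: proposal rejected, keep planned route
```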