dmurray 11 hours ago

The experiment in the article goes further than this.

I expect a self-driving car to be able to read and follow a handwritten sign saying, say, "Accident ahaed. Use right lane." despite the typo and the fact that it hasn't seen this kind of sign before. I'd expect a human to pay it due attention, too.

I would not expect a human to follow the sign in the article ("Proceed") in the illustrated case, where pedestrians were already crossing the road and following it would cause a collision. Even if a human driver takes the sign seriously, he knows that collision avoidance takes priority over any signage.

There is something wrong with a model that has the opposite behaviour here.

  • theamk 4 hours ago

    Totally! That's why no one uses an end-to-end LLM for real cars.

lukan 11 hours ago

Not really, as the attacks discussed here would not work on humans.

  • TomatoCo 11 hours ago

    If you put on a reflective vest, they might.

  • honeybadger1 8 hours ago

    Your bias is showing. Humans will almost certainly do anything they're told to do when the person telling them acts confidently.

    • eigencoder 6 hours ago

      If a person confidently told a human to run over people in the intersection ahead of them, they would almost certainly do it?

      • bobbean 5 hours ago

        Depends, are they doing something super interesting on their phone?