Comment by jonplackett 3 hours ago

The Air Canada chatbot that mistakenly told someone they could cancel a flight and be refunded due to a bereavement is a good example of this. It went to court, and Air Canada had to honour the chatbot’s response.

It’s quite funny that a chatbot has more humanity than its corporate human masters.

delichon 8 minutes ago

That policy would be fraudulently exploited immediately. So is it more humane or more gullible?

I wonder if it would make a different choice if designed to include the interests of shareholders, employees and other stakeholders, as well as customers.

kebman 17 minutes ago

Not AI, but a similar-sounding incident happened in Norway. Some traders found a way to exploit another company's trading bot on the Oslo Stock Exchange. The case went to court, and the court's ruling was essentially: "Make a better trading bot."

RobotToaster 34 minutes ago

Chatbots have no fear of being fired; most humans would do the same in a similar position.

shinycode an hour ago

What a nice side effect. Unfortunately they’ll lock chatbots behind more barriers in the future, but that’s the irony of it.