Comment by 3np
How about encouraging self-harm, even murder and suicide?
https://www.npr.org/2024/12/10/nx-s1-5222574/kids-character-...
https://apnews.com/article/chatbot-ai-lawsuit-suicide-teen-a...
https://www.euronews.com/next/2023/03/31/man-ends-his-life-a...
Can this not occur on YouTube, Roblox, and the other places kids with tablets go? Speaking in broad generalizations from what I observe: I don't see why or how parents do the mental gymnastics that tablets are acceptable but AI is to be feared. There are always going to be articles like this; it's a big world, and everything has a dark side if you go looking for it. It's life. [Actually, I think a lot of parents are willing to accept or ignore the risks because tablets offer too great a service. This type of AI simply won't entertain/babysit a kid long enough for parents to give in to it.]
I have a 6-year-old, FWIW; I'm not some childless ignoramus. I just do my risk calcs differently and view it as my job to oversee their use of a device like this. I wouldn't fear it outright because of what could happen. If I took that stance, my kid would never have any experiences at all.
Can't play baseball; I read a story where a kid got hit by a bat. Can't travel to Mexico; cartels are in the news again. Home school it is, because shootings. And so on.