Comment by rurp
>> they do what you tell them and what they are taught, and if 99% of their learning set associates Mario with prompts for Italian plumbers, that's what you'll get.
I thought a lot of the issues were the opposite of this, where Google put its thumb on the scale to go against what the prompt asked. Like when someone would ask for a historically accurate picture of a US senator from the 1800s and repeatedly get women and non-white men. The training set for that prompt has to be overwhelmingly white men, so I don't think it was just a matter of following the training data.