Comment by willis936 5 hours ago

It sounds like we should be asking "why is it okay that the people training the models have CSAM?" It's not as if it's legal to possess, let alone to distribute through your for-profit tool.

wongarsu 4 hours ago

If you crawl any sufficiently large public collection of images you are bound to download some CSAM images by accident.

Filtering out images of beaten-up naked 7-year-olds is certainly something you should do. But if you go by the US legal definition of "any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age)", you are going to have a really hard time filtering all of that automatically. People don't suddenly look different when they turn 18, and "sexually explicit" is a wide net open to interpretation.

wbl 5 hours ago

Read the sentence again. It doesn't claim the data set contains CSAM, but that it depicts victims. It also assumes that an AI needs to have seen an example in order to draw it on demand, which isn't true.