A study by researchers at the Stanford Internet Observatory has found that LAION-5B, one of the largest image datasets used to train AI systems like Stable Diffusion, contains thousands of instances of child sexual abuse material (CSAM).
Hidden inside the foundation of these popular artificial intelligence image-generators are thousands of images of child sexual abuse, according to the report, which urges companies to take action.