4chan users who have made a game out of exploiting popular AI image generators appear to be at least partly responsible for the flood of fake images sexualizing Taylor Swift that went viral last month.
Graphika researchers—who study how communities are manipulated online—traced the fake Swift images to a 4chan message board that's "increasingly" dedicated to posting "offensive" AI-generated content, The New York Times reported. Fans of the message board take part in daily challenges, Graphika reported, sharing tips to bypass AI image generator filters and showing no signs of stopping their game any time soon.
"Some 4chan users expressed a stated goal of trying to defeat mainstream AI image generators' safeguards rather than creating realistic sexual content with alternative open-source image generators," Graphika reported. "They also shared multiple behavioral techniques to create image prompts, attempt to avoid bans, and successfully create sexually explicit celebrity images."