Graphika, a social analytics company studying online trends, reported rapidly growing popularity this year for services that use AI to digitally remove clothing from images without consent. Referrals on platforms like Reddit and Twitter to providers of these so-called “AI undressing” tools jumped over 2,400% from 2022.
The report warns this uptick indicates issues around synthetic non-consensual intimate images could dramatically escalate. Deepfake nude media produced without permission contributes to problems like sextortion, targeted abuse, and even child sexual exploitation according to child safety groups.
Distinguishing real from AI-generated explicit imagery is also becoming more difficult with advancements in neural networks. And while current services focus on pictures, video deepfakes using celebrity likenesses underscore risks ahead.
Given the vast volumes at which AI can scale such content, the traction of these nude image generators raises alarms around information integrity and individual rights. The study comes as EU lawmakers passed landmark AI regulations, though centralized content filtering faces hurdles against decentralized creation. With technology enabling new forms of misuse at the speed of a click, the report suggests the unintended consequences of undressing algorithms may overwhelm existing safeguards.
#deepfakes #AI #onlineharassment #contentmoderation #technology