In this week’s Inweekly, we have our interview with the Stevenson family, whose daughter, Lucy Adams Stevenson, was targeted by a high school student who used a smartphone app to create fake nude photos of his female classmates. They want state and federal lawmakers to close the loophole that allows “nudify” apps to target minors. Read Battling Fake Nude Images in Schools
PART OF A BIGGER STORY: Last month, “60 Minutes” reported that school systems across the country are grappling with a disturbing new phenomenon: “nudify” websites that transform innocent photos into realistic-looking nude images. These AI-powered tools, openly available on the internet, are being weaponized as a new form of bullying.
According to the show, nearly 30 similar cases have been reported in U.S. schools over the past 20 months, with more occurring worldwide. The website Clothoff alone received over 3 million visits last month, according to network analysis firm Graphika. These sites operate openly on the regular internet, not hidden on the dark web, with prices ranging from $2 to $40 per image after an initial free transformation.
Despite claims of age verification and consent requirements, the websites implement no meaningful safeguards. Instead, they employ sophisticated networks to evade detection, including routing payments through fake businesses selling everything from flowers to beekeeping lessons. Even the company’s listed address in Buenos Aires proved to be bogus, and its supposed CEO appears to be AI-generated.
Kolina Koltai, a senior researcher at Bellingcat specializing in AI misuse, told “60 Minutes” host Anderson Cooper, “There is a really inherent shadiness that’s happening. They’re not being transparent about who owns it. They’re obviously trying to mask their payments.”
IN FLORIDA: In December 2023, two Miami teenagers were arrested for allegedly creating and distributing AI-generated nude images of their classmates. The incident appears to be the first known case of criminal charges being filed for sharing AI-generated explicit photos without consent.
The two boys, ages 13 and 14, were arrested after allegedly using an AI app to create explicit images of fellow students aged 12 and 13. They faced third-degree felony charges under a 2022 Florida law that criminalizes sharing “any altered sexual depiction” without consent. While the specific AI application used wasn’t identified in the reports, the arrest report indicated the boys shared the images between themselves.
The article highlights the growing challenge of regulating AI technology that can create highly realistic fake images. As Carrie Stevenson pointed out, “These things look very much like a real person. You could not tell the difference between a real photo and a fake. They’re very, very realistic, and that’s problematic.”
The Stevensons intend to continue their advocacy efforts, hoping to prevent similar incidents in the future. They emphasize the need for laws that better protect individuals, especially minors, from having their images manipulated without consent. Read Battling Fake Nude Images in Schools