Anna McAdams has always kept a close eye on her 15-year-old daughter Elliston Berry’s online life. So it was difficult to accept what happened 15 months ago, on the Monday morning after homecoming in Aledo, Texas.
A classmate took a photo of Elliston from Instagram, ran it through an artificial intelligence program that appeared to remove her dress, then circulated the digitally altered image on Snapchat.
“She came into our room crying, just saying, ‘Mom, you won’t believe what just happened,’” McAdams said.
Last year there were more than 21,000 deepfake porn videos online – up more than 460% from the previous year. The manipulated content has proliferated across the internet as websites make ominous pitches – like one service that asks, “Do you have anyone to undress?”
“I had PSAT tests and volleyball games,” Elliston said. “And the last thing I need to focus on and worry about is fake nudes of me circulating around the school. These images were floating around Snapchat for nine months.”
In San Francisco, Chief Deputy City Attorney Yvonne Mere was beginning to hear stories similar to Elliston’s – which struck a chord with her.
“That could easily have been my daughter,” Mere said.
The San Francisco City Attorney’s Office is now suing the owners of 16 websites that create “nude deepfakes,” where artificial intelligence is used to turn non-explicit photos of adults and children into pornography.
“This case is not about technology. It is not about AI. It is about sexual abuse,” Mere said.
Those 16 sites saw 200 million visits in the first six months of the year alone, according to the lawsuit.
City Attorney David Chiu said the 16 sites covered in the lawsuit are just the beginning.
“We know of at least 90 of these websites. So it’s a big universe and it needs to be stopped,” Chiu said.
Texas Republican Sen. Ted Cruz is pursuing another angle, co-sponsoring a bill with Minnesota Democratic Sen. Amy Klobuchar. The Take It Down Act would force social media companies and websites to remove non-consensual pornographic images created with AI.
“This places a legal obligation on any technology platform: You must remove it and remove it immediately,” Cruz said.
The bill passed the Senate this month and is now attached to a larger government funding bill pending a vote in the House.
In a statement, a Snap spokesperson told CBS News: “We care deeply about the safety and well-being of our community. Sharing nude images, including of minors, whether real or AI-generated, is a blatant violation of our Community Guidelines. We have effective mechanisms for reporting this type of content, which is why we are so disheartened to hear stories from families who feel their concerns have not been addressed. We have a zero tolerance policy for this type of content and, as stated in our latest transparency report, we act quickly to address it once reported.”
Elliston says she’s now focused on the present and urging Congress to pass the bill.
“I can’t go back and undo what he did, but I can stop this from happening to other people,” Elliston said.