San Francisco sues 16 websites that create AI-generated nudes
San Francisco City Atty. David Chiu announced Thursday that his office is suing the operators of 16 A.I.-powered “undressing” websites that help users create and distribute deepfake nude photos of women and girls.
The lawsuit, which city officials said was the first of its kind, accuses the websites’ operators of violating state and federal laws that ban deepfake pornography, revenge pornography and child pornography, as well as California’s unfair competition law. The names of the sites were redacted in the copy of the suit made public Thursday.
Chiu’s office has yet to identify the owners of many of the websites, but officials say they hope to find their names and hold them accountable.
Chiu said the lawsuit has two goals: shutting down these websites and sounding the alarm about this form of “sexual abuse.”
On these websites, users upload photos of fully clothed real people, and artificial intelligence alters the images to simulate what those people would look like undressed. The sites create "pornographic" images without the consent of the people in the photos, Chiu said during a Thursday morning press conference.
According to the lawsuit, one of the websites promotes the nonconsensual nature of the images, stating, "Imagine wasting time taking her out on dates, when you can just use [redacted website name] to get her nudes."
The availability of open source A.I. models means that anyone can access and adapt A.I.-powered engines for their own purposes. One result: sites and apps that can generate deepfake nudes from scratch or “nudify” existing images in realistic ways, often for a fee.
Deepfake apps grabbed headlines in January when fake nude images of Taylor Swift circulated online, but many other, far less famous people were victimized before and after the pop star. “The proliferation of these images have exploited a shocking number of women and girls across the globe,” from celebrities to middle school students, Chiu said.
Through its investigation, the city attorney’s office found that the websites in question were visited more than 200 million times in just the first six months of 2024.
Once an image is online, it is very difficult for victims to determine which websites were used to "nudify" their photos, because the resulting images "don't have any unique or identifying marks that link you back to websites," said Yvonne R. Meré, San Francisco's chief deputy city attorney.
It’s also very difficult for victims to remove the images from the internet.
Earlier this year, five Beverly Hills eighth-graders were expelled for creating and sharing deepfake nude images of 16 eighth-grade girls, superimposing the girls’ faces onto A.I.-generated bodies.
Chiu’s office said it has seen similar incidents at other schools in California, Washington and New Jersey.
“These images are used to bully, humiliate and threaten women and girls,” Chiu said. “The impact on victims has been devastating on their reputations, their mental health, loss of autonomy and, in some instances, causing individuals to become suicidal.”