


Nicholas Kristof and Patrick Healy

Patrick Healy, Deputy Opinion Editor

Nick, you’ve reported deeply for years about exploitation, abuse and trafficking of women and girls. Your latest column on deepfake nude videos showed us new ways that technology has become a vile weapon against them. What did you learn in reporting the piece that surprised you?

Nicholas Kristof, Opinion Columnist

What startled me the most was simply the failure of regulators, lawmakers and tech companies to show much concern for the humiliation of victims, even as sleazy companies post nonconsensual fake sex videos and make money on them. Women and girls are targeted, yet the response from the tech community has mostly been a collective shrug. Why should Google, whose original motto was “don’t be evil,” be a pillar of this ecosystem and direct traffic to websites whose business is nonconsensual porn?

Even when underage victims go to the police, there’s usually no good recourse. We’ve effectively armed predators and exploitative companies with artificial intelligence but denied victims any defense.

Patrick Healy

You write: “With just a single good image of a person’s face, it is now possible in just half an hour to make a 60-second sex video.” Is there any way people can protect themselves?

Nicholas Kristof

Some experts counsel girls or women to avoid posting images on public Instagram or Facebook pages. That strikes me as unrealistic. Some of the victims are prominent women whose images are everywhere — one deepfake site appropriated a congresswoman’s official portrait. Or sometimes an ordinary woman or girl is targeted by an ex-boyfriend or by a classmate, who will probably have photos already.

Because it’s so difficult for individuals to protect themselves, we need systemic solutions, like amending Section 230 of the Communications Decency Act so that there is less immunity for badly behaved tech companies. End impunity, and incentivize companies to police themselves.

Patrick Healy

Among the statistics that froze me was this one: “Graphika, an online analytics company, identified 34 nudify websites that received a combined 24 million unique visitors in September alone.” These numbers are enormous. What does this say to you about our society?

Nicholas Kristof

A generation ago, there was an argument that social networks were going to knit us together. In fact, I think we’ve become more atomized, with screen time substituting for people time. Some experts think that in an age of social isolation, porn is becoming an easy way to avoid the complexity and frustration of dealing with real people. Meanwhile, the casual cruelty we see on social media is paralleled by the cruelty we see in deepfake sites showing actresses, princesses, singers or politicians being raped.

It’s hard to view these exploitative, nonconsensual videos and not perceive misogyny — both in the videos and in a system that tolerates them and provides victims with no remedy.

Photograph by Larysa Shcherbyna/Getty Images
