Google and X lag peers in addressing non-consensual explicit images, lawmakers say

Google, X, Discord and other tech giants could be doing more to address non-consensual explicit images online, according to a letter from a group of US senators.

A group of senators is urging nearly a dozen leading tech companies to do more to address non-consensual, explicit images.

The letter criticizes nearly a dozen tech firms for their lack of participation in two programs that make it easier for people to request the removal of non-consensual explicit images and videos from the internet.

The programs are voluntary, but they already count other internet giants, such as Meta, Snap, TikTok and PornHub, as participants. And the letter comes as both lawmakers and tech leaders face pressure to do more to combat non-consensual sexual images, sometimes known as revenge porn, especially as artificial intelligence makes it easier to create and spread such content.

This year alone, women around the world were targeted by AI-generated pornographic images, ranging from popstar Taylor Swift to high school girls. And while nine US states currently have laws against the creation or sharing of non-consensual deepfake images, none exist on the federal level — limiting the options for victims of this form of harassment who wish to seek help or accountability.

Friday’s letter, shared exclusively with CNN, is addressed to the chief executives of 11 tech companies: X, Google parent company Alphabet, Amazon, Match, Zoom, Pinterest, Discord, OpenAI, Twitch, Microsoft and Patreon.

It urges them to join the National Center for Missing and Exploited Children’s “Take it Down” program, which helps people remove nude or sexually explicit images or videos of children from online platforms, as well as the Revenge Porn Helpline’s “StopNCII” initiative, which helps adults remove explicit images that were shared online without their consent. Both programs let users create a unique numerical code for an image they want to remove, which participating platforms can then easily use to search their sites and remove the image.
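The matching mechanism described above can be sketched in a few lines. This is a simplified illustration only: it uses a SHA-256 digest as a stand-in for the fingerprinting these programs actually use (StopNCII, for example, relies on perceptual hashing so that near-identical copies also match), and the function and variable names are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in fingerprint: a SHA-256 digest of the raw bytes.
    # Real programs use perceptual hashes so modified copies can still match.
    return hashlib.sha256(image_bytes).hexdigest()

# The user submits only the numerical code (hash); the image itself
# never needs to be shared with the program or the platforms.
submitted_hashes = {fingerprint(b"victim-image-bytes")}

def should_remove(uploaded: bytes) -> bool:
    # Each participating platform checks content against the shared hash list.
    return fingerprint(uploaded) in submitted_hashes

print(should_remove(b"victim-image-bytes"))  # exact copy matches
print(should_remove(b"unrelated-image"))     # unrelated content does not
```

The key design point is privacy: because only the hash is shared, victims do not have to send the explicit image itself to anyone in order to have copies found and removed.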

“By increasing participation in these programs, companies can take actionable steps to stop the life-altering impact that the (non-consensual intimate imagery) has on the life, career and family of those affected,” the letter states. The letter was spearheaded by Democratic Sen. Jeanne Shaheen and Republican Sen. Rick Scott, and co-signed by eight other senators.

Most of the companies named in the letter have policies against the creation or sharing of non-consensual, explicit images, and in some cases offer their own ways for users to report or request the removal of such content. Google also recently announced changes that aim to keep such content from appearing near the top of search results.

But the benefit of joining the group is that users need to submit only one removal request that is directed to all the participating platforms, rather than having to contact each individual company one-by-one.

The fight to address non-consensual explicit images and deepfakes has received rare bipartisan support. A group of teens and parents who had been affected by AI-generated porn testified at a hearing on Capitol Hill, where Republican Sen. Ted Cruz introduced a bill — supported by Democratic Sen. Amy Klobuchar and others — that would make it a crime to publish such images and require social media platforms to remove them upon notice from victims.
