World

Democrats and Republicans have a common enemy: pornographic ‘deepfakes’

Last fall, Elliston Berry woke up to a barrage of text messages.

A classmate ripped photographs from her Instagram account, manipulated them into fake nude images, and shared them with other teens at her school. They remained online for nine months. She was 14 years old.

“I locked myself in my room, my academics suffered, and I was scared,” she said during a recent briefing.

She is not alone. So-called non-consensual intimate images (NCII) are often distributed as pornographic “deepfakes” created with artificial intelligence, which manipulates images of existing adult performers to resemble the victims. Targets include celebrities, lawmakers, middle and high schoolers, and millions of others.

There is no federal law that makes it a crime to generate or distribute such images.

Elliston and dozens of other victims, survivors, and their families are now urging Congress to pass a bill that would make NCII — whether real or created with artificial intelligence — a federal crime, with violators facing up to two years in prison.

The “Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act” — or TAKE IT DOWN Act — unanimously passed the Senate on December 3.

The legislation was attached to a broader bipartisan government funding bill with support from both Republican and Democratic members of Congress, including a final push from Ted Cruz and Amy Klobuchar — two senators who are rarely if ever on the same side of an issue.

Elon Musk’s X platform was even involved in lobbying efforts to support the legislation, along with other actions on child safety.

But his pressure campaign against that government funding bill prompted Republicans to strike the TAKE IT DOWN Act language from the measure altogether. Efforts to revive the spending bill did not include it.

Another measure from Rep. Alexandria Ocasio-Cortez would allow victims of “digital forgery” to file lawsuits to stop them. The congresswoman has spoken out about her own experience as a victim.

The “Disrupt Explicit Forged Images and Non-Consensual Edits” Act — or DEFIANCE Act — marked the first-ever attempt to ensure federal protections for targets of nonconsensual deepfakes.

That legislation passed the Senate this summer.

“Victims of nonconsensual pornographic deepfakes have waited too long for federal legislation to hold perpetrators accountable,” the congresswoman said in a statement at the time.

Source of information and images: “Independent”
