Congress launches bipartisan effort against explicit AI deepfakes – Washington Examiner

A bipartisan group of lawmakers has introduced legislation to combat the distribution of nonconsensual, sexually explicit deepfakes on various websites. These deepfakes involve the use of artificial intelligence to create realistic and explicit depictions of individuals without their consent, including children. Advocacy groups have been pushing Congress to address the issue, and proposed legislation would hold parties accountable for distributing deepfake pornography and give victims the ability to seek financial damages. Republicans and Democrats have each put forward bills focused on criminalizing the distribution of nonconsensual pornography and creating federal civil remedies for victims. Some lawmakers are concerned that tech companies will use legal loopholes to avoid liability, particularly citing Section 230 of the Communications Decency Act. The push to rein in deepfakes comes as the election approaches, highlighting the need for legislative action to combat this growing crisis.




Inside Congress’s bipartisan effort against sexually explicit AI deepfakes

A bipartisan group of lawmakers has introduced legislation aimed at combating the host of websites distributing nonconsensual, sexually explicit deepfakes.

Over the last few months, Republicans and Democrats have floated multiple bills that would hold parties accountable for the distribution of deepfake pornography and give victims the ability to seek financial damages.

The term “deepfake” refers to images or videos that depict individuals in false situations, often using artificial intelligence. In 2019, a study from Deeptrace Labs found that, of all deepfake videos, 96% were nonconsensual pornography. With the use of these websites, anyone can use generative AI to develop realistic and explicit depictions of another person without their consent, even children. What once required hundreds of images and computer editing skills to create now requires one or two photos and a cellphone.

Currently, no federal laws stand in the way of these websites.

“There are now hundreds of apps that can make non-consensual, sexually explicit deep fakes right on your phone,” Senate Judiciary Chairman Dick Durbin (D-IL) told Politico in May. “Congress needs to address this growing crisis as quickly as possible.”

For over a year, advocacy groups have pushed Congress to act against these malicious websites as numerous deepfakes have surfaced of prominent public figures such as Taylor Swift and Rep. Alexandria Ocasio-Cortez (D-NY).

Personally affected herself, Ocasio-Cortez worked with Durbin to spearhead the DEFIANCE Act. The legislation aims to “stop the proliferation of nonconsensual, sexually-explicit deepfakes” by creating a “federal civil remedy” for victims. Lawsuits would be “enforceable against individuals who produced or possessed [deepfakes] with the intent to distribute” them against the will of the person depicted.

Republicans have taken a different yet complementary approach, focusing on the criminalization of these forgeries. Rep. Nancy Mace (R-SC) has introduced a pair of bills meant to increase fines for the distribution of nonconsensual pornography from $150,000 to $500,000.

Sens. Ted Cruz (R-TX) and Amy Klobuchar (D-MN) have proposed the bipartisan TAKE IT DOWN Act, which would “criminalize the publication” as well as the “threat to publish” nonconsensual AI deepfakes. The bill would also require websites and social media platforms to remove such content in an effort to minimize distribution.

Some lawmakers, however, worry that tech companies will invoke Section 230 of the Communications Decency Act as a loophole. The provision, passed in 1996, shields tech companies from liability for content posted by their users. Carrie Goldberg, an attorney known for representing many of Harvey Weinstein’s accusers, has called for repealing the section altogether, arguing that it puts the concerns of Big Tech above those of internet victims.

“The best way to handle so many harms that are happening on platforms is for the platforms to be themselves sharing in the costs and the liability,” Goldberg told the Hill.

As the election approaches, the legislative machinery has seemingly ground to a near standstill as many lawmakers focus on retaining their seats. When Durbin brought the DEFIANCE Act to the Senate floor, it was quickly blocked by Sen. Cynthia Lummis (R-WY), despite her support for the TAKE IT DOWN Act. Lummis worried that the bill’s language was too broad and could harm online privacy and innovation while ultimately failing to help victims.


Despite the inevitable head-butting, support from both sides of the aisle remains overwhelming.

“Victims of nonconsensual pornographic deepfakes have waited too long for federal legislation to hold perpetrators accountable,” Ocasio-Cortez said. “Congress needs to act to show victims that they won’t be left behind.”



" Conservative News Daily does not always share or support the views and opinions expressed here; they are just those of the writer."
*As an Amazon Associate I earn from qualifying purchases

Related Articles

Sponsored Content
Back to top button
Available for Amazon Prime
Close

Adblock Detected

Please consider supporting us by disabling your ad blocker