
Beware: Your Daughter’s Face Vulnerable to ‘Deepfake’ Porn



Francesca’s Life Changed Forever

Fourteen-year-old Francesca’s life has changed forever.

An email sent by her high school’s principal to her family on Oct. 20, 2023, notified them that Francesca was one of more than 30 students whose images had been digitally altered to appear as synthetic sexually explicit media, sometimes referred to as “deepfake” pornography.

Francesca Speaks Out

Speaking to the media, Francesca shared how betrayed she felt: “We need to do something about this because it’s not OK, and people are making it seem like it is.” She’s right. Something must be done.

The issue of image-based sexual abuse (IBSA), whether it takes the form of nonconsensual AI or “deepfake” content, nonconsensual recording or sharing of explicit imagery, extortion or blackmail based on images, recorded sexual abuse, or any of its many other manifestations, can feel like something that happens to “others.” It’s a headline we scroll past, distant from our own lives. But that’s far from the truth.

If anyone has ever taken a video or photo of you and posted it online, even an innocent family photo or professional headshot, that’s all it takes: you and your loved ones are personally at risk of having your images turned into sexually explicit “synthetic,” “nudified,” or “deepfake” content.

It doesn’t take a tech genius on the dark web to do this: the code and tools are free and open source, hosted on popular websites like Microsoft’s GitHub and shared widely online. In fact, GitHub hosts the source code for the software used to create 95 percent of sexual deepfakes, despite having been notified of the exploitative code by anti-exploitation advocates.

These kinds of images can be created in less time than it takes to brew a cup of coffee.

Even people who don’t know how to create these images can openly solicit and pay others to do it on websites like Reddit, where entire communities exist to trade and create nonconsensual explicit material.

And here’s the kicker: these images aren’t some sloppy Photoshop job pasting a face onto a body, a la the 1990s. Top executives at one of the most innovative technology companies in the world have told us that even they typically cannot tell whether an image is synthetic, artificially generated pornography. There is no easy watermark separating fake from real.

And of course, sexual imagery is often created and shared consensually between romantic partners, then distributed nonconsensually later, in what is sometimes called revenge pornography. A 2017 survey found that one in eight participants had been targets of such distribution, or threats of distribution, without their consent. That is to say nothing of the countless adult sex-trafficking and abuse survivors whose exploitation is recorded. These problems are also rampant within the pornography industry, as we’ve seen on Pornhub, XHamster, and other pornography sites.

Victims can face an uphill battle, and many face it alone: most victims of IBSA (73 percent) never turned to anyone for help. At most, you can contact the social media company and ask it to take the content down, with mixed results, or perhaps your state’s law could hold the person who uploaded the image liable. But this patchwork of remedies doesn’t go far enough.

Therefore, it is vital that we confront this issue head-on. Measures need to be implemented to combat image-based sexual abuse, including strict regulations against the creation, distribution, and consumption of deepfake pornography. Technology companies should also take responsibility by actively monitoring and removing such content from their platforms. Additionally, educational programs should be developed and promoted to raise awareness among young people and their families about the risks and consequences of image-based sexual abuse.

Francesca’s traumatic experience serves as a wake-up call for us all. It is our collective responsibility to protect ourselves and our loved ones from the horrors of image-based sexual abuse. Only through concerted efforts and a commitment to change can we create a safer digital environment for everyone. Let us not turn a blind eye any longer; let us take action now.



" Conservative News Daily does not always share or support the views and opinions expressed here; they are just those of the writer."
*As an Amazon Associate I earn from qualifying purchases

Related Articles

Sponsored Content
Back to top button
Available for Amazon Prime
Close

Adblock Detected

Please consider supporting us by disabling your ad blocker