The Federalist

Deepfake Pornography Reveals Yet Another Risk Posed By Artificial Intelligence

At the end of January, popular Twitch live-streamer Brandon Ewing — better known as Atrioc — was exposed in a now-deleted post online for allegedly visiting a pornographic website specializing in the production of "deepfakes" of online personalities. Many of these personalities were his Twitch colleagues and personal friends.

Atrioc was accused of consuming pornography of his colleagues and friends that was generated through artificial intelligence (AI) technology without these individuals' consent. In an apology stream, Atrioc admitted to the accusations and claimed that he had found the source of the deepfakes through advertisements on another porn website. He also admitted that he had paid for videos in which AI was used to superimpose his friends' and colleagues' likenesses onto the bodies of pornographic actors.

The resounding uproar online, which originated with a video of Atrioc's apology that has amassed millions of views, appears justified. It is clear that he solicited and was willing to consume pornographic material in which the likenesses of his friends and colleagues were stolen and exploited.

It is a pernicious trend: AI technology is being used to create pornography without the consent or knowledge of the individuals depicted. Some online call it a form of "free speech" and suggest its existence does no harm, since the people pictured in the videos are often public figures whose likenesses are merely grafted onto sexual content without consent — they are not forced to actually participate in the activity depicted.

Deepfake pornography has been compared to the rise of "revenge porn." Revenge porn, the nonconsensual release of sexual materials to humiliate or exact revenge upon another person, mainly involves people exploiting past interactions that began with consent and intimacy.

Deepfake pornography, however, is created entirely from whole cloth using advanced pattern-recognition software. This software is capable of fabricating compromising situations in which the people depicted were never present. Revenge porn is obviously not a morally superior alternative, but the fact that such materials can now be generated by self-automating software ought to be deeply concerning.

Pursuing legal action is often impossible for people who have seen their likenesses exploited in this manner. Karen Hao of the MIT Technology Review wrote that "46 states have some ban on revenge porn, but only Virginia's and California's include faked and deepfaked media." She continued:

There are a few civil and criminal laws that might be applicable in certain situations. If a victim’s face is pulled from a copyrighted photo, it’s possible to use IP law. And if the victim can prove the perpetrator’s intent to harm, it’s possible to use harassment law. But gathering such evidence is often impossible … leaving no legal remedies for the vast majority of cases.

Deepfake pornography is akin to revenge porn in that both involve the dissemination of lewd and compromising materials without an individual's consent.

