The Epoch Times

Safety experts warn against parents posting children’s photos online.

With children spending an increasing amount of time on the internet and many uploading photos to their social media accounts, sexual predators continue to steal these images to produce child sexual abuse material (CSAM).

Easy access to artificial intelligence (AI) is further compounding the proliferation of CSAM, and law enforcement agencies and child protection organizations are seeing a dramatic rise in AI-generated CSAM.

Yaron Litwin is a digital safety expert and chief marketing officer at Netspark, the company behind a program called CaseScan that identifies AI-generated CSAM online, aiding law enforcement agencies in their investigations.

Mr. Litwin told The Epoch Times he recommends that parents and teens not post photos on any public forum and that parents talk to their children about the potential dangers of revealing personal information online.

“One of our recommendations is to be a little more cautious with images that are being posted online and really try to keep those within closed networks, where there are only people that you know,” Mr. Litwin said.

The American Academy of Child and Adolescent Psychiatry said in 2020 that on average, children ages 8 to 12 spend four to six hours a day watching or using screens, and teens spend up to nine hours a day on their devices.

Exploitative Content Expanding

The amount of CSAM online has grown exponentially since generative AI became mainstream at the start of 2023, Mr. Litwin said. The problem is serious enough that all 50 states have asked Congress to establish a commission to study AI-generated CSAM, he said.

“There’s definitely a correlation between the increase in AI-generated CSAM and when OpenAI and DALL-E and all these generative AI-type platforms launched,” Mr. Litwin said.

The FBI recently warned the public about the rise of AI-generated sexual abuse materials.

“Malicious actors use content manipulation technologies and services to exploit photos and videos—typically captured from an individual’s social media account, open internet, or requested from the victim—into sexually-themed images that appear true-to-life in likeness to a victim, then circulate them on social media, public forums, or pornographic websites,” the FBI said in a recent statement.

Mr. Litwin said that to truly protect children from online predators, it is important for parents and guardians to clearly discuss the potential dangers of posting photos and talking to strangers online.

An artificial intelligence (AI) logo blended with four fake Twitter accounts bearing profile pictures apparently generated by AI software, photographed in Helsinki, Finland, on June 12, 2023. (Olivier Morin/AFP via Getty Images)

Dangers of Generative AI

Roo Powell, founder of Safe from Online Sex Abuse (SOSA), told The Epoch Times that because predators can use the image of a fully clothed child to create an explicit image using AI, it is best not to post any images of children online, even as toddlers.

“At SOSA, we encourage parents not to publicly share images or videos of their children in diapers or having a bath. Even though their genitals may technically be covered, perpetrators can save this content for their own gratification, or can use AI to make it explicit and then share that widely,” Ms. Powell said in an email.

While some people say AI-generated CSAM is not as harmful as images depicting the sexual abuse of real-life children, many believe it is worse.

AI-generated CSAM can be produced much more quickly than conventional images, inundating law enforcement with even more abuse referrals, and experts in the AI and online parental control space expect the problem to only get worse.

In other cases, AI-generated CSAM is created from a photo taken from a real child’s social media account and altered to be sexually explicit, endangering otherwise unvictimized children as well as their parents.

In worst-case scenarios, bad actors use images of real victims of child sexual abuse as a base for computer-generated images: the original photograph serves as the initial input, which is then altered according to prompts.

In some cases, the photo’s subject can be made to look younger or older.

Using Good AI to Fight Bad AI

Before AI became widely available, editing or generating images required skill and familiarity with image-editing software. AI has made the process so quick and easy that even amateur users can generate lifelike images.

Netspark is leading the fight against AI-generated CSAM with CaseScan, its own AI-powered cyber safety tool, Mr. Litwin said.

“We definitely believe that to fight the bad AI, it’s going to come through AI for good, and that’s where we’re focused,” he said.

Law enforcement agencies must go through massive numbers of images each day and are often unable to get through all of the CSAM reports in a timely manner, Mr. Litwin said, but this is exactly where CaseScan can assist investigators.

Unless police departments use AI-centered solutions, investigators spend an extensive amount of time assessing whether the child in a photo is a fake generated by AI or an actual victim of sexual abuse. Even before AI-generated content, law enforcement and child safety organizations were overwhelmed by the immense volume of CSAM reports.

Under U.S. law, AI-generated CSAM is treated the same as CSAM of real-life children, but Mr. Litwin said he does not know of any AI-generated CSAM case that has been prosecuted, so there is no precedent yet.

“I think today it’s hard to take to court, it’s hard to create robust cases. And my guess is that that’s one reason that we’re seeing so much of it,” he said.

To prosecute the



" Conservative News Daily does not always share or support the views and opinions expressed here; they are just those of the writer."
*As an Amazon Associate I earn from qualifying purchases

Related Articles

Sponsored Content
Back to top button
Available for Amazon Prime
Close

Adblock Detected

Please consider supporting us by disabling your ad blocker