Conservative News Daily

Warning: Report Exposes High Levels of Sexual Content Shown to Minors on Instagram

The report highlighted concerns about Instagram’s algorithm, which appears to aggressively recommend sexual content to accounts registered as 13-year-olds within minutes of logging in. This phenomenon was documented by a joint examination from The Wall Street Journal and a professor from Northeastern University through tests conducted on dummy Instagram accounts. Despite past announcements from Meta (Instagram’s parent company) to limit such content for minors, this investigation suggests these commitments are either ineffective or unimplemented, with Instagram’s content recommendation system showing rapid and concerning exposure to sexual content compared to other platforms like Snapchat and TikTok.


A disturbing new report is shedding some potentially damning light on one of the world’s most popular social media platforms.

In a joint effort with Laura Edelson, a computer science professor at Northeastern University, The Wall Street Journal reported that the Meta-owned Instagram has an odd propensity to aggressively push sexual content to users as young as 13 years old.

“Instagram regularly recommends sexual videos to accounts for teenagers that appear interested in racy content, and does so within minutes of when they first log in,” the news outlet reported.

WSJ collected a number of examples, and they go well beyond what one might expect from “Instagram models.”

WARNING: The following contains descriptions that some readers will find offensive or disturbing.


Some examples of the sexual content WSJ reported on included:

  • A video of a woman pulling her underwear off (from a first-person perspective, so you see nothing but feet and underwear), alongside a caption of “When he says he’s into small Asian white sophomore nursing students.”
  • Two images making coded references to masturbation (alongside an image of a woman in lingerie) and unprotected sex (alongside an image focused on a woman’s buttocks), respectively.
  • A short video of a woman in a skirt provocatively spreading her legs (nothing is shown) to the caption “I’ve learned that 40 is…”
  • A young woman in lingerie taking a selfie on a bed.
  • A provocatively zoomed-in image of a woman in a red dress.

Edelson and the WSJ investigated this concerning phenomenon by creating new, dummy Instagram accounts and setting the profile age to 13. Those dummy accounts would then simply "watch" Instagram Reels, the platform’s algorithmically curated short-video feed. These tests were performed primarily from January through April, and the Journal conducted an additional test in June.

“Instagram served a mix of videos that, from the start, included moderately racy content such as women dancing seductively or posing in positions that emphasized their breasts,” The Journal reported. “When the accounts skipped past other clips but watched those racy videos to completion, Reels recommended edgier content.


“Adult sex-content creators began appearing in the feeds in as little as three minutes.

“After less than 20 minutes watching Reels, the test accounts’ feeds were dominated by promotions for such creators, some offering to send nude photos to users who engaged with their posts.”

That is a swift and aggressive escalation of sexually charged offerings.

For Instagram parent company Meta (formerly Facebook), these are familiar issues.

As CNBC noted back in January, Meta announced that it would restrict the sort of content that minors see on its platforms. Those promises came as the company faced scrutiny over claims that its platforms are addictive and harmful to young people’s mental health.


Those restrictions, if they were ever actually put in place, clearly aren’t working, and it very much appears to be a Meta problem.

As the Wall Street Journal noted, similar tests conducted on Snapchat and TikTok did not yield results anywhere near as aggressive as Instagram’s.

Meta, for its part, shrugged off this report as much ado about nothing.

“This was an artificial experiment that doesn’t match the reality of how teens use Instagram,” Meta representative Andy Stone told the outlet.

Even if these findings are the nothingburger that Stone makes them out to be, they should still serve as a sobering reminder of how important it is for parents to be aware of what sort of content their children are consuming.



Tags:
Children, Instagram, Meta, Pornography, Social media



Bryan Chai has written news and sports for The Western Journal for more than five years and has produced more than 1,300 stories. He specializes in the NBA and NFL as well as politics. He graduated with a BA in Creative Writing from the University of Arizona. He is an avid fan of sports, video games, politics and debate.


