3 Min After New Account, Instagram Serves Sex Content to 13-Year-Old Boys: Analysis
C. Douglas Golden is a writer who splits his time between the United States and Southeast Asia. Specializing in political commentary and world affairs, he’s written for Conservative Tribune and The Western Journal for four years. Aside from politics, he enjoys spending time with his wife, literature (especially British comic novels and modern Japanese lit), indie rock, coffee, Formula One and football (of both American and world varieties).
Parents beware: Instagram is now the most insidious tool of the last sickening pangs of the sexual revolution.
That’s saying something, considering that Meta’s picture- and video-sharing social media platform was already known to be a digital hovel of sleaze and obscenity. Things are somehow worse than imagined, however, according to a new report from The Wall Street Journal. What’s worse, Meta apparently knows this thanks to its own internal research.
Tests run by the publication and Northeastern University computer science professor Laura Edelson set up accounts on Instagram with the age of the user listed as 13.
These accounts then “watched” the “Reels” — curated video clips fed to the user by the platform’s algorithm — recommended to them. The type of Reel served up to each account depended on which videos that account lingered on the longest.
“Instagram served a mix of videos that, from the start, included moderately racy content such as women dancing seductively or posing in positions that emphasized their breasts,” the June 20 report read.
“When the accounts skipped past other clips but watched those racy videos to completion, Reels recommended edgier content.”
“Adult sex-content creators began appearing in the feeds in as little as three minutes,” the report continued.
“After less than 20 minutes watching Reels, the test accounts’ feeds were dominated by promotions for such creators, some offering to send nude photos to users who engaged with their posts.”
WARNING: The following contains graphic descriptions and references that some readers may find disturbing.
What’s worse, it sometimes took as little as three minutes for accounts identified as belonging to 13-year-old boys to be served the inappropriate content. Examples of the Reels that were served up to the users included a woman in a tight red dress, another in lingerie with the caption “M@$turb@ting,” and another video of a woman taking off her panties with the caption “When he says he’s into small Asian white sophomore nursing students…”
This apparently aligns with tests done internally at Meta, which suggests that the company’s pledge for an “age-appropriate” experience for younger users isn’t being followed through on.
And it’s not as if this is impossible to achieve; similar tests with Snapchat and TikTok, two social media platforms with similar content models, did not reproduce the same barrage of sexually explicit content being served up to what the algorithm believed was a 13-year-old.
“All three platforms also say that there are differences in what content will be recommended to teens,” Edelson said.
“But even the adult experience on TikTok appears to have much less explicit content than the teen experience on Reels.”
Meta, which earlier this year announced new restrictions on Facebook and Instagram that would treat minors’ accounts drastically differently from adult users’, disputed the findings.
“This was an artificial experiment that doesn’t match the reality of how teens use Instagram,” said Meta spokesman Andy Stone.
“As part of our long-running work on youth issues, we established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months.”
However, there were two serious problems with Meta’s response.
The first is that, according to the Journal, this isn’t a problem unknown to Meta. In fact, the Journal’s results closely match the company’s own.
“Internal tests and analysis by Meta employees have identified similar problems, according to current and former staffers and documents viewed by the Journal that are part of previously undisclosed company research into young teens’ harmful experiences on Instagram,” the report read.
“In 2021, company safety staff ran tests similar to those of Edelson and the Journal, and came up with comparable results, former employees said.”
“A separate 2022 internal analysis reviewed by the Journal found that Meta has long known that Instagram shows more pornography, gore and hate speech to young users than to adults. Teens on Instagram reported exposure to bullying, violence and unwanted nudity at rates exceeding older users in company-run surveys, and the company’s statistics confirmed that the platform was disproportionately likely to serve children content that violated platform rules.”
The second is that this isn’t a problem that’s coming under control. Despite Meta’s announcement in January that sexually explicit content wouldn’t be shown to users under 16, period, full stop, these tests were run between January and April. Another test by the Journal in June, using the same methodology, had the algorithm repeatedly recommending videos featuring women and content involving anal sex after only a half-hour online.
So, we are to believe one of two things. One, the most powerful, most influential social media company in the world right now cannot control sexual content being served up to minors the same way that TikTok and Snapchat can. Two, they don’t want to, or don’t consider it an urgent enough priority.
But then, what were we to expect? This is why we protect adolescents from pornography — or at least, in prior generations, tried to. We knew that hormones drastically impeded their ability to tell the difference between right and wrong, sexually. Pleasure replaced intimacy. This was what second- and third-wave “sex-positive” feminism cheered on. Liberation! Our bodies, our freedoms!
And companies took advantage of that. It was more difficult back in the day of the newsstand and the Playboy issue — but with the advent of the internet, all you had to do was say you were 18 and porn giants would feed you whatever filth the primitive parts of your brain wanted at that moment. You were the monkey in the box, pressing the button for the addictive substance — until you were numb, insensate and still desperate for more.
Now, you don’t even have to do that. One of the largest corporations on earth will do it for you. And you don’t even have to say you’re 18 anymore — 13 will apparently do.
We pretended that teenagers — especially teenage boys — wouldn’t behave like this. Did we not know who they were? Were we not teenagers once ourselves? Beyond that, this “feminism” and “sexual freedom” has left these poor young women pursuing digital prostitution — and that is what this is, make no mistake — because men don’t protect them from this debauchery anymore. To do so would be repressing them.
So now, we encourage it. We platform it. Silicon Valley neatly packages it and puts it in an app and serves it up to 13-year-olds, kids barely removed from chasing fireflies and playing tag. This is what’s happening, and let’s not pretend that this is some sort of one-off haywire test gone awry. This has long been a problem and it’s not going away, presumably because the people behind Instagram don’t think it’s important enough to make it go away.
So either Meta can’t stop the tidal wave of degeneracy being force-fed to your 13-year-old boy, or it won’t — because it operates in a bleak moral climate where doing so would be antithetical to the values the sexual revolution has left our society with.
Both possibilities represent bleak tidings for society in general, and for parents of teens in particular. One, however, seems more believable than the other, especially given the resources and the algorithmic complexity of Meta’s platforms.
You make the call. It’s not a difficult one.
We are committed to truth and accuracy in all of our journalism. Read our editorial standards.