Big Tech Is The New Big Tobacco

When it comes to the harmful effects social media is having on young people, Clare Morell and fellow researchers at the Institute for Family Studies and the Ethics and Public Policy Center see the writing on the wall — and it’s a devastating story.

“If we don’t take action soon, I do really think we’re going to see a public mental health crisis among the teens and kids who are growing up on social media today,” says Morell, co-author of the new report, “Protecting Teens from Big Tech: Five Policy Ideas for the States.”

Increasingly, the data is clear that these social media apps cause anxiety, depression, self-harm, eating disorders, and suicide, Morell says. “We’re going to see an epidemic — and it’s already starting — of online pornography addiction, and what that means for the future of our country is the destabilization of marriages and families. I don’t think it’s inappropriate to say without taking any action, within a few years, within one generation, we could be headed toward a civilizational crisis, like in Japan, where the birth rate has fallen below replacement.”

The authors warn in the report:

One day, we will look back at social media companies like ByteDance (TikTok) and Meta (Facebook and Instagram) and compare them to tobacco companies like Philip Morris (Marlboro) and R.J. Reynolds (Camel). For a time, Big Tobacco enjoyed immense profits and popularity. But eventually, Big Tobacco’s culpability in causing immense physical harm to Americans — and in trying to obscure the science regarding that harm — became known. They were eventually held accountable for their deceptive advertising to children using “Joe Camel.” We are living at a moment when we are just learning of the social and psychological harms of social media, and of Big Tech’s efforts to obscure those harms from the public.

Federal-Level Failures

Morell says the new report was prompted not only by the research she and fellow scholars were already doing, but also by “a growing desire by different states and state legislators to do something about this issue.”

The report points to the federal government’s inaction on the increasingly urgent problem of kids and social media as the impetus for states to take matters into their own hands. The authors write that while national indecency laws have limited harmful content in motion pictures and on television, “Federal law has not focused on the unique disruption to children’s psychological development that social media’s pervasive presence appears to cause.”

The Child Online Protection Act (COPA) of 1998 sought to require age verification for minors visiting sites with obscene content, but after several rounds of litigation, the law never took effect.

The Children’s Online Privacy Protection Act (COPPA) of 1998 is, in theory, supposed to allow parents to control the interaction between websites (which now include social media platforms) and children, but due to several loopholes, it fails. “In fact,” the authors write, “because it preempts state torts in the area of children’s online privacy, it is arguably worse than nothing.”

Section 230 of the Communications Decency Act, passed in 1996, was also intended to protect children online, but Morell says the provision has been “all carrot and no stick,” because it “empowers companies to take down lewd content without liability for those decisions, giving them immunity and protection to moderate that type of content — but there’s no corresponding legal duty or penalty to make them take it down. They know these types of things are proliferating on their platforms, kids are seeing it, and there is protection if they do decide to be Good Samaritans and remove it, but pornography and obscene content is what’s most sensational and keeps people engaged on these platforms — it’s how companies sell more ads and make more revenue, so they have no incentive to do anything about it.”

Legislative Solutions

Morell and her co-authors advocate that Congress update COPPA and Section 230.

“We need to hold these companies accountable for not removing content that is objectionable,” Morell says. “The Trump administration said there should be a ‘Bad Samaritan’ carveout, meaning if you’re a company just allowing criminal content to circulate, then you shouldn’t get the immunity that Section 230 provides.”

Companies aren’t being held accountable for failing to keep kids off platforms, the authors note. To date, companies have had little incentive to require robust age verification because they have not been held liable for minors under age 13 being on their platforms. That is because COPPA currently covers only platforms that have “actual knowledge” that users are underage, one of the most demanding legal liability standards and one that is nearly impossible to prove in court. If Congress changed COPPA’s standard to “constructive knowledge,” platforms would be responsible for what they “should know,” given the nature of their business and the information they already collect from their users.

States Can Require Parents to Be Involved   

It is, of course, the parents’ responsibility to oversee their child’s activities, but parents can’t be everywhere all the time and are sometimes simply unaware of the harmful forces influencing their children. What’s more, even parents who would like to monitor their children’s online interactions more closely are deterred by the cost of privacy-control software (which often falls short anyway).

“The emphasis [of our proposals] is to empower parents to protect their kids,” Morell says. “Certainly there are uninvolved parents out there, and that’s part of the reason that we propose the solutions that we do. We want to protect all children, whether their parents are taking an active role or not, by requiring parents to be involved. If a child says, ‘Mom, I need you to put in your information to create this account,’ parents would have to play a part. If companies require a parent on an account with a child, the parent will see friend requests that come in, bad actors, who their kids are interacting with, posts they’re seeing, and so forth.”

Morell and her fellow policy experts list five actions that states can take now while they wait for Congress to enact more rigorous requirements for online companies: mandating robust age verification for social media platforms by requiring a driver’s license, a credit card number, or another form of identification to create an account; requiring parental consent for minors under 18 to open a social media account; mandating full parental access to minors’ social media accounts; requiring social media companies to shut down access to their platforms for all 13- to 17-year-olds’ accounts during bedtime hours (generally 10:30 p.m. – 6:30 a.m.); and including a private cause of action enabling parents to bring lawsuits against tech companies on behalf of their children for any violation of the law.

How It Would Work, Practically Speaking

Morell expects internet companies’ dislike of “patchwork state laws” to work to the benefit of parents and children, since changing how they do business in one state would likely spur them to adopt the same policies nationwide.

Morell adds that the private cause of action clause is key, because, “If individual parents are empowered to bring a private lawsuit against these tech companies for violating the law, that could be very costly to their business, and they would take that seriously. [Private cause of action suits] are one of the most effective means of enforcing laws.”

Morell says the proposed state laws are “novel” and have been created by taking legal precedents used in other settings and applying them to protecting kids online. The Texas Public Policy Foundation also recently called for banning social media for minors.

As parents wait for lawmakers to hold tech companies accountable for profiting off the vulnerable minds of children, it is critical that parents monitor their children’s online activities with special scrutiny and impress upon fellow parents the vital importance of remaining vigilant.


Teresa Mull is an assistant editor of Spectator World and writes from the Pennsylvania Wilds.
