Is Big Tech Bum Rushing The Supreme Court On Censorship?
NetChoice v. Paxton—the lawsuit that may determine the fate of free speech on social media platforms—has taken a dramatic turn. Just short of two weeks ago, the large platforms—including the likes of Amazon, Google, Twitter, and Facebook, all acting through their trade group, NetChoice—made an “emergency application” to Justice Samuel Alito.
This sort of application is familiar in cases involving grave harm, such as an execution. But is there really any risk of such harm or other emergency in this case? Or are the platforms trying to bum rush the Supreme Court so as to sidestep the ordinary course of judicial inquiry? The Supreme Court needs to be careful that it is not being manipulated.
Big Tech Doesn’t Like Texas’s Law
The case arises out of the Texas free speech statute that bars the largest social media platforms from discriminating on the basis of viewpoint. In response, the platforms claim their censorship of speech is protected by the First Amendment.
Texas counters that they are common carriers, which serve as conduits for other people’s speech, and so can be barred from discriminating on the basis of viewpoint. In other words, the platforms are not being restricted in their own speech, but only barred from discriminating against the speech of others that they carry in their conduits.
These are complex questions, and even the slightest hint from the Supreme Court as to its answers will have outsize implications in the courts below. It therefore is disturbing that the platforms, speaking through NetChoice, have asked the court to take a position in a rushed “emergency application.” Such portentous questions should not be decided in a hurry. So why do the platforms want them resolved in proceedings that were briefed on only a few days’ notice?
Already in the district court, proceedings were accelerated, because the platforms sought a preliminary injunction. And because the platforms were seeking to bar the Texas attorney general from enforcing the statute, there was no discovery on a key question for the constitutionality of the statute—namely, whether the platforms have been cooperating with government to censor Americans. Such coordination is very dangerous, and states have a compelling interest in preventing it.
Unequal Legal Battlefield
The platforms are among the wealthiest litigants in the world. Their money flows through almost all major law firms—so widely that it is difficult to find a major law firm to represent the other side without a conflict of interest. The litigation in NetChoice v. Paxton has therefore always been imbalanced. One side had well-lawyered think tanks to offer amicus briefs; on the other side, individuals scrambled to find small firms willing to file briefs pro bono and dug into personal savings to pay printing costs.
Nonetheless, after extensive briefing, the Fifth Circuit held against the platforms. It ordered a stay of the district court’s preliminary injunction barring the Texas attorney general from enforcing the statute. Two days afterward, on a Friday, the platforms, through NetChoice, made their emergency application to the Supreme Court to vacate the stay.
Of course, the platforms had plenty of amicus briefs lined up within the deadline a few days later. Many amici on the other side, in contrast, did not even learn about the application until it was too late to file. Or they struggled to get a brief done, only to find that they could not secure pro bono counsel for filing in the Supreme Court on such short notice. How convenient.
Is This Really an Emergency?
An emergency application to the Supreme Court is ordinarily justified for real emergencies—to stop irreparable harms. Indeed, NetChoice argues that if the platforms were subjected to the Texas statute’s anti-discrimination requirement, they would suffer a “substantial threat of irreparable harm.” So, is there really an emergency or irreparable harm that requires interference by the Supreme Court in the Fifth Circuit’s decision?
The platforms claim that if the Texas attorney general can enforce the statute, they will have to spend billions to comply or “face the threat of, at minimum, ‘daily penalties.’” That sounds dire. But it is not quite true.
In fact, the Texas anti-discrimination statute permits the attorney general or private parties to seek only injunctions or declaratory judgments. It doesn’t authorize any damages or other penalty for the platforms’ discriminatory censorship. Penalties come into play only if, after a trial, a court enters judgment against the platforms and they refuse to comply.
The statute thus is the mildest of laws. It imposes no legal consequences on the platforms for disobedience until after a trial and judgment. Even then, the only consequence is an order to comply. Its penalties are merely for failure to obey a court order.
If the platforms believe that following HB 20 will hurt their business, they are under no immediate compulsion to comply. If they win their challenge to the law, the platforms will be vindicated. If they lose, they will have to change their behavior and comply going forward, but they will face no consequence for their prior failure to comply.
In other words, the only risk for the platforms is that they will have their claims decided in ordinary and orderly court proceedings. That is not irreparable harm, let alone an emergency.
Why Claim an Emergency If It’s Not?
So why claim an emergency? Why use implausible arguments of irreparable harm to get an emergency order from the Supreme Court?
That’s where it gets interesting. The effect of the platforms’ strategy is to rush the Supreme Court into deciding the constitutional questions without the full briefing and arguments that the questions deserve. The special proceedings supplant careful inquiry and reasoning with demands that the court act quickly, even perhaps spasmodically.
Just why the platforms might want to avoid more careful consideration of the issues is apparent from their emergency application to the Supreme Court. It repeatedly makes claims that are at best questionable.
Tech Companies’ False and Questionable Claims
The emergency application claims the Texas anti-discrimination statute would prevent the platforms from removing pornography and spam. But this is simply untrue. The statute bars viewpoint discrimination, not content discrimination, and so leaves the platforms free to remove porn and spam.
The statute applies only to platforms with more than 50 million monthly active users in the United States, and the platforms claim this number is arbitrary in excluding smaller platforms and that there is “no . . . legitimate reason” for it. But, as is abundantly clear from earlier briefing in the case, the focus on the largest platforms ensures that the statute applies only to what are genuine communications common carriers. Such carriers, which serve as conduits for other people’s speech, can be defined functionally or by market dominance, and the size qualification ensures that the statute meets both definitions.
The platforms’ application also claims that the statute’s definition of social media platforms “is content based, because it excludes certain websites based on content—like news, sports, and entertainment.” But that statement is grossly misleading. In fact, as the emergency application notes elsewhere, the statute excludes services that consist “primarily of news, sports, entertainment, or other information or content that is not user generated.”
Thus, the statute’s recitation of “news, sports, and entertainment” information is just illustrative. The statute goes on to exclude services that provide any other type of information or content that is not user-generated. So the claim of a content-based definition of the platforms is simply not true.
The platforms even claim they have an expressive right to exclude some colors of opinion, because this exclusion is an expression of the platforms’ views. Similarly, railroads once thought they were constitutionally protected in excluding some colors of persons. Undoubtedly, this exclusion expressed their views. But conduits or common carriers do not have an expressive right to discriminate.
Topping it off, the platforms say that Florida’s law against tech censorship is a “similar” law, even though the two statutes are very different. The point is to create confusion so that the Court will attribute the poorly drafted Florida fiasco to Texas.
Big Tech Appears Scared of the Truth Coming Out
Why would the platforms make so many crudely erroneous or at least misleading statements to the Supreme Court?
The answer lies in what appears to be their strategy—to bum rush the Supreme Court. The claims of “irreparable harm” and an “emergency” have no basis in fact. Because the relevant portions of the statute are without damages remedies, the platforms could easily just wait for private claims against them to wind their way through the regular processes of the courts, with time for detailed inquiry. But that is precisely what the platforms cannot afford.
And it is not just that their arguments are weak; they also surely are worried about discovery. If by claiming an “emergency,” they can elicit discouraging words from the court about the Texas statute, the platforms can escape the in-depth discovery that would otherwise occur in private suits against the platforms. For example, they can avoid discovery about their cooperation with government in censoring Americans.
So they seek accelerated special proceedings—to avoid what eventually would be full discovery and to avoid careful scrutiny of their erroneous claims. And to get those fast-moving proceedings, they make specious claims of irreparable harm.
It would be shameful for the Supreme Court to decide the constitutional questions in these circumstances. Emergency proceedings are no substitute for careful deliberation. What is needed is the ordinary and orderly due process of the law.
Philip Hamburger is the Maurice and Hilda Friedman professor of law at Columbia Law School, and the president of the New Civil Liberties Alliance.