Don’t look now (a heckuva way to begin a piece of writing!), but I may have come up with one solution to the incredibly complex and equally important national dilemma over regulating how gargantuan social media platforms like Facebook and Twitter handle Americans’ speech rights.
First, let me stipulate that I’m anything but an expert on the Constitution, on law and regulation of any kind (except maybe in the international trade field), or on technology. But maybe I know enough to have produced a plan outside-the-box enough to break the legal, political, and philosophical logjams that have left the nation with a status quo that seems to satisfy no one – a plan that’s nonetheless anchored in reality.
In addition, the thoughts below were prompted by a very stimulating panel discussion involving genuine experts in all these fields that took place this past weekend at a wide-ranging policy conference held by the Intercollegiate Studies Institute. (I spoke on a separate panel on China.) So my ideas aren’t coming completely out of the blue.
The nub of the problem is that Americans across the political spectrum are furious with the platforms’ speech policies, but for radically different reasons. Those to the left of center blast them for permitting the spread of what they view as misinformation. Their conservative counterparts claim that right-of-center views are too often censored – typically because they’re bogusly accused of spreading misinformation.
All sides seem to agree that the platforms’ practices matter greatly because, thanks largely to their algorithmic amplification, they have such power to turn material viral that they’ve achieved the massive scale needed to become a leading – and often the leading – way in which Americans receive news, opinion, and other forms of information that affect politics and public policy. But towering obstacles stand in the way of virtually every reform proposal advanced so far.
For example, their status as private companies would appear to block any move to empower government to influence their speech practices. Antitrust specialists disagree strongly over whether, under current or even proposed legal standards, they’re monopolistic or oligopolistic enough to warrant breaking up. The companies themselves, of course, deny any such allegations, and contend that if they were forced to downsize, they couldn’t compete effectively around the world with foreign counterparts – especially those from China. Some have proposed turning them into public utilities, but opponents call that a great way to stifle any further innovation.
So here’s my idea: Turn the platforms into a new type of entity subject to a new body of regulation. That regulation would reflect both the distinctive importance of free expression in American life and the distinctive (indeed predominant) role the platforms now play in enabling individuals and organizations to disseminate material – and to reach their potential audiences, an aspect of free expression rights that’s often overlooked, but that’s now unquestionably vulnerable given the main platforms’ sheer scale and reach. One possible name: Electronic Speech Companies (ESCs).
As history demonstrates, there’s nothing unusual about the federal government sorting private businesses into different categories for tax purposes, or about government at any level regulating certain businesses with an especially heavy hand because of their outsized role in providing vital goods and services. That much should be clear from the long-established practice of creating utilities. So I see no Constitutional problems with my idea.
I agree that government’s price-setting authority over utilities can stymie innovation. But ensuring that these entities don’t curb free expression any more than (legally) necessary (see below) wouldn’t require creating such authority. I’d permit ESCs to charge whatever they want for their services and to make money however they like (including by selling users’ personal information – which raises problems of its own, though ones unrelated to the speech issue). As is currently the case under the controversial Section 230 of the Communications Decency Act of 1996, they would remain liable for disseminating any content that’s already illegal under federal criminal law, intellectual property law, electronic communications privacy law, or (most recently) criminal and civil sex trafficking law.
I’d also make them subject to current libel law – meaning plaintiffs would need to prove that false and defamatory information had been spread knowingly or with reckless disregard for the truth. Could this rule mean that America’s already badly clogged courts would become even more clogged? Sure. So let’s also set up a separate court system to handle such cases. Since a dedicated tax court system already exists, why not?
Frivolous suits could be reduced with “loser pays” requirements for court costs. The Big Tech defendants would doubtless still hold a huge advantage, able to hire the very best legal minds and to drive up those costs by dragging out proceedings. But a number of legal non-profits have emerged over the years to help the little guys and gals in these situations, so maybe at least the most important and promising suits wouldn’t be deterred by financial considerations.
What the ESCs wouldn’t be permitted to do is bar, delete, or modify any content – or any users – on misinformation grounds. Advocates of continuing to permit, and even of further encouraging or requiring, such practices argue that the platforms’ vast scale demands greater discretionary (and sometimes mandatory) authority along these lines in the name of any number of good causes – election integrity, public safety, national security, and the like. (See, e.g., here.)
But three counter-arguments are more persuasive to me. First, I can’t imagine developing any legal definition of misinformation (as opposed to libel or other well-established Constitutional speech curbs) that would be substantively neutral, and that therefore wouldn’t be easy to abuse massively – to the great detriment of our democracy’s health, given the platforms’ scale.
Second, that’s no doubt why such regulation has no precedent in U.S. history, despite past periods and instances of intolerance dating back to the Alien and Sedition Acts of 1798.
Third, if the ESCs are going to be held liable for disseminating misinformation, what excuse will there be for maintaining protection for the rest of the news media? I’ve spent much of my multi-decade career in policy analysis finding instances of mainstream press coverage that would unmistakably qualify. Not that ongoing and arguably worsening conventional media irresponsibility is any cause for complacency. But would a government remedy for such an intrinsically nebulous offense really produce a net improvement?
Individual victims of ESC censorship would, however, need remedies for these forms of cancellation. As with libel and slander, a special court system could handle such accusations, using the aforementioned provisions aimed at leveling the legal-costs playing field. The Justice Department could file suits of its own, too – and some seem likely, if only because the Department’s political sympathies will inevitably shift as power in Washington changes hands over time. That prospect, moreover, should help keep the ESCs on their best behavior.
The big danger of my proposal, of course, is that misinformation would keep appearing and metastasizing online, and spreading like wildfire offline, thanks to the ESCs’ extraordinary reach. That can’t be healthy. But it’s surely unavoidable for anyone who values any meaningful version of free expression and its crucial corollary – the marketplace of ideas. For empowering a handful of immense ESCs to restrict misinformation threatens to narrow greatly, and even fatally, the competitive essence of that marketplace.
Throughout U.S. history, Americans have relied on these dynamics, and on the common sense of the public, to crown as winners the best ideas and the benefits they bring, and to dismiss as losers those that have caused or threatened serious harm. Is anyone out there seriously prepared to deny that the results, though imperfect, have been excellent by historical standards, or that the potential for improvement remains just as impressive – or to claim that any alternative yet proposed looks superior? If not, then I hope you’ll consider this ESC plan at least a promising framework for ensuring that these digital giants don’t become the ultimate arbiters of American public debate.