Is Clarence Thomas Right About Big Tech?

Today, EPPC President Ryan T. Anderson and Faulkner University law professor Adam J. MacLeod published a thought-provoking article at National Review walking through some of the arguments that Supreme Court Justice Clarence Thomas made concerning online governance and content moderation in his concurrence released alongside the court’s decision in Biden v. Knight First Amendment Institute at Columbia University. The court ruled the case moot and vacated the lower court’s decision. The case, originally titled Trump v. Knight, was recaptioned after the inauguration of Joseph R. Biden as president of the United States, since it revolved around the question of a president’s ability to block members of the public from a social media account.

I will say at the outset that I am not a lawyer or an expert in constitutional law. I am an ethicist who has been studying these issues of online governance from a Christian perspective for a number of years. Overall, I appreciate their engagement on these issues but think a few areas of their argument for regulating the technology industry need refinement. Briefly, I want to share a few questions and concerns, primarily focused on how this good-faith approach may end up doing more harm than good.

Online governance and content moderation

First, I am incredibly thankful for the engagement by fellow conservatives with these complex issues of online governance and content moderation. In the last few years, there have been far too few conservatives and people of faith interested in engaging the issues that affect the public square. I have sat in countless rooms with experts and lawyers from across the ideological spectrum, often as one of the few socially conservative voices in the room. But recently, there have been growing concerns from the political right about the outsized influence and power of major technology companies like Facebook, Twitter, Google, and Amazon to regulate the flow of public discourse, especially on some of the most contentious issues like hate speech, gender/sexuality issues, violence, harassment, free speech, and the suppression of religious expression.

The authors raise some salient points about the market dominance of many of these companies and their ability to effectively silence certain ideological viewpoints on their platforms, most notably Amazon’s complete removal of Anderson’s book on transgenderism from its marketplace. As I have previously written, many of the content moderation policies are ill-informed, dangerously broad, and easily misinterpreted by those who serve our communities by moderating content each day. This stems from a lack of real diversity in developing policies, which by and large tend to relegate religious perspectives to purely private, personal matters without any standing on public affairs or governance.

Common carrier debate

Second, one of the biggest questions that needs to be asked in these debates is whether—or more importantly in what way—the government should intervene in these matters. In his concurrence, Justice Thomas lays out two different scenarios for regulating this industry: applying common carrier provisions and treating these companies as public accommodations. Anderson and MacLeod pick up on these scenarios and put forth a thoughtful argument for applying common carrier law to Big Tech as it has historically been applied to telecommunications networks. The authors note, “In many respects, Big Technology companies are like common carriers, such as telegraph and telephone companies, a legal status that entails a duty to serve everyone.” But are they really analogous?

One area missing from their argument is a caveat that Justice Thomas himself rightly raises about applying that designation to technology companies today. While Justice Thomas argues that the “common carrier” designation has been applied to other industries with considerable market size, such as those in transportation and communication, he also acknowledges that “applying old doctrines to new digital platforms is rarely straightforward.” Contra Anderson and MacLeod’s argument, social media networks and platforms like Amazon are not merely carriers of public information.

These platforms curate the information presented to users, and—to the consternation of many—you do not see every bit of content posted by those you follow or friend in real time, if at all. Each of these companies, particularly the social media platforms, employs a sophisticated (and at times problematic) algorithm to personalize the experience for each user. These algorithms are one of the major issues in the current debate over fighting online extremism, conspiracy theories, and the rise of misinformation. Nick Clegg, the vice president of global affairs at Facebook, recently explained that there is a complex relationship between the algorithm and our personal content choices online. While I don’t agree with every element of Clegg’s argument, he nevertheless illustrates that these companies are not simply a conduit for information exchange by the public, which a common carrier designation would seem to imply.

Section 230 and immunity

Third, another point raised near the end of the piece is the debate over a controversial 26 words of the 1996 Communications Decency Act, which have become ground zero for both misunderstanding and considerable debate over online governance. The authors describe Section 230 by saying it “immunizes website platforms from liability for certain third-party content,” and they use this to support their argument, contending that this immunity pushes the companies beyond mere public accommodations toward common carriers. But that is, in my opinion, an incomplete reading of Section 230.

Under these protections, internet companies and platforms are encouraged to act in “good faith” and to remove content that is “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.” While the debate over 230 centers on what exactly “good faith” and “otherwise objectionable” mean in this context, the statute specifically calls for companies and platforms to moderate content. Without this liability shield, these companies might abandon all content moderation or over-enforce their content standards. This means more content may very well come down as a result of 230 reform, because the companies understandably desire to protect themselves from litigation.

This is one of the questions at the heart of the debate over Section 230’s usefulness today, which has led to many proposals to mend 230 rather than simply remove those protections or add some type of neutrality clause—an addition that raises the question of what constitutes neutrality in the first place. Some, particularly on the political left, argue that the companies simply aren’t doing enough to combat these issues and bear significant, if not total, responsibility for the state of the digital public square. Others, particularly on the political right, argue in line with Anderson and MacLeod that the companies are doing too much moderation of content based on ideological positions at odds with certain political perspectives. But to argue that the companies simply have immunity, without giving the context for the moderation of content, obscures the complexity of the debate over 230.

I agree with the authors that “American law has never allowed private businesses to do whatever they want, and some regulation of Big Tech companies can be justified consistent with natural law and natural rights.” The question isn’t whether reform is needed, but what type and by what method. My main concern is that the common carrier designation, combined with a lack of substantive engagement with Section 230, does not seem to provide a solid basis for regulating this industry. Obviously, there are many questions to answer and debates to be had, but I hope that many more conservatives will honestly address the complexity of these debates rather than simply relying on social media’s megaphone to blast messages into the ether. We need more thoughtful participants from all sides of the public debate to enter into conversation with one another and with the industry. These issues are not insurmountable, but they will require a sustained presence and engagement with the terms of the debate as they are, rather than as we simply wish them to be.