This Policy Paper is part of the Digital Platforms & Democracy Project’s efforts to explain and disseminate ideas about regulation of major technology and digital platform companies.
The views expressed in Shorenstein Center Policy Papers are those of the author(s) and do not necessarily reflect those of Harvard Kennedy School or of Harvard University.
Policy Papers have not undergone formal review and approval. Such papers are included in this series to elicit feedback and to encourage debate on important issues and challenges in media, politics and public policy. Copyright belongs to the author(s). Papers may be downloaded and shared for personal use under the Shorenstein Center’s Open Access Policy. Please contact the Shorenstein Center with any republication requests.
One hundred years ago, the eminent British economist Arthur Pigou identified the problem of externalities: the costs associated with the goods or services a business produces and sells that the business itself does not absorb. Classic examples of negative externalities are environmental pollution and the health effects of tobacco. Today, in addition to the carcinogenic effects of chemical runoff and first- and second-hand tobacco smoke, we must contend with a new problem: the poisoning of our democratic system through foreign influence campaigns, the intentional dissemination of misinformation, and incitements to violence, all inadvertently enabled by Facebook, YouTube, and our other major digital platform companies.
One proposed response from the policy community has focused on Section 230 of the Communications Decency Act. That 1996 statutory provision shelters the platforms’ editorial judgments from liability to an extent otherwise unknown in our jurisprudence: it effectively holds them harmless with respect to third-party content that they host or decline to host.
Advocates for the platform companies claim the provision is indispensable to the successful, socially beneficial operation of their businesses. It is clear, however, that Congress was not motivated by a desire to spare platform companies the operating costs associated with responsible editing. Rather, the intention was to encourage responsible editing by reducing the danger of potentially crippling damage claims.
Over the last few years, the production of proposals to adjust (or not adjust) Section 230 has progressed from the artisanal to the industrial; the supply has flooded the market. This response reflects an unpleasant and undeniable reality: the business models of the major platform companies, for all the good they produce, also enable exceptionally malign activities. And experience shows that the companies have not made sufficient investments to eliminate or reduce these negative externalities.
This is a reality that has been apparent for years.
I have had the opportunity and obligation to attend to the geopolitical consequences of the rapid global growth of American big tech companies. During the first Obama Administration, I served as the U.S. ambassador for international communications and information policy, a position that entailed interaction with government officials whose countries were experiencing the benefits and, as they saw it, the costs of newly arrived, U.S.-produced electronic services.
From Berlin to New Delhi, it often seemed that there was considerably more concern about the costs than the benefits. The concerns were variegated, ranging from general anxieties about privacy to specific instances of Internet-assisted fomenting of sectarian homicide.
But as differentiated as the concerns were, there was one reality that unified them. It was clear that the major U.S. platform companies had underinvested in controlling the unintended and unwanted effects of their services. In other words, there was inadequate attention to the negative externalities that the services were producing.
The inattention was reflected in part by a shortage of company officials who understood the laws, culture, and other salient circumstances of the affected countries. There simply weren’t enough people who could advise the platform companies about the ways in which a business model and service that reflected U.S. laws and customs would need to be modified to accommodate the relevant concerns of a host country. At the time, I thought the principal explanation was the unforeseeably rapid international growth of Google, Facebook, and other services. There were, of course, less pleasant potential explanations, chief among them a lack of sensitivity to legitimate differences among cultures and a disinclination to invest enough to reduce the externalities.
In the intervening years, the major U.S. platform companies have overcome these shortfalls in representation in the countries in which they operate. They now have sophisticated analysts addressing global cultural and geopolitical realities, and equally sophisticated local interlocutors available to address the issues and anxieties that local government officials bring to their attention.
What they still do not appear to have is sufficient investment in ameliorating the problems that flow from their fundamental business models. And that is true universally: the platforms convey the valuable and the destructive, the benign and the malign, with equal efficiency, not just abroad but in the U.S. as well.
In the one hundred years since Professor Pigou identified the phenomenon of externalities, two principal remedial approaches have emerged: society can seek to tax the activity, or it can seek to control it through specific commands. Neither of these approaches, alone or in combination, will be perfect, but it is clear that they can provide a measure of amelioration. To cite two recent proposals addressing the negative externalities thrown off by the platform companies: Paul Romer, the 2018 Nobel laureate, has proposed using taxes (https://www.nytimes.com/2019/05/06/opinion/tax-facebook-google.html), while Paul Barrett of the Stern Center at NYU has proposed command devices (https://issuu.com/nyusterncenterforbusinessandhumanri/docs/nyu_election_2020_report?fr=sY2QzYzI0MjMwMA).
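To make the tax mechanism concrete, a textbook sketch may help. This is standard welfare economics rather than part of the author’s argument, and the symbols q, PMC, SMC, and MEC are introduced here purely for illustration: let q denote the level of the harmful activity, PMC(q) the producer’s private marginal cost, and MEC(q) the marginal external cost the activity imposes on others. Then

\[
SMC(q) = PMC(q) + MEC(q), \qquad t^{*} = MEC(q^{*}),
\]

where q* is the socially efficient level of the activity. A Pigouvian tax of t* per unit raises the producer’s effective marginal cost to PMC(q) + t*, leading it to internalize the externality and scale the activity back toward q*. The command approach reaches for the same result directly, by mandating the behavior rather than pricing it.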
Either of these approaches could be used to induce additional investment—additional effort—by the platform companies. Both deserve serious consideration. But in the interim, a more modest approach might produce some amelioration.
Any member of Congress, without the need for a vote, can request that the Congressional Research Service report on the extent to which the platforms are making expenditures on prevention. Acknowledging the limitations inherent in mining the platform companies’ financial accounting records, a compilation of reported expenditures over the last three years nevertheless would provide a starting point for considering the underinvestment issue. It would inform Congress about the possible need to modify Section 230 or take other remedial action, and the fact of the study itself might reinforce the platforms’ incentives to invest in reducing negative externalities.