Actions We Can Take to Address Misinformation and Safeguard the Freedom of Speech
Nicholas A. Ashford

Following my March 29, 2021 op-ed in The New York Times, I have given much thought to addressing the societal challenge presented by misinformation and disinformation. Faulty information and outright lies – about stolen elections, the anthropogenic causes of and urgent need to address global climate change, all things Covid-19 (its existence, the efficacy of vaccines and masking, and the promotion of dangerous and unproven therapies), the inflationary consequences of funding a larger safety net, and much more – are an increasingly prominent part of public discourse.
Finding a pathway to resolve this disinformation crisis is complicated, given the political and legal complexities of curtailing First Amendment free speech protections and of expanding government antitrust activity against the media giants. In my op-ed, I recommended using citizen juries, commissions, and a revitalized and expanded federal fairness doctrine. Law professor Michael P. Vandenbergh of Vanderbilt University has proposed a related approach in his law review article entitled “Social Checks and Balances: A Private Fairness Doctrine.” His work has since greatly influenced my thinking.
My decades of scholarly work at MIT have concentrated on promoting government regulation of health, safety, and the environment, primarily in the chemical, pharmaceutical, and automotive industries. I have dismissed industrial self-regulation as too little, too late. However, private regulation is not the same as self-regulation, and it minimizes the pitfalls and conflicts inherent in regulation by a reluctant government, or by one captured by the very industry it is supposed to regulate. Before opining on a revitalized and expanded private fairness doctrine, with credit to the seminal work of Professor Vandenbergh, I offer some additional commentary on the current dilemma.
In general, the U.S. Constitution safeguards the freedom of speech from government interference, and lawmakers are understandably reluctant to intervene. Yet politicians are increasingly concerned about the growing influence of online platforms. Both the House and the Senate are considering legislation that would revise Section 230 of the Communications Decency Act, which currently shields so-called technology companies from liability for the material published on their platforms. Facebook has been advocating the law’s reform.
The platform industry (Facebook, Google, Twitter, etc.) is also facing congressional scrutiny for potential antitrust violations. President Biden appointed Lina Khan, an antitrust scholar and an advocate for reforming antitrust law, as the new head of the Federal Trade Commission, with the expectation that she would strengthen antitrust regulation and enforcement, especially with regard to the platform giants.
But false and damaging information is not just an online problem. It’s also evident in broadcast, cable, and print media. As politicians debate whether or how to regulate technology companies, they should also consider addressing the dangers implicit in allowing and enabling the spread of misinformation, wherever and however it’s published.
And it is not at all clear that reducing the dominance of technology companies through traditional antitrust regulation and enforcement will go far enough. Having a larger number of smaller companies to scrutinize could make the task of reducing misinformation even more difficult. Traditional antitrust regulation and enforcement, as viewed by the courts, are heavily influenced by a concern for the effects of industry concentration on markets and innovation, a perspective dominated by the University of Chicago, my alma mater in law and economics. My critique of that perspective, as well as of Lina Khan’s, is that a focus on economic and innovation effects, or on the economic exploitation of excessive market power, overlooks much of the problem. Concentrated economic power gives rise to concentrated political power, which has an overwhelming effect on agenda-setting, policy implementation, and legislation. It is not at all clear that the courts will reflect these concerns.
And oversight boards run by the tech companies themselves, such as the one that Facebook created to hear issues of online safety and free speech, are not sufficient. Those efforts can never be truly independent if they are assembled by, and financially tied to, the very companies they are tasked with overseeing. Furthermore, addressing only the platform industry is not a cure-all. Misinformation spread on one medium is reinforced and amplified by falsehoods spread on another. A catchy phrase based on a lie and spread via Facebook, Instagram, and Twitter – “stop the steal,” for example – becomes fortified when it is picked up by television and radio reporters or commentators, whose coverage gives it a whiff of legitimacy; it then reappears on social media, fueling a tornado of misinformation.
In 2020, the non-profit Washington League for Increased Transparency and Ethics filed a lawsuit against the Fox News Network, alleging ongoing violations of the Washington Consumer Protection Act in its coverage of Covid-19. In general, the complaint asserted that the “pervasive campaign of misinformation and deception” perpetuated by Fox News downplayed the threat, and even the reality, of the disease. These misrepresentations, it was alleged, caused “widespread confusion” and persuaded people to ignore the risks and to disregard the federal and state government warnings and directives designed to stop the spread. Likewise, it claimed that the deception, which Fox News continues to televise, caused the public “to fail to take appropriate action to protect themselves and others from the disease, mitigate its spread, and contributed to a public health crisis.” The lawsuit was dismissed, perhaps foreclosing any prospect that the courts will reinterpret the First Amendment in the near future.
Decades ago, long before there was a technology platform industry to regulate, the Federal Communications Commission (FCC) instituted the fairness doctrine, a policy that required radio and television stations to present diverse points of view on controversial topics. The policy, which was designed to ensure that all sides of an issue be presented fully and fairly, was eliminated by the FCC in 1987, under President Ronald Reagan, and the implementing rule was removed from the Federal Register in August 2011.
I argue here that what we need is a new private fairness doctrine, premised on the public’s right to be fully informed, rather than on the government controlling free speech or regulating natural or otherwise beneficial monopolies.
Professor Michael Vandenbergh has recommended the creation of a private multi-stakeholder organization to provide independent oversight of mis- and disinformation in online, broadcast, cable, and print media. Through independent auditors – perhaps akin to citizen juries – this non-governmental body would certify and grade the performance of the news media. It would be populated by independent and respected experts, appointed by the government, and presumably funded by a tax on the industry. The certification system would not require participation by individual news organizations, nor could they escape scrutiny under it. A certification scheme would elicit market and social pressure, thereby creating competition for better performance. The ultimate objective of certification should not be “balance of views,” but rather the extent to which accuracy and completeness are achieved in the coverage and presentation of news.
Licensing schemes could reflect the outcomes of certification. It is worth remembering that both Fox News and MSNBC were founded after the 1987 demise of the original fairness doctrine. One wonders whether they would have survived in their current form had the doctrine remained in effect and been applied to cable media.
Certification schemes may present a second-best approach to the problems discussed in this essay, but they may be the best that is achievable in today’s climate. The stakeholders who might manage the system may be more responsive than politicians, and the results could serve as a proxy for a social license to operate. Displaying the results of certification could affect what people read, listen to, and watch – and affect advertisers as well. Certification pressures can drive improvements, even if they do not affect licensure.
Public trust in the media industry has been declining for years. Rigorous fact-checking, along with broad and complete coverage of the issues important to news consumers, can help restore that trust. While accuracy has received the most attention, completeness in presenting “the whole picture” deserves equal emphasis.
Psychology, behavioral science, and neuroscience have helped us understand why people are susceptible to misinformation and what influences how they view facts. Individuals gravitate toward news sources that reinforce their prior impressions, values, and opinions – a tendency known as confirmation bias, or “anchoring” to prior views. Exposing people to more balanced sources might help expand their perspectives, but the science tells us that it can also strengthen existing beliefs. The purveyors of faulty information need to be confronted with – and not allowed to escape responding to – opposing views and facts, in the manner seen in some (but too few) media interviews and in cross-examination in legal proceedings. Merely presenting all sides of an issue, by contrast, may be passively received and may not change anyone’s views.
The original fairness doctrine required media companies to present alternative points of view on sensitive issues. A reimagined and expanded version of this policy could enable an independent body to review inaccurate and incomplete material and induce technology platforms and the print, cable, and broadcast media to publish and respond to criticism.
I am persuaded that government proposals to break up big tech and platform companies into several smaller ones, or to reform Section 230 of the Communications Decency Act by removing platforms’ immunity from liability, will not solve the misinformation problem. But increased fact-checking by independent bodies and requirements to present more reliable perspectives will help. Because of the reinforcing influence one medium has on another, completeness and full scrutiny of ideas must be encouraged in the platform, cable, and broadcast industries, as well as in print media.
There is clearly a need for more accountability in both the private sector and the government regarding the proliferation of mis- and disinformation. A new private fairness doctrine, coupled with independent oversight of the news industries, would help.