Just as COVID-19 is a stress test of every nation’s health system, an election process is a stress test of a nation’s information and communication system. A week away from the US presidential election, the symptoms are not so promising. News reports indicate that so-called “fake news,” disinformation, and conspiracy theories are thriving as they did in 2016.
Disinformation and “fake news” are not new, but the 2016 US presidential election placed the phenomenon squarely onto the international agenda. The spread of false and manipulated information dressed as news is closely associated with social media platforms such as Facebook, Twitter, and YouTube. In a 2018 study, researchers examined the exposure to misinformation during the American election campaign in 2016; they found that Facebook was a key vector of exposure to fake news.
It becomes harder to differentiate between false and trusted information when virtually anyone can publish and spread information online that looks like news to large groups of people. The spread of disinformation and conspiracy theories has been identified as a problem in several states, for example in Florida, and news publications such as the New York Times are tracking viral misinformation daily ahead of the 2020 election.
While disinformation and foreign influence were of great concern in the 2016 election, disinformation from domestic sources has additionally been reported as a major threat in the 2020 US election. The spread of fake news, rumors, and conspiracy theories is problematic in itself, but the main damage of such orchestrated campaigns might be the systematic erosion of citizens’ capacity to recognize facts, the undermining of established science, and the sowing of confusion about what is real or not.
The COVID-19 pandemic has demonstrated how a health situation dominated by uncertainty and the lack of a vaccine makes the rumor mill turn faster than ever. A new study by the Oxford Internet Institute and the Reuters Institute for the Study of Journalism reveals that coronavirus-related misinformation videos are predominantly disseminated through social media and that Facebook is the primary channel for sharing misinformation due to a lack of sufficient fact checks in place to moderate content. Another study found that one in four popular YouTube videos on the coronavirus contained misinformation, while more than 1,300 anti-vaccination pages on Facebook had nearly 100 million followers.
Countering disinformation and fake news has become such a major issue that international institutions such as the United Nations, the European Union, the World Health Organization, and the World Economic Forum have published reports and recommended actions for how to tackle disinformation, particularly electoral and health disinformation. In June 2020, more than 130 United Nations member countries and official observers called on all states to take steps to counter the spread of disinformation, especially during the COVID-19 pandemic.
Nevertheless, tackling false and manipulated information is far from straightforward. It requires a complicated balancing act between countering disinformation and protecting freedom of speech. A report from the United Nations Educational, Scientific, and Cultural Organization (UNESCO) recommends how to avoid sacrificing freedom of speech in the fight against fake news and disinformation. The report warns against “quick fixes” such as “‘fake news’ laws” and other measures to curtail viral disinformation, which may end up censoring legitimate journalism or legitimate criticism of authorities.
The UNESCO report suggests four main measures to identify and address fake news, disinformation, and misinformation—of particular concern during election campaigns. The report suggests a range of measures, from policy and legislative approaches to technological efforts and media literacy and education initiatives, in order to identify the problematic information, its producers, the distribution mechanism, and its targeted audience:
1. Identification responses (aimed at identifying, debunking, and exposing disinformation)
- Monitoring and fact-checking
During the US election, news media have conducted live fact checks of the presidential debates, and major hoaxes have been identified and debunked. But to identify and expose all disinformation spreading during the election campaign, particularly on social media, is hardly possible.
2. Responses aimed at producers and distributors (aimed at altering the environment that governs and shapes behavior, i.e. law and policy responses)
- Legislative, pre-legislative, and policy responses
- National and international counter-disinformation campaigns
- Electoral responses
Based on evaluations from independent fact checkers, Facebook and Twitter have labeled electoral disinformation, including posts from President Donald Trump.
3. Responses aimed at the production and distribution mechanisms (pertaining to the policies and practices of institutions mediating content)
- Curatorial responses
- Technical and algorithmic responses
- Economic responses
Bots, or automated Twitter accounts, are spreading disinformation and sowing division in America. A Carnegie Mellon University study found that nearly half of accounts tweeting about the coronavirus were likely bots, and in response Twitter has unveiled new labels that will accompany misleading, disputed, or unverified tweets about the coronavirus.
4. Responses aimed at the target audiences of disinformation campaigns
- Ethical and normative responses
- Educational responses
- Empowerment and credibility labelling efforts
Several organizations and groups offer training and tools for both citizens and journalists to strengthen their fact-checking and verification skills. One of them, the project First Draft, offers tools and training to build resistance against misinformation.
Electoral disinformation is of specific concern because it can damage democratic processes and reduce citizens’ rights. Electoral responses to disinformation can thus include a range of real-time detection, such as election-specific fact checks, election ad archives, as well as debunks, counter-content, and retrospective assessments. They can also entail campaigns linked to voter education and regulations about electoral conduct.
The health of a democracy’s information system is critical, especially during election campaigns. Applying some or all of these measures during the US election, as well as in other election campaigns in the near future, might make it possible to protect democratic elections from disinformation and increase citizens’ capacity to recognize facts.
Feature image by Kayla Velasquez