Is technology the only way to solve technology-driven misinformation?

Mar 24, 2021

Technology-driven misinformation has become so powerful that it is impossible to rely on individuals to sift through the ‘facts’. Is technology-driven truth detection our only way out?

The apocryphal King Canute story has over the centuries morphed into a legend which shows the futility of trying to stop the tide.

In fact, Canute was trying to demonstrate to his courtiers (and perhaps his god) his humility and his recognition of the limits to his power. The latter-day legend, in contrast, is an example of misinformation promulgated over centuries.

There has been much misinformation promoted over the centuries. For the English and Australians, for instance, misinformation about the impact of monarchy on political stability has morphed into a legend beloved of royalists. It is a legend which conveniently ignores Charles I, Harold and William the Conqueror, Richard III and assorted other depositions and civil wars.

We need to interrogate many sources – ancient chronicles, archaeology, partial archives, folk memories (Guy Fawkes for example) – to find and evaluate the misinformation which has shaped many of our most enduring beliefs.

But today all we need to do is go online, read a Murdoch newspaper, listen to an LNP backbencher, or parse a Scott Morrison speech or media grab to be overwhelmed by a massive tide of misinformation.

Academics at the George Mason University Center for Climate Change Communication (4C) are seeking not only to inoculate us against conspiracy theories and this tide of misinformation but also to develop a systematic approach, the 4D Project, which can help combat it.

They don’t pretend it’s easy. The project leader, Dr John Cook, says: “Misinformation is a multi-faceted problem, influencing society at political, social, technological, and psychological levels. Therefore, effective responses to misinformation require multi-disciplinary approaches.”

The aim is to develop the ‘holy grail of fact-checking’: “systems that automatically detect and neutralize misinformation about important science-based topics and issues. This can only be achieved by synthesizing research findings from computer science, political science, philosophy, psychology, and communication.”

Interestingly, such an approach also underpins the most effective public relations methodologies, although those often don’t have aims as high-minded as the 4D Project’s.

“The 4D Project will synthesize four lines of research: Detection (automatically detecting online misinformation); Deconstruction (identifying the exact nature of the misinformation); Debunking (implementing proven refutation approaches); and Deployment (inoculating and debunking in a variety of social contexts),” 4C said in announcing the project. 
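For readers wondering what the ‘Detection’ step might look like in practice, the sketch below is a deliberately minimal, hypothetical illustration: a toy text classifier trained on a handful of invented, labelled claims. It assumes Python with scikit-learn, and it does not represent the 4D Project’s actual systems, which are far more sophisticated and combine detection with the other three steps.

```python
# A purely illustrative sketch of automated "Detection": a toy text classifier.
# The claims and labels below are invented for illustration only and are not
# drawn from the 4D Project. Requires scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: claims labelled 1 (misinformation) or 0 (accurate).
claims = [
    "There is no scientific consensus on human-caused global warming",
    "Climate has always changed, so humans cannot be the cause",
    "Around 97% of publishing climate scientists agree humans are warming the planet",
    "Rising CO2 from fossil fuels is the dominant driver of recent warming",
]
labels = [1, 1, 0, 0]

# Bag-of-words features plus a linear classifier: the simplest possible baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(claims, labels)

# Score a new claim: the output is the estimated probability it is misinformation.
print(model.predict_proba(["Global temperatures stopped rising in 1998"])[0, 1])
```

Even this toy example shows why detection alone is not enough: a score on a single sentence says nothing about why a claim is misleading or how to refute it, which is where the deconstruction, debunking and deployment steps come in.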

One of the related outputs from the Project and the work on conspiracy theories (see Part 1) is a Debunking Handbook, which was published in 2020 (see Part 3).

To start the project, the 4D team carried out an experimental exploration of the way “public misconceptions about climate change can lead to lowered acceptance of the reality of climate change and lowered support for mitigation policies.”

It looked at the impact of misinformation about climate change and tested several pre-emptive interventions designed to reduce its influence. The team “found that false-balance media coverage (giving contrarian views equal voice with climate scientists) lowered perceived consensus overall, although the effect was greater among free-market supporters.”

“Likewise, misinformation that confuses people about the level of scientific agreement regarding anthropogenic global warming (AGW) had a polarizing effect, with free-market supporters reducing their acceptance of AGW and those with low free-market support increasing their acceptance of AGW.

“However, we found that inoculating messages that explain the flawed argumentation technique used in the misinformation or that highlight the scientific consensus on climate change were effective in neutralizing those adverse effects of misinformation”, the Project reported.

The urgency of the project is illustrated not only by the success of conspiracy theories and other misinformation but also by some recent research into political polarisation.

In a paper in PNAS, The objectivity illusion and voter polarization in the 2016 presidential election, Michael C. Schwalbe, Geoffrey L. Cohen, and Lee D. Ross find that “political polarization increasingly threatens democratic institutions” and that the “belief that ‘my side’ sees the world objectively while the ‘other side’ sees it through the lens of its biases contributes to this political polarization and accompanying animus and distrust”.

Well, that’s not news, you might say, but part of the value of the research lies in its predictive power.

The belief is known as the ‘objectivity illusion,’ and the paper said it “was strong and persistent among Trump and Clinton supporters in the weeks before the 2016 presidential election.”

The authors show that this ‘objectivity illusion’ can predict subsequent bias and polarization, such as partisan assessments of the presidential debates.

Most disturbingly, a follow-up study showed that “both groups impugned the objectivity of a putative blog author supporting the opposition candidate and saw supporters of that opposing candidate as evil.” When partisanship becomes a question of good and evil we get the Capitol insurrection and attempted coup.

Another approach has had a modest beginning but is having some targeted impact: the Pro-Truth Pledge project, founded by Dr Gleb Tsipursky and Agnes Vishnevkin. Tsipursky is the author of The Truth-Seeker’s Handbook: A Science-Based Guide, and he has also launched a Pro-Truth Movement.

Now, most Australians of a certain age will associate taking the pledge with giving up drinking. While it’s not as powerful a predictor of behaviour as a bout of pancreatitis, pledging has proved successful, as AA and others have shown.

In the US, chastity pledges (whatever you may think of them) and college honour pledges have also had some success.

Tsipursky also discusses some of the neuroscientific research which illuminates the problem, looking at the illusory truth effect, cognitive biases and the narrative fallacy.

The Pledge principles also highlight factors which determine whether someone lies or not such as: “defining clear parameters of what constitutes truth-oriented behaviour; belonging to a community of truth-oriented individuals; getting positive reputational and social status rewards from truth-oriented behaviours that bear costs and/or feel uncomfortable, such as admitting one’s mistakes; and, associating positive emotions and values with truthfulness.”

They also hope that “politicians who are warned that they will be held accountable for truthfulness are less likely to lie”, but on the US and Australian records that’s debatable.
