Altay (2022)
Sacha Altay
Department of Political Science, University of Zurich, Switzerland.
Abstract
Efforts to combat misinformation have intensified in recent years. In parallel, our scientific
understanding of misinformation and of our information ecosystem has improved. Here, I
propose ways to improve interventions against misinformation based on this growing body of
knowledge. First, because misinformation consumption is minimal and news consumption is
low, more interventions should aim at increasing the uptake of reliable information. Second,
because most people distrust unreliable sources but fail to sufficiently trust reliable sources,
there is more room to improve trust in reliable sources than to reduce trust in unreliable sources.
Third, because misinformation is largely a symptom of deeper socio-political problems,
interventions should try to address these root causes, such as by reducing partisan animosity.
Fourth, because a small number of powerful individuals give misinformation most of its
visibility, interventions should try to target these ‘superspreaders’. Fifth, because false
information is not necessarily harmful and true information can be used in misleading ways,
misleadingness should take precedence over veracity in defining misinformation.
Policymakers, journalists, and researchers would benefit from considering these arguments
when thinking about the problem of misinformation and how to tackle it.
People are overly skeptical
c) People distrust unreliable news outlets but fail to trust reliable news outlets enough.
C) There is much more room to improve trust in reliable sources than to reduce trust in unreliable sources.
d) People underuse social information and are more likely to reject reliable information than to accept misinformation.
D) Instead of teaching laypeople to be more critical, it would be more fruitful to make them more trusting.
Misinformation is a symptom
e) Counter-discourses fostered by misinformation attract people with low trust in institutions or high partisan animosity.
E) There will be misinformation as long as people do not trust institutions and are affectively polarized.
f) Misinformation is more prevalent in countries with weak democratic institutions and high political polarization.
F) Building trust in institutions and reducing partisan animosity could help address some of the root causes.
Misinformation often comes from the top
g) A small number of powerful individuals give misinformation most of its visibility.
G) Efforts to combat misinformation should not rest primarily on laypeople's shoulders and need to target superspreaders.
False information is not necessarily harmful
h) False information can be used in non-misleading ways while true information can be used in misleading ways.
H) Interventions against misinformation should focus on misleading statements rather than false statements per se.
Table 1. The main arguments are laid out in column one, the rationale behind the
arguments in column two, and the implications of the arguments for interventions
against misinformation in column three.
2. Five arguments on misinformation
3.2. Nudges
One of the most widely used nudges against misinformation by social media companies in
recent years is friction. It has been implemented in various ways, but the core principle is to
increase the number of clicks required to share content, such as by asking users whether they
want to read the article they want to share before sharing it. Friction is very effective at deterring
sharing [52] and requires only minimal tweaks to the design of social media platforms. The
main problem is that friction risks reducing not only the sharing of misinformation, but also the
sharing of reliable information. And since people are more exposed to reliable information than
to misinformation, friction will mostly reduce the sharing of reliable information. This is
concerning given that, unlike the negative effects of misinformation, the positive effects
of access to reliable information are well documented [66]. It is thus important to develop and
test forms of friction that specifically target unreliable news. But since friction will inevitably
reduce sharing in general, efforts should be made to compensate for this decrease, notably by
promoting the sharing of news from reliable sources on social media—which might be
particularly important for reaching those least interested in the news, who get most of their
news incidentally while scrolling on social media [67].
The most famous nudge against misinformation in the scientific literature is the accuracy
nudge. Simply asking participants to rate the accuracy of headlines before sharing them reduces
their willingness to share false news and increases (to a smaller extent) their willingness to
share true news [68]. The accuracy nudge is easy to implement and has been shown to be
effective across countries [69]. Yet, to be effective outside of experimental settings, and in the
long run, it needs to be paired with other interventions to minimize habituation effects [70].
Finally, source labels providing information about the reliability of an outlet have been
developed by some companies (e.g. NewsGuard) and have been embraced by some search
engines (e.g. Microsoft Edge). Source labels are easy to implement but evidence for their
effectiveness is mixed [71]. Instead of labeling unreliable sources—which most people already
distrust—it could be more fruitful to label mainstream media sources—which people fail to
trust as much as fact-checkers do.
5. Conclusion
To be clear, I do not argue that efforts to curb the spread of misinformation should be reduced,
or that current interventions against misinformation are futile. In various experimental settings,
fact-checking, nudges, and media literacy training have proven to be effective. However, the
effect sizes are rather small and often short-lived, their ecological validity is limited, and their
generalizability to the Global South is uncertain. Yet, I do not dispute that fact-checking,
nudges, and media literacy training can help. Instead, this contribution should be taken as an
invitation to think differently about the problem of misinformation. In particular, policymakers,
journalists, and researchers should not consider misinformation in isolation, e.g. as a purely
cognitive problem, but in the context of people’s overall low trust in the media and partisan
animosity.
The main limitation of this article is that it mostly relies on evidence from Western
democracies. On the one hand, some arguments made in this article weaken when applied to
the Global South. In particular, many have expressed concern that misinformation
consumption is higher in the Global South than in Western democracies. Thus, everything else
being equal, interventions against misinformation may be more effective in the Global South.
Moreover, increasing trust in the news would be futile in countries where access to high-quality
news is difficult or heavily restricted, and it would be counterproductive in countries with low-
quality media ecosystems.
On the other hand, some arguments strengthen when applied to the Global South. For
instance, promoting reliable information may be particularly fruitful in countries with low-
quality media ecosystems, where separating reliable information from unreliable content may
be harder and require more skill. Importantly, the need for systemic solutions is greater in some
Global South countries, where people have good reasons to distrust institutions, believe in
conspiracy theories and adhere to counter-discourses, given, among other things, the higher
levels of corruption and lower levels of press freedom [103].
In conclusion, I hope this article will help interventions against misinformation adapt to the
growing body of knowledge on misinformation and our information ecosystem. First,
misinformation consumption is minimal in Western democracies, but many people around the
world avoid the news and are not interested in politics. Many hold false beliefs because they
are uninformed rather than misinformed, stressing the importance of promoting reliable
information. Second, people are not gullible – if anything, they tend to be overly skeptical.
Instead of trying to make people more critical, it would be more fruitful to build trust in reliable
sources. Third, in Western democracies people hardly ever share misinformation publicly
online, but the minority of active users who regularly share it have deep-seated motivations that
will be difficult to address. Similarly, individual-level interventions like digital literacy
programs will not be enough to fight misinformation in the Global South; systemic
interventions are needed to address the root causes of the problem. Fourth, harmful
misinformation often comes from powerful actors. Efforts to combat misinformation should
not rest primarily on laypeople's shoulders and need to target superspreaders as well. Fifth, false
information is not necessarily harmful and true information can be used in misleading ways; it
is therefore paramount to tackle misleading information, regardless of its truthfulness. Overall,
more efforts should be devoted to promoting reliable information, building trust in reliable
sources, cultivating interest in news and politics, increasing elites’ accountability and reducing
partisan animosity.
References
[1] Newman N, Fletcher R, Schulz A, et al. Digital news report 2021. Reuters Institute for
the Study of Journalism.
[2] Allen J, Howland B, Mobius M, et al. Evaluating the fake news problem at the scale of
the information ecosystem. Science Advances 2020; 6: eaay3539.
[3] Altay S, Berriche M, Acerbi A. Misinformation on misinformation: Conceptual and
methodological challenges. Social Media + Society 2023; 9: 20563051221150412.
[4] Ziemer C-T, Rothmund T. Psychological underpinnings of disinformation
countermeasures: A systematic scoping review.
[5] Gwiaździński P, Gundersen AB, Piksa M, et al. Psychological interventions
countering misinformation in social media: A scoping review. Frontiers in Psychiatry 2023;
13: 974782.
[6] Roozenbeek J, Culloty E, Suiter J. Countering Misinformation. European
Psychologist.
[7] Guay B, Pennycook G, Rand D. How To Think About Whether Misinformation
Interventions Work.
[8] Modirrousta-Galian A, Higham PA. Gamified inoculation interventions do not
improve discrimination between true and fake news: Reanalyzing existing research with
receiver operating characteristic analysis. Journal of Experimental Psychology: General.
[9] Altay S, Nielsen RK, Fletcher R. Quantifying the “infodemic”: People turned to
trustworthy news outlets during the 2020 pandemic. Journal of Quantitative Description:
Digital Media. Epub ahead of print 2022. DOI: https://doi.org/10.51685/jqd.2022.020.
[10] Cordonier L, Brest A. How do the French inform themselves on the Internet? Analysis
of online information and disinformation behaviors. Fondation Descartes,
https://hal.archives-ouvertes.fr/hal-03167734/document (2021).
[11] Grinberg N, Joseph K, Friedland L, et al. Fake news on Twitter during the 2016 US
Presidential election. Science 2019; 363: 374–378.
[12] Guess A, Nagler J, Tucker J. Less than you think: Prevalence and predictors of fake
news dissemination on Facebook. Science Advances 2019; 5: eaau4586.
[13] Osmundsen M, Bor A, Vahlstrup PB, et al. Partisan polarization is the primary
psychological motivation behind political fake news sharing on Twitter. American Political
Science Review 2021; 1–17.
[14] Allen J, Mobius M, Rothschild DM, et al. Research note: Examining potential bias in
large-scale censored data. Harvard Kennedy School Misinformation Review.
[15] Guess A, Aslett K, Tucker J, et al. Cracking Open the News Feed: Exploring What US
Facebook Users See and Share with Large-Scale Platform Data. Journal of Quantitative
Description: Digital Media; 1. Epub ahead of print 2021. DOI:
https://doi.org/10.51685/jqd.2021.006.
[16] Newman N, Fletcher R, Robertson C, et al. Reuters Institute digital news report 2022.
Reuters Institute for the Study of Journalism.
[17] Wojcieszak M, de Leeuw S, Menchen-Trevino E, et al. No Polarization From Partisan
News: Over-Time Evidence From Trace Data. The International Journal of Press/Politics
2021; 19401612211047190.
[18] Acerbi A, Altay S, Mercier H. Research note: Fighting misinformation or fighting for
information? Harvard Kennedy School (HKS) Misinformation Review. Epub ahead of print
2022. DOI: https://doi.org/10.37016/mr-2020-87.
[19] Clayton K, Blair S, Busam JA, et al. Real solutions for fake news? Measuring the
effectiveness of general warnings and fact-check tags in reducing belief in false stories on
social media. Political Behavior 2020; 42: 1073–1095.
[20] Guess A, Lerner M, Lyons B, et al. A digital media literacy intervention increases
discernment between mainstream and false news in the United States and India. Proceedings
of the National Academy of Sciences 2020; 117: 15536–15545.
[21] Fletcher R, Nielsen RK. Generalised scepticism: how people navigate news on social
media. Information, Communication & Society 2019; 22: 1751–1769.
[22] Pennycook G, Rand DG. Fighting misinformation on social media using
crowdsourced judgments of news source quality. Proceedings of the National Academy of
Sciences 2019; 116: 2521–2526.
[23] Schulz A, Fletcher R, Popescu M. Are News Outlets Viewed in the Same Way by
Experts and the Public? A Comparison across 23 European Countries. Reuters Institute
factsheet, https://reutersinstitute.politics.ox.ac.uk/are-news-outlets-viewed-same-way-experts-
and-public-comparison-across-23-european-countries (2020).
[24] Batailler C, Brannon SM, Teas PE, et al. A signal detection approach to understanding
the identification of fake news. Perspectives on Psychological Science 2022; 17: 78–98.
[25] Bryanov K, Vziatysheva V. Determinants of individuals’ belief in fake news: A
scoping review determinants of belief in fake news. PLoS one 2021; 16: e0253717.
[26] Pfänder J, Altay S. Spotting Fake News and Doubting True News: A Meta-Analysis of
News Judgements.
[27] Jahanbakhsh F, Zhang AX, Berinsky AJ, et al. Exploring lightweight interventions at
posting time to reduce the sharing of misinformation on social media. Proceedings of the
ACM on Human-Computer Interaction 2021; 5: 1–42.
[28] Morin O, Jacquet PO, Vaesen K, et al. Social information use and social information
waste. Philosophical Transactions of the Royal Society B 2021; 376: 20200052.
[29] Taber CS, Lodge M. Motivated skepticism in the evaluation of political beliefs.
American Journal of Political Science 2006; 50: 755–769.
[30] Mercier H. Not Born Yesterday : The Science of Who We Trust and What We Believe.
Princeton University Press, 2020.
[31] Boyd D. Did media literacy backfire? Journal of Applied Youth Studies 2017; 1: 83–
89.
[32] Mihailidis P. Beyond cynicism: How media literacy can make students more engaged
citizens. University of Maryland, College Park, 2008.
[33] Marwick A, Partin WC. Constructing Alternative Facts: Populist Expertise and the
QAnon Conspiracy. New Media & Society. Epub ahead of print 2022. DOI:
https://doi.org/10.1177/14614448221090201.
[34] Holt K, Ustad Figenschou T, Frischlich L. Key dimensions of alternative news media.
Digital Journalism 2019; 7: 860–869.
[35] Drążkiewicz E. Study conspiracy theories with compassion. Nature 2022; 603: 765–
765.
[36] Zimmermann F, Kohring M. Mistrust, disinforming news, and vote choice: A panel
survey on the origins and consequences of believing disinformation in the 2017 German
parliamentary election. Political Communication 2020; 37: 215–237.
[37] Freeman D, Waite F, Rosebrock L, et al. Coronavirus conspiracy beliefs, mistrust, and
compliance with government guidelines in England. Psychological medicine 2022; 52: 251–
263.
[38] Guess A, Nyhan B, Reifler J. Exposure to untrustworthy websites in the 2016 US
election. Nature human behaviour 2020; 4: 472–480.
[39] Lyons B, Montgomery J, Reifler J. Partisanship and older Americans’ engagement
with dubious political news. Psyarxiv. Epub ahead of print 2023. DOI: 10.31219/osf.io/etb89.
[40] Boyadjian J. Désinformation, non-information ou sur-information? [Disinformation, non-information, or over-information?] Réseaux 2020; 21–
52.
[41] Bennett L, Livingston S. A Brief History of the Disinformation Age: Information
Wars and the Decline of Institutional Authority. The disinformation age. Epub ahead of print
2020. DOI: 10.1017/9781108914628.001.
[42] Nyhan B. Why the backfire effect does not explain the durability of political
misperceptions. Proceedings of the National Academy of Sciences; 118.
[43] CCDH. The disinformation dozen. 2021,
https://www.theguardian.com/world/2021/jul/17/covid-misinformation-conspiracy-theories-
ccdh-report (2021).
[44] Tsfati Y, Boomgaarden H, Strömbäck J, et al. Causes and consequences of
mainstream media dissemination of fake news: literature review and synthesis. Annals of the
International Communication Association 2020; 1–17.
[45] Altay S, Nielsen RK, Fletcher R. The impact of news media and digital platform use
on awareness of and belief in COVID-19 misinformation. Epub ahead of print 2022. DOI:
10.31234/osf.io/7tm3s.
[46] Søe SO. A unified account of information, misinformation, and disinformation.
Synthese 2021; 198: 5929–5949.
[47] Boyd D, Crawford K. Critical questions for big data: Provocations for a cultural,
technological, and scholarly phenomenon. Information, communication & society 2012; 15:
662–679.
[48] Walter N, Murphy ST. How to unring the bell: A meta-analytic approach to correction
of misinformation. Communication Monographs 2018; 85: 423–441.
[49] Wood T, Porter E. The elusive backfire effect: Mass attitudes’ steadfast factual
adherence. Political Behavior 2019; 41: 135–163.
[50] Carey JM, Guess AM, Loewen PJ, et al. The ephemeral effects of fact-checks on
COVID-19 misperceptions in the United States, Great Britain and Canada. Nature Human
Behaviour 2022; 1–8.
[51] Berlinski N, Doyle M, Guess AM, et al. The effects of unsubstantiated claims of voter
fraud on confidence in elections. Journal of Experimental Political Science 2021; 1–16.
[52] Barrera O, Guriev S, Henry E, et al. Facts, alternative facts, and fact checking in times
of post-truth politics. Journal of Public Economics 2020; 182: 104123.
[53] Porter E, Wood TJ, Velez Y. Correcting COVID-19 Vaccine Misinformation in Ten
Countries. Royal Society open science. Epub ahead of print 2023. DOI: 10.1098/rsos.221097.
[54] Nyhan B, Porter E, Reifler J, et al. Taking Fact-checks Literally But Not Seriously?
The Effects of Journalistic Fact-checking on Factual Beliefs and Candidate Favorability.
Political Behavior 2019; 1–22.
[55] Allen J, Arechar AA, Pennycook G, et al. Scaling up fact-checking using the wisdom
of crowds. Science Advances.
[56] Godel W, Sanderson Z, Aslett K, et al. Moderating with the Mob: Evaluating the
Efficacy of Real-Time Crowdsourced Fact-Checking. Journal of Online Trust and Safety; 1.
Epub ahead of print 2021. DOI: https://doi.org/10.54501/jots.v1i1.15.
[57] Allen JNL, Martel C, Rand D. Birds of a feather don’t fact-check each other:
Partisanship and the evaluation of news in Twitter’s Birdwatch crowdsourced fact-checking
program. Psyarxiv. Epub ahead of print 2021. DOI: 10.31234/osf.io/57e3q.
[58] Berriche M. En quête de sources [In search of sources]. Politiques de communication 2021; 115–154.
[59] Carey J, Guess A, Nyhan B, et al. COVID-19 Misinformation Consumption is
Minimal, Has Minimal Effects, And Does Not Prevent Fact-Checks from Working.
[60] Park S, Park JY, Kang J, et al. The presence of unexpected biases in online fact-
checking. The Harvard Kennedy School Misinformation Review. Epub ahead of print 2021.
DOI: 10.37016/mr-2020-53.
[61] Broockman D, Kalla J. The manifold effects of partisan media on viewers’ beliefs and
attitudes: A field experiment with Fox News viewers. OSF Preprints; 1. Epub ahead of print
2022. DOI: 10.31219/osf.io/jrw26.
[62] Weeks BE, Menchen-Trevino E, Calabrese C, et al. Partisan media, untrustworthy
news sites, and political misperceptions. New Media & Society 2021; 14614448211033300.
[63] Guess AM, Barberá P, Munzert S, et al. The consequences of online partisan media.
Proceedings of the National Academy of Sciences; 118. Epub ahead of print 2021. DOI:
https://doi.org/10.1073/pnas.2013464118.
[64] Pennycook G, Rand DG. Lazy, not biased: Susceptibility to partisan fake news is
better explained by lack of reasoning than by motivated reasoning. Cognition 2019; 188: 39–
50.
[65] Shin J, Thorson K. Partisan selective sharing: The biased diffusion of fact-checking
messages on social media. Journal of Communication 2017; 67: 233–255.
[66] Snyder JM, Strömberg D. Press Coverage and Political Accountability. Journal of
Political Economy 2010; 118: 355–408.
[67] Fletcher R, Nielsen RK. Are people incidentally exposed to news on social media? A
comparative analysis. New media & society 2018; 20: 2450–2468.
[68] Pennycook G, Epstein Z, Mosleh M, et al. Shifting attention to accuracy can reduce
misinformation online. Nature 2021; 592: 590–595.
[69] Arechar AA, Allen J, Berinsky AJ, et al. Understanding and combatting
misinformation across 16 countries on six continents. Nature Human Behaviour 2023; 1–12.
[70] Roozenbeek J, van der Linden S. How to Combat Health Misinformation: A
Psychological Approach. American Journal of Health Promotion 2022; 36: 569–575.
[71] Aslett K, Guess AM, Bonneau R, et al. News credibility labels have limited average
effects on news diet quality and fail to reduce misperceptions. Science Advances 2022; 8:
eabl3844.
[72] Lewandowsky S, Ecker UK, Seifert CM, et al. Misinformation and its correction:
Continued influence and successful debiasing. Psychological Science in the Public Interest
2012; 13: 106–131.
[73] Roozenbeek J, van der Linden S, Nygren T. Prebunking interventions based on the
psychological theory of “inoculation” can reduce susceptibility to misinformation across
cultures. Harvard Kennedy School Misinformation Review; 1.
[74] Badrinathan S. Educative Interventions to Combat Misinformation: Evidence from a
Field Experiment in India. American Political Science Review 2021; 1–17.
[75] Vraga E, Tully M, Bode L. Assessing the relative merits of news literacy and
corrections in responding to misinformation on Twitter. New Media & Society 2021;
1461444821998691.
[76] Epstein Z, Berinsky AJ, Cole R, et al. Developing an accuracy-prompt toolkit to
reduce COVID-19 misinformation online. Harvard Kennedy School Misinformation Review.
Epub ahead of print 2021. DOI: https://doi.org/10.37016/mr-2020-71.
[77] Compton J, van der Linden S, Cook J, et al. Inoculation theory in the post‐truth era:
Extant findings and new frontiers for contested science, misinformation, and conspiracy
theories. Social and Personality Psychology Compass 2021; 15: e12602.
[78] Gilardi F. Digital Technology, Politics, and Policy-Making. Elements in Public Policy.
Epub ahead of print 2022. DOI: https://doi.org/10.5167/uzh-218931.
[79] Coppock A. Persuasion in parallel: How information changes minds about politics.
University of Chicago Press, 2023.
[80] Motta M, Hwang J, Stecula D. What Goes Down Must Come Up? Misinformation
Search Behavior During an Unplanned Facebook Outage. Epub ahead of print 2022. DOI:
10.31235/osf.io/pm9gy.
[81] Chen A, Nyhan B, Reifler J, et al. Subscriptions and external links help drive resentful
users to alternative and extremist YouTube videos. Epub ahead of print 2022. DOI:
https://doi.org/10.48550/arXiv.2204.10921.
[82] Bennett WL, Livingston S. The disinformation order: Disruptive communication and
the decline of democratic institutions. European journal of communication 2018; 33: 122–
139.
[83] de Wildt L, Aupers S. Participatory conspiracy culture: Believing, doubting and
playing with conspiracy theories on Reddit. Convergence 2023; 13548565231178914.
[84] Newman D, Lewandowsky S, Mayo R. Believing in nothing and believing in
everything: The underlying cognitive paradox of anti-COVID-19 vaccine attitudes.
Personality and Individual Differences 2022; 189: 111522.
[85] Tomas F, Nera K, Schöpfer C. “Think for Yourself, or Others Will Think for You”:
Epistemic Individualism Predicts Conspiracist Beliefs and Critical Thinking. Psyarxiv. Epub
ahead of print 2022. DOI: 10.31219/osf.io/qgtzb.
[86] Dieguez S, Wagner-Egger P. Réflexions sur la forme de la Terre [Reflections on the shape of the Earth]. L’irrationnel
aujourd’hui 2021; 323–400.
[87] Mercier H, Altay S. Do cultural misbeliefs cause costly behavior? In Musolino, J.,
Hemmer, P. & Sommer, J. (Eds.). The Science of Beliefs, 2022.
[88] Altay S, Mercier H. Framing messages for vaccination supporters. Journal of
Experimental Psychology: Applied 2020; 26: 567–578.
[89] Goldberg MH, van der Linden S, Maibach E, et al. Discussing global warming leads
to greater acceptance of climate science. Proceedings of the National Academy of Sciences
2019; 116: 14804–14805.
[90] Ivanov B, Miller CH, Compton J, et al. Effects of postinoculation talk on resistance to
influence. Journal of Communication 2012; 62: 701–718.
[91] Katz E, Lazarsfeld PF. Personal influence: The part played by people in the flow of
mass communications. Glencoe: Free Press, 1955.
[92] Bail C. Breaking the social media prism. Princeton University Press, 2021.
[93] Mihailidis P, Viotty S. Spreadable spectacle in digital culture: Civic expression, fake
news, and the role of media literacies in “post-fact” society. American behavioral scientist
2017; 61: 441–454.
[94] Hartman R, Blakey JW, Womick J, et al. Interventions to Reduce Partisan Animosity.
Nature Human Behaviour. Epub ahead of print 2022. DOI: https://doi.org/10.1038/s41562-
022-01442-3.
[95] Nyhan B, Reifler J. The effect of fact‐checking on elites: A field experiment on US
state legislators. American Journal of Political Science 2015; 59: 628–640.
[96] Lim C. Can Fact-checking Prevent Politicians from Lying?
[97] Metzger MJ, Flanagin AJ, Medders RB. Social and Heuristic Approaches to
Credibility Evaluation Online. Journal of Communication 2010; 60: 413–439.
[98] Pasquinelli E, Farina M, Bedel A, et al. Naturalizing Critical Thinking: Consequences
for Education, Blueprint for Future Research in Cognitive Science. Mind, Brain, and
Education.
[99] Hahn U, Oaksford M. The rationality of informal argumentation: A bayesian approach
to reasoning fallacies. Psychological Review 2007; 114: 704–732.
[100] Roozenbeek J, Maertens R, Herzog SM, et al. Susceptibility to misinformation is
consistent across question framings and response modes and better explained by open-
mindedness and partisanship than analytical thinking. Judgment and Decision Making.
[101] Marie A, Petersen MB. Moralization of rationality can stimulate, but intellectual
humility inhibits, sharing of hostile conspiratorial rumors. Psyarxiv. Epub ahead of print
2022. DOI: https://osf.io/k7u68.
[102] Adam‐Troian J, Chayinska M, Paladino MP, et al. Of precarity and conspiracy:
Introducing a socio‐functional model of conspiracy beliefs. British journal of social
psychology 2023; 62: 136–159.
[103] Alper S. There are higher levels of conspiracy beliefs in more corrupt countries.
European Journal of Social Psychology. Epub ahead of print 2022. DOI: 10.1002/ejsp.2919.
[104] Cordonier L, Cafiero F. Public Sector Corruption is Fertile Ground for Conspiracy
Beliefs: A Comparison Between 26 Western and Non-Western Countries. Psyarxiv. Epub
ahead of print 2023. DOI: 10.31219/osf.io/b24gk.
[105] Cordonier L, Cafiero F, Bronner G. Why are conspiracy theories more successful in
some countries than in others? An exploratory study on Internet users from 22 Western and
non-Western countries. Social Science Information 2021; 60: 436–456.
[106] Kuo R, Marwick A. Critical disinformation studies: History, power, and politics.
Harvard Kennedy School Misinformation Review 2021; 2: 1–11.
[107] Marwick AE. Why do people share fake news? A sociotechnical model of media
effects. Georgetown law technology review 2018; 2: 474–512.
[108] Chater N, Loewenstein G. The i-frame and the s-frame: How focusing on individual-
level solutions has led behavioral public policy astray. Behavioral and Brain Sciences 2022;
1–60.
[109] Jungherr A, Schroeder R. Disinformation and the Structural Transformations of the
Public Arena: Addressing the Actual Challenges to Democracy. Social Media + Society 2021;
7: 2056305121988928.