It has long been a puzzle for the deficit model–which holds that trust in science is primarily driven by science knowledge–that knowledge of science is at best weakly associated with science attitudes (Allum et al. 2008; National Academies of Sciences, Engineering, and Medicine 2016). The rational impression account can make sense of this finding: it lays out how trusting science without recalling specific knowledge can result from a sound inference process, rooted in basic cognitive mechanisms of information evaluation.
The account is compatible with the finding that education, and in particular science education, has been repeatedly identified as one of the strongest correlates of trust in science (Bak 2001; Noy and O’Brien 2019; Wellcome Global Monitor 2018, 2020; but see Cologna et al. 2025). In contrast to the deficit model, it suggests that the main causal role of education in public trust in science is not the transmission of knowledge and understanding, but impression generation.
The rational impression account aligns with recent normative accounts that shift the focus from listing particular institutional features that make science trustworthy–certain methods, norms, or processes–to the diversity of science: Cartwright et al. (2022) make the case that scientific knowledge emerges from a tangle of results, relying on diverse research methods. Oreskes (2019) makes a similar case: she argues that scientific practice takes place in different scientific communities that rely on a variety of research methods. Through shared practices, in particular peer review, these communities engage in critical dialogue. What makes scientific knowledge trustworthy, according to Oreskes, is that a consensus emerges from this diversity of actors and methods. On this view, to infer trustworthiness, people need a representation of the diversity of science. The rational impression account is, in a way, less strict: it does not require a representation of diversity. However, for inferences from convergence to be sound, people do need to represent science as an institution of independent thinkers.
The studies presented in this thesis face several limitations, which are detailed in the individual chapters. Here, I will point out some more general, theoretical limitations of the rational impression account.
The rational impression account fits with a sociological literature investigating how “individual cognition and practice establish and maintain institutional fields and status hierarchies, especially in the face of imperfect knowledge” (Gauchat and Andrews 2018, 569). One limitation concerns its scope: it proposes a possible micro-level model of trust in science and should be seen as complementing, not competing with, macro-level processes that shape public trust in science. Sociological macro-level accounts have made the case that trust in science is entangled with broader cultural and political dynamics. These accounts, like the individual-level accounts reviewed above, tend to focus on explaining distrust in science. For example, Gauchat (2011) describes the ‘alienation model’, according to which the “public disassociation with science is a symptom of a general disenchantment with late modernity, mainly, the limitations associated with codified expertise, rational bureaucracy, and institutional authority” (Gauchat 2011, 2). This explanation builds on the work of social theorists (Habermas 1989; Beck 1992; Giddens 1991; see Gauchat 2011 for an overview) who suggested that a modern, complex world increasingly requires expertise and thus gives rise to institutions of knowledge elites. People who are not part of these institutions experience a lack of agency, resulting in a feeling of alienation. Similarly, Gauchat (2023) argues that the politicization of science in the US needs to be seen in its broader cultural context. Specifically, according to Gauchat, science has enabled the authority of the modern regulatory state. Consequently, conservative distrust of science reflects deeper structural tensions with the institutions and rational–legal authority of modern governance. At the micro-level, this is consistent with research showing that right-wing authoritarian ideology is associated with distrust of science and scientists (Kerr and Wilson 2021).
Another limitation of the rational impression account is that it assumes people have a representation of science as consensual. In practice, however–with perhaps some exceptions, such as during the Covid-19 pandemic–most people do not literally compare the opinions of different scientists for themselves and conclude that something is largely consensual. Where, then, does the representation of consensus come from? A plausible explanation, I believe, is that education fosters a representation of consensus: during education, in particular early education, knowledge is typically presented as simply the result of science–a seemingly unanimous enterprise that produces knowledge. Schoolbooks rarely cover historical scientific controversies, convey uncertainty around scientific findings, or present cutting-edge research, where disagreement is the norm. This could induce a default consensus assumption in people’s perceptions of science. This argument is, of course, speculative.
The rational impression account is also limited in its implications. First, I do not believe that flooding people with impressive consensual science knowledge is the key to overcoming all distrust in science. In the context of trust in political institutions, research has shown that trust and distrust are not necessarily symmetrical: what causes the former might not help alleviate the latter (Bertsou 2019). I believe this holds, at least to some degree, for science, too. Especially in the Global North, where essentially everyone has been exposed to science through a basic science education, trust in science via the mechanisms of the rational impression account is likely to be the default state. This is in line with our findings on quasi-universal trust in basic science in the US (Chapter 6). Consensus messaging has been shown to help convince people to trust science on particular issues, such as climate change or vaccines, but it is less clear whether it could also enhance perceptions of science’s trustworthiness more generally. The people convinced by consensus messages may already have trusted science in general while not holding strong opinions on the specific matter, as has been argued to be the case for a large segment of the public on most matters (Bourdieu 1979; Zaller 1992). For people who not only lack trust but actively distrust, motivated reasoning accounts are likely better suited as a theoretical framework. Addressing the relevant underlying motivations directly might do more to mitigate distrust in science than exposing people to consensual science in general.
Second, and relatedly, the fact that I propose an account by which trust in science can be rational does not mean that, conversely, all distrust in science is irrational. Some groups of people do in fact have good reasons not to trust science. For example, some science has historically contributed to fostering racism (see e.g. Fuentes 2023; Nobles et al. 2022), through episodes such as the infamous Tuskegee syphilis study (Brandt 1978; Scharff et al. 2010).
Third, I do not think that science communication should stress consensus at all costs. In the rational impression account, consensus plays a central role in generating trust. However, this should not incentivize science communicators to neglect transparency about uncertainty. Acknowledging uncertainty in science communication has been argued to be crucial for fostering long-term trust in science (Druckman 2015). For example, in the context of Covid-19 vaccines, Petersen et al. (2021) have shown that communicating uncertainty is essential for building long-term trust in health authorities.
Fourth, science communication should not aim for impressiveness at all costs either. Research has shown that intellectual humility can increase trust in scientists (Koetke et al. 2024). Overselling scientific results might therefore backfire. People appear to value transparency via open data practices in science (Song, Markowitz, and Taylor 2022), and to place more trust in research that replicates successfully (Hendriks, Kienhues, and Bromme 2020). I would therefore expect that simply doing better, more transparent science, and being humble about it, is likely the most effective strategy to impress the public and elicit perceptions of trustworthiness.
Fifth, educators should not stop aiming to foster a proper understanding of science. Most students might not understand all of the content or recall much specific knowledge later on. However, for some students at least, some of that knowledge will be remembered and will prove important in their lives. Moreover, to be impressive, a piece of information does not need to be confusingly complex. In fact, a proper understanding of research findings and their methods might even help in appreciating their complexity–even if, once again, that understanding is forgotten later.
Despite these caveats, I believe that the rational impression account offers grounds for optimism for studies of science–society interfaces, and for the field of science communication in particular: exposure to science, especially exposure that leaves an impression, might be the foundation of public trust in science. Low scientific literacy levels should not discourage education and communication efforts, as they are not necessarily a good indicator of the value added in terms of fostering trust in science.
Taking a broader perspective, our account fits into a picture of humans as not gullible (Mercier 2017, 2020). The “failure” of the deficit model, i.e., the fact that science knowledge appears not to be strongly associated with trust in science, might suggest that public trust in science is, to a large extent, irrational. The notion that trust in science is irrational or easily granted may amplify concerns about the impact of misinformation: if trust lacks a solid, rational foundation, then we would expect misinformation to easily lead people astray. Much work remains to be done to understand how misinformation affects people’s beliefs, in particular elite-driven misinformation and more subtle forms of misinformation, such as one-sided reporting. But in Chapter 7, we show that people are generally able to distinguish between true and false news and, if anything, tend to be generally skeptical of news. As a consequence, for a better informed public, fighting for (true) information seems at least as relevant as fighting against misinformation. Misinformation researchers increasingly acknowledge this: a recent report on science misinformation by the National Academies of Sciences, Engineering, and Medicine (2024) dedicates considerable space to developing strategies for producing better information, for example by promoting high-quality science, health, and medical journalism.
The rational impression account stresses the role of fighting for information when it comes to fostering trust in science. Well-placed trust in science does not require profound understanding or recall of specific knowledge, but it does require exposure to good science.
*Ali, Khudejah, Cong Li, Khawaja Zain-ul-abdin, and Muhammad Adeel Zaffar. 2022. “Fake News on Facebook: Examining the Impact of Heuristic Cues on Perceived Credibility and Sharing Intention.” Internet Research 32 (1): 379–97.
Allen, Jennifer, Antonio A. Arechar, Gordon Pennycook, and David G. Rand. 2021. “Scaling up Fact-Checking Using the Wisdom of Crowds.” Science Advances 7 (36): eabf4393.
Allum, Nick, Patrick Sturgis, Dimitra Tabourazi, and Ian Brunton-Smith. 2008. “Science Knowledge and Attitudes Across Cultures: A Meta-Analysis.” Public Understanding of Science 17 (1): 35–54. https://doi.org/10.1177/0963662506070159.
*Altay, Sacha, Andrea De Angelis, and Emma Hoes. 2024. “Media Literacy Tips Promoting Reliable News Improve Discernment and Enhance Trust in Traditional Media.” Communications Psychology 2 (1): 1–9. https://doi.org/10.1038/s44271-024-00121-5.
*Altay, Sacha, and Fabrizio Gilardi. n.d. “People Are Skeptical of Headlines Labeled as AI-Generated, Even If True or Human-Made, Because They Assume Full AI Automation.” https://doi.org/10.31234/osf.io/83k9r.
*Altay, Sacha, Benjamin A. Lyons, and Ariana Modirrousta-Galian. 2024. “Exposure to Higher Rates of False News Erodes Media Trust and Fuels Overconfidence.” Mass Communication and Society, August, 1–25. https://doi.org/10.1080/15205436.2024.2382776.
*Altay, Sacha, Rasmus Kleis Nielsen, and Richard Fletcher. 2022. “The Impact of News Media and Digital Platform Use on Awareness of and Belief in COVID-19 Misinformation.” https://doi.org/10.31234/osf.io/7tm3s.
Altay, Sacha, Emma de Araujo, and Hugo Mercier. 2022. “‘If This Account Is True, It Is Most Enormously Wonderful’: Interestingness-If-True and the Sharing of True and False News.” Digital Journalism 10 (3): 373–94. https://doi.org/10.1080/21670811.2021.1941163.
*Arechar, Antonio A., Jennifer Allen, Adam J. Berinsky, Rocky Cole, Ziv Epstein, Kiran Garimella, Andrew Gully, et al. 2023. “Understanding and Combatting Misinformation Across 16 Countries on Six Continents.” Nature Human Behaviour 7 (9): 1502–13. https://doi.org/10.1038/s41562-023-01641-6.
*Arin, K. Peren, Deni Mazrekaj, and Marcel Thum. 2023. “Ability of Detecting and Willingness to Share Fake News.” Scientific Reports 13 (1): 7298. https://doi.org/10.1038/s41598-023-34402-6.
Aslett, Kevin, Zeve Sanderson, William Godel, Nathaniel Persily, Jonathan Nagler, and Joshua A. Tucker. 2024. “Online Searches to Evaluate Misinformation Can Increase Its Perceived Veracity.” Nature 625 (7995): 548–56. https://doi.org/10.1038/s41586-023-06883-y.
*Badrinathan, Sumitra. 2021. “Educative Interventions to Combat Misinformation: Evidence from a Field Experiment in India.” American Political Science Review 115 (4): 1325–41. https://doi.org/10.1017/S0003055421000459.
*Bago, Bence, David G. Rand, and Gordon Pennycook. 2020. “Fake News, Fast and Slow: Deliberation Reduces Belief in False (but Not True) News Headlines.” Journal of Experimental Psychology: General 149 (8): 1608–13. https://doi.org/10.1037/xge0000729.
*Bago, Bence, Leah R. Rosenzweig, Adam J. Berinsky, and David G. Rand. 2022. “Emotion May Predict Susceptibility to Fake News but Emotion Regulation Does Not Seem to Help.” Cognition and Emotion, June, 1–15. https://doi.org/10.1080/02699931.2022.2090318.
Bak, Hee-Je. 2001. “Education and Public Attitudes Toward Science: Implications for the ‘Deficit Model’ of Education and Support for Science and Technology.” Social Science Quarterly 82 (4): 779–95. https://doi.org/10.1111/0038-4941.00059.
*Basol, Melisa, Jon Roozenbeek, Manon Berriche, Fatih Uenal, William P. McClanahan, and Sander van der Linden. 2021. “Towards Psychological Herd Immunity: Cross-Cultural Evidence for Two Prebunking Interventions Against COVID-19 Misinformation.” Big Data & Society 8 (1): 205395172110138. https://doi.org/10.1177/20539517211013868.
Beck, Ulrich. 1992. Risk Society: Towards a New Modernity. London: Sage.
Bertsou, Eri. 2019. “Rethinking Political Distrust.” European Political Science Review 11 (2): 213–30. https://doi.org/10.1017/S1755773919000080.
Bourdieu, Pierre. 1979. “Public Opinion Does Not Exist.” Communication and Class Struggle 1: 124–30.
Brandt, Allan M. 1978. “Racism and Research: The Case of the Tuskegee Syphilis Study.” The Hastings Center Report 8 (6): 21. https://doi.org/10.2307/3561468.
*Brashier, Nadia M., Gordon Pennycook, Adam J. Berinsky, and David G. Rand. 2021. “Timing Matters When Correcting Fake News.” Proceedings of the National Academy of Sciences 118 (5): e2020043118. https://doi.org/10.1073/pnas.2020043118.
*Bronstein, Michael V., Gordon Pennycook, Adam Bear, David G. Rand, and Tyrone D. Cannon. 2019. “Belief in Fake News Is Associated with Delusionality, Dogmatism, Religious Fundamentalism, and Reduced Analytic Thinking.” Journal of Applied Research in Memory and Cognition 8 (1): 108–17. https://doi.org/10.1016/j.jarmac.2018.09.005.
Bryanov, Kirill, Reinhold Kliegl, Olessia Koltsova, Alex Miltsov, Sergei Pashakhin, Alexander Porshnev, Yadviga Sinyavskaya, Maksim Terpilovskii, and Victoria Vziatysheva. 2023. “What Drives Perceptions of Foreign News Coverage Credibility? A Cross-National Experiment Including Kazakhstan, Russia, and Ukraine.” Political Communication 40 (2): 115–46. https://doi.org/10.1080/10584609.2023.2172492.
Cartwright, Nancy, Jeremy Hardie, Eleonora Montuschi, Matthew Soleiman, and Ann C. Thresher. 2022. The Tangle of Science: Reliability Beyond Method, Rigour, and Objectivity. New York: Oxford University Press.
Chen, Xi, Gordon Pennycook, and David Rand. 2023. “What Makes News Sharable on Social Media?” Journal of Quantitative Description: Digital Media 3.
*Clayton, Katherine, Spencer Blair, Jonathan A. Busam, Samuel Forstner, John Glance, Guy Green, Anna Kawata, et al. 2020. “Real Solutions for Fake News? Measuring the Effectiveness of General Warnings and Fact-Check Tags in Reducing Belief in False Stories on Social Media.” Political Behavior 42 (4): 1073–95. https://doi.org/10.1007/s11109-019-09533-0.
Clemm Von Hohenberg, Bernhard. 2023. “Truth and Bias, Left and Right: Testing Ideological Asymmetries with a Realistic News Supply.” Public Opinion Quarterly 87 (2): 267–92. https://doi.org/10.1093/poq/nfad013.
Cologna, Viktoria, Niels G. Mede, Sebastian Berger, John Besley, Cameron Brick, Marina Joubert, Edward W. Maibach, et al. 2025. “Trust in Scientists and Their Role in Society Across 68 Countries.” Nature Human Behaviour, January, 1–18. https://doi.org/10.1038/s41562-024-02090-5.
Dias, Nicholas, Gordon Pennycook, and David G. Rand. 2020. “Emphasizing Publishers Does Not Effectively Reduce Susceptibility to Misinformation on Social Media.” Harvard Kennedy School Misinformation Review, January. https://doi.org/10.37016/mr-2020-001.
Druckman, James N. 2015. “Communicating Policy-Relevant Science.” PS: Political Science & Politics 48 (S1): 58–69. https://doi.org/10.1017/S1049096515000438.
Epstein, Ziv, Nathaniel Sirlin, Antonio Arechar, Gordon Pennycook, and David Rand. 2023. “The Social Media Context Interferes with Truth Discernment.” Science Advances 9 (9): eabo6169.
*Erlich, Aaron, and Calvin Garner. 2023. “Is Pro-Kremlin Disinformation Effective? Evidence from Ukraine.” The International Journal of Press/Politics 28 (1): 5–28. https://doi.org/10.1177/19401612211045221.
*Faragó, Laura, Péter Krekó, and Gábor Orosz. 2023. “Hungarian, Lazy, and Biased: The Role of Analytic Thinking and Partisanship in Fake News Discernment on a Hungarian Representative Sample.” Scientific Reports 13 (1): 178. https://doi.org/10.1038/s41598-022-26724-8.
*Fazio, Lisa, David Rand, Stephan Lewandowsky, Mark Susmann, Adam J. Berinsky, Andrew Guess, Panayiota Kendeou, et al. n.d. “Combating Misinformation: A Megastudy of Nine Interventions Designed to Reduce the Sharing of and Belief in False and Misleading Headlines.” https://doi.org/10.31234/osf.io/uyjha.
Fuentes, Agustín. 2023. “Systemic Racism in Science: Reactions Matter.” Science 381 (6655): eadj7675. https://doi.org/10.1126/science.adj7675.
Garrett, R. Kelly, and Robert M. Bond. 2021. “Conservatives’ Susceptibility to Political Misperceptions.” Science Advances 7 (23): eabf1234. https://doi.org/10.1126/sciadv.abf1234.
Gauchat, Gordon. 2011. “The Cultural Authority of Science: Public Trust and Acceptance of Organized Science.” Public Understanding of Science 20 (6): 751–70. https://doi.org/10.1177/0963662510365246.
———. 2023. “The Legitimacy of Science.” Annual Review of Sociology 49 (1): 263–79. https://doi.org/10.1146/annurev-soc-030320-035037.
Gauchat, Gordon, and Kenneth T. Andrews. 2018. “The Cultural-Cognitive Mapping of Scientific Professions.” American Sociological Review 83 (3): 567–95. https://doi.org/10.1177/0003122418773353.
Gawronski, Bertram, Nyx L. Ng, and Dillon M. Luke. 2023. “Truth Sensitivity and Partisan Bias in Responses to Misinformation.” Journal of Experimental Psychology: General 152 (8): 2205–36. https://doi.org/10.1037/xge0001381.
Giddens, Anthony. 1991. Modernity and Self-Identity: Self and Society in the Late Modern Age. Stanford, CA: Stanford University Press.
*Gottlieb, Jessica, Claire Adida, and Richard Moussa. n.d. “Reducing Misinformation in a Polarized Context: Experimental Evidence from Côte d’Ivoire.” https://doi.org/10.31219/osf.io/6x4wy.
*Guess, Andrew M., Michael Lerner, Benjamin Lyons, Jacob M. Montgomery, Brendan Nyhan, Jason Reifler, and Neelanjan Sircar. 2020. “A Digital Media Literacy Intervention Increases Discernment Between Mainstream and False News in the United States and India.” Proceedings of the National Academy of Sciences 117 (27): 15536–45. https://doi.org/10.1073/pnas.1920498117.
Habermas, Jürgen. 1989. Jürgen Habermas on Society and Politics: A Reader. Edited by Steven Seidman. Boston: Beacon Press.
*Hameleers, Michael, Marina Tulin, Claes De Vreese, Toril Aalberg, Peter Van Aelst, Ana Sofia Cardenal, Nicoleta Corbu, et al. 2023. “Mistakenly Misinformed or Intentionally Deceived? Mis- and Disinformation Perceptions on the Russian War in Ukraine Among Citizens in 19 Countries.” European Journal of Political Research, December. https://doi.org/10.1111/1475-6765.12646.
Hendriks, Friederike, Dorothe Kienhues, and Rainer Bromme. 2020. “Replication Crisis = Trust Crisis? The Effect of Successful vs. Failed Replications on Laypeople’s Trust in Researchers and Research.” Public Understanding of Science 29 (3): 270–88. https://doi.org/10.1177/0963662520902383.
*Hlatky, Roman. n.d. “Unintended Consequences? Russian Disinformation and Public Opinion.” https://doi.org/10.31219/osf.io/85vmt.
Kerr, John R., and Marc S. Wilson. 2021. “Right-Wing Authoritarianism and Social Dominance Orientation Predict Rejection of Science and Scientists.” Group Processes & Intergroup Relations 24 (4): 550–67. https://doi.org/10.1177/1368430221992126.
*Koetke, Jonah, Karina Schumann, Tenelle Porter, and Ilse Smilo-Morgan. 2023. “Fallibility Salience Increases Intellectual Humility: Implications for People’s Willingness to Investigate Political Misinformation.” Personality and Social Psychology Bulletin 49 (5): 806–20. https://doi.org/10.1177/01461672221080979.
Koetke, Jonah, Karina Schumann, Shauna M. Bowes, and Nina Vaupotič. 2024. “The Effect of Seeing Scientists as Intellectually Humble on Trust in Scientists and Their Research.” Nature Human Behaviour, November, 1–14. https://doi.org/10.1038/s41562-024-02060-x.
*Kreps, Sarah E., and Douglas L. Kriner. 2023. “Assessing Misinformation Recall and Accuracy Perceptions: Evidence from the COVID-19 Pandemic.” Harvard Kennedy School Misinformation Review, October. https://doi.org/10.37016/mr-2020-123.
*Lee, Eun-Ju, and Jeong-woo Jang. 2023. “How Political Identity and Misinformation Priming Affect Truth Judgments and Sharing Intention of Partisan News.” Digital Journalism, 1–20. https://doi.org/10.1080/21670811.2022.2163413.
*Lühring, Jula, Apeksha Shetty, Corinna Koschmieder, David Garcia, Annie Waldherr, and Hannah Metzler. n.d. “Emotions in Misinformation Studies: Distinguishing Affective State from Emotional Response and Misinformation Recognition from Acceptance.” https://doi.org/10.31234/osf.io/udqms.
Luo, Mufan, Jeffrey T. Hancock, and David M. Markowitz. 2022. “Credibility Perceptions and Detection Accuracy of Fake News Headlines on Social Media: Effects of Truth-Bias and Endorsement Cues.” Communication Research 49 (2): 171–95. https://doi.org/10.1177/0093650220921321.
*Lutzke, Lauren, Caitlin Drummond, Paul Slovic, and Joseph Árvai. 2019. “Priming Critical Thinking: Simple Interventions Limit the Influence of Fake News about Climate Change on Facebook.” Global Environmental Change 58 (September): 101964. https://doi.org/10.1016/j.gloenvcha.2019.101964.
*Lyons, Benjamin, Andy J. King, and Kimberly Kaphingst. n.d. “A Health Media Literacy Intervention Increases Skepticism of Both Inaccurate and Accurate Cancer News Among U.S. Adults.” https://doi.org/10.31219/osf.io/hm9ty.
*Lyons, Benjamin, Ariana Modirrousta-Galian, Sacha Altay, and Nikita Antonia Salovich. 2024. “Reduce Blind Spots to Improve News Discernment? Performance Feedback Reduces Overconfidence but Does Not Improve Subsequent Discernment,” February. https://doi.org/10.31219/osf.io/kgfrb.
*Lyons, Benjamin, Jacob Montgomery, and Jason Reifler. n.d. “Partisanship and Older Americans’ Engagement with Dubious Political News.” https://doi.org/10.31219/osf.io/etb89.
*Maertens, Rakoen, Friedrich M. Götz, Hudson F. Golino, Jon Roozenbeek, Claudia R. Schneider, Yara Kyrychenko, John R. Kerr, et al. 2024. “The Misinformation Susceptibility Test (MIST): A Psychometrically Validated Measure of News Veracity Discernment.” Behavior Research Methods 56 (3): 1863–99. https://doi.org/10.3758/s13428-023-02124-2.
*Mairal, Santos Espina, Florencia Bustos, Guillermo Solovey, and Joaquín Navajas. 2023. “Interactive Crowdsourcing to Fact-Check Politicians.” Journal of Experimental Psychology: Applied, August. https://doi.org/10.1037/xap0000492.
*Martel, Cameron, Gordon Pennycook, and David G. Rand. 2020. “Reliance on Emotion Promotes Belief in Fake News.” Cognitive Research: Principles and Implications 5 (1): 47. https://doi.org/10.1186/s41235-020-00252-3.
Mercier, Hugo. 2017. “How Gullible Are We? A Review of the Evidence from Psychology and Social Science.” Review of General Psychology 21 (2): 103–22. https://doi.org/10.1037/gpr0000111.
———. 2020. Not Born Yesterday: The Science of Who We Trust and What We Believe. Princeton, NJ: Princeton University Press. https://doi.org/10.1515/9780691198842.
*Modirrousta-Galian, Ariana, Philip A. Higham, and Tina Seabrooke. 2023. “Effects of Inductive Learning and Gamification on News Veracity Discernment.” Journal of Experimental Psychology: Applied 29 (3): 599–619. https://doi.org/10.1037/xap0000458.
———. 2024. “Wordless Wisdom: The Dominant Role of Tacit Knowledge in True and Fake News Discrimination.” Journal of Applied Research in Memory and Cognition.
*Muda, Rafał, Gordon Pennycook, Damian Hamerski, and Michał Białek. 2023. “People Are Worse at Detecting Fake News in Their Foreign Language.” Journal of Experimental Psychology: Applied 29 (4): 712–24. https://doi.org/10.1037/xap0000475.
National Academies of Sciences, Engineering, and Medicine. 2016. Science Literacy: Concepts, Contexts, and Consequences. Edited by Catherine E. Snow and Kenne A. Dibner. Washington, D.C.: National Academies Press. https://doi.org/10.17226/23595.
———. 2024. Understanding and Addressing Misinformation about Science. Washington, D.C.: National Academies Press.
Nobles, Melissa, Chad Womack, Ambroise Wonkam, and Elizabeth Wathuti. 2022. “Science Must Overcome Its Racist Legacy: Nature’s Guest Editors Speak.” Nature 606 (7913): 225–27. https://doi.org/10.1038/d41586-022-01527-z.
Noy, Shiri, and Timothy L. O’Brien. 2019. “Science for Good? The Effects of Education and National Context on Perceptions of Science.” Public Understanding of Science 28 (8): 897–916. https://doi.org/10.1177/0963662519863575.
*OECD. 2022. “An International Effort Using Behavioural Science to Tackle the Spread of Misinformation.” https://doi.org/10.1787/b7709d4f-en.
Oreskes, Naomi. 2019. Why Trust Science? Princeton, NJ: Princeton University Press.
*Orosz, Gábor, Benedek Paskuj, Laura Faragó, and Péter Krekó. 2023. “A Prosocial Fake News Intervention with Durable Effects.” Scientific Reports 13 (1): 3958. https://doi.org/10.1038/s41598-023-30867-7.
*Pehlivanoglu, Didem, Tian Lin, Farha Deceus, Amber Heemskerk, Natalie C. Ebner, and Brian S. Cahill. 2021. “The Role of Analytical Reasoning and Source Credibility on the Evaluation of Real and Fake Full-Length News Articles.” Cognitive Research: Principles and Implications 6 (1): 24. https://doi.org/10.1186/s41235-021-00292-3.
*Pennycook, Gordon, Tyrone D. Cannon, and David G. Rand. 2018. “Prior Exposure Increases Perceived Accuracy of Fake News.” Journal of Experimental Psychology: General 147 (12): 1865–80. https://doi.org/10.1037/xge0000465.
*Pennycook, Gordon, Jonathon McPhetres, Yunhao Zhang, Jackson G. Lu, and David G. Rand. 2020. “Fighting COVID-19 Misinformation on Social Media: Experimental Evidence for a Scalable Accuracy-Nudge Intervention.” Psychological Science 31 (7): 770–80. https://doi.org/10.1177/0956797620939054.
*Pennycook, Gordon, and David G. Rand. 2020. “Who Falls for Fake News? The Roles of Bullshit Receptivity, Overclaiming, Familiarity, and Analytic Thinking.” Journal of Personality 88 (2): 185–200. https://doi.org/10.1111/jopy.12476.
Pennycook, Gordon, Jabin Binnendyk, Christie Newton, and David G. Rand. 2021. “A Practical Guide to Doing Behavioral Research on Fake News and Misinformation.” Collabra: Psychology 7 (1): 25293. https://doi.org/10.1525/collabra.25293.
Pennycook, Gordon, and David G. Rand. 2019. “Lazy, Not Biased: Susceptibility to Partisan Fake News Is Better Explained by Lack of Reasoning Than by Motivated Reasoning.” Cognition 188 (July): 39–50. https://doi.org/10.1016/j.cognition.2018.06.011.
*Pereira, Frederico Batista, Natália S. Bueno, Felipe Nunes, and Nara Pavão. 2023. “Inoculation Reduces Misinformation: Experimental Evidence from Multidimensional Interventions in Brazil.” Journal of Experimental Political Science, July, 1–12. https://doi.org/10.1017/XPS.2023.11.
Petersen, Michael Bang, Alexander Bor, Frederik Jørgensen, and Marie Fly Lindholt. 2021. “Transparent Communication about Negative Features of COVID-19 Vaccines Decreases Acceptance but Increases Trust.” Proceedings of the National Academy of Sciences 118 (29): e2024597118. https://doi.org/10.1073/pnas.2024597118.
*Rathje, Steve, Jon Roozenbeek, Jay J. Van Bavel, and Sander Van Der Linden. 2023. “Accuracy and Social Motivations Shape Judgements of (Mis)information.” Nature Human Behaviour, March. https://doi.org/10.1038/s41562-023-01540-w.
*Roozenbeek, Jon, Rakoen Maertens, Stefan M. Herzog, Michael Geers, Ralf Kurvers, and Mubashir Sultan. 2022. “Susceptibility to Misinformation Is Consistent Across Question Framings and Response Modes and Better Explained by Myside Bias and Partisanship Than Analytical Thinking.” Judgment and Decision Making 17 (3): 27.
Roozenbeek, Jon, Claudia R. Schneider, Sarah Dryhurst, John Kerr, Alexandra L. J. Freeman, Gabriel Recchia, Anne Marthe van der Bles, and Sander van der Linden. 2020. “Susceptibility to Misinformation about COVID-19 Around the World.” Royal Society Open Science 7 (10): 201199. https://doi.org/10.1098/rsos.201199.
*Rosenzweig, Leah R., Bence Bago, Adam J. Berinsky, and David G. Rand. 2021. “Happiness and Surprise Are Associated with Worse Truth Discernment of COVID-19 Headlines Among Social Media Users in Nigeria.” Harvard Kennedy School Misinformation Review, August. https://doi.org/10.37016/mr-2020-75.
*Ross, Björn, Jennifer Heisel, Anna-Katharina Jung, and Stefan Stieglitz. 2018. “Fake News on Social Media: The (In)Effectiveness of Warning Messages.”
*Ross, Robert M., David G. Rand, and Gordon Pennycook. 2021. “Beyond ‘Fake News’: Analytic Thinking and the Detection of False and Hyperpartisan News Headlines.” Judgment and Decision Making 16 (2): 22.
Scharff, Darcell P., Katherine J. Mathews, Pamela Jackson, Jonathan Hoffsuemmer, Emeobong Martin, and Dorothy Edwards. 2010. “More Than Tuskegee: Understanding Mistrust about Research Participation.” Journal of Health Care for the Poor and Underserved 21 (3): 879–97. https://doi.org/10.1353/hpu.0.0323.
Shirikov, Anton. 2024. “Fake News for All: How Citizens Discern Disinformation in Autocracies.” Political Communication 41 (1): 45–65. https://doi.org/10.1080/10584609.2023.2257618.
*Smelter, Thomas J., and Dustin P. Calvillo. 2020. “Pictures and Repeated Exposure Increase Perceived Accuracy of News Headlines.” Applied Cognitive Psychology 34 (5): 1061–71. https://doi.org/10.1002/acp.3684.
Song, Hyunjin, David M. Markowitz, and Samuel Hardman Taylor. 2022. “Trusting on the Shoulders of Open Giants? Open Science Increases Trust in Science for the Public and Academics.” Journal of Communication 72 (4): 497–510. https://doi.org/10.1093/joc/jqac017.
*Stagnaro, Michael, Sophia Pink, David G. Rand, and Robb Willer. 2023. “Increasing Accuracy Motivations Using Moral Reframing Does Not Reduce Republicans’ Belief in False News.” Harvard Kennedy School Misinformation Review, November. https://doi.org/10.37016/mr-2020-128.
*Sultan, Mubashir, Alan N. Tump, Michael Geers, Philipp Lorenz-Spreen, Stefan M. Herzog, and Ralf H. J. M. Kurvers. 2022. “Time Pressure Reduces Misinformation Discrimination Ability but Does Not Alter Response Bias.” Scientific Reports 12 (1): 22416. https://doi.org/10.1038/s41598-022-26209-8.
Wellcome Global Monitor. 2018. “Wellcome Global Monitor 2018.” https://wellcome.org/reports/wellcome-global-monitor/2018.
*Winter, Stephan, Sebastián Valenzuela, Marcelo Luis Barbosa Santos, Tobias Schreyer, Lena Iwertowski, and Tobias Rothmund. n.d. “(Don’t) Stop Believing: A Signal Detection Approach to Risk and Protective Factors for Engagement with Politicized (Mis)information in Social Media.” https://doi.org/10.31234/osf.io/84c36.
Zaller, John. 1992. The Nature and Origins of Mass Opinion. Cambridge: Cambridge University Press.