Theoretical Articles

The Explanation-Polarisation Model: Pseudoscience Spreads Through Explanatory Satisfaction and Group Polarisation

Angelo Fasce*1

Journal of Social and Political Psychology, 2022, Vol. 10(2), 693–705, https://doi.org/10.5964/jspp.8051

Received: 2021-12-30. Accepted: 2022-05-16. Published (VoR): 2022-12-14.

Handling Editor: Klaus Michael Reininger, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

*Corresponding author at: R. Larga 2, 3000-370 – Coimbra, Portugal. E-mail: afc@fmed.uc.pt

This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

This article presents an integrative model for the endorsement of pseudoscience: the explanation-polarisation model. It is based on a combination of perceived explanatory satisfaction and group polarisation, offering a perspective different from the classical confusion-based conception, in which pseudoscientific beliefs would be accepted through a lack of distinction between science and science mimicry. First, I discuss the confusion-based account in the light of current evidence, pointing out some of its explanatory shortcomings. Second, I develop the explanation-polarisation model, showing its explanatory power in connection with recent research outcomes in cognitive and social psychology.

Keywords: pseudoscience, group polarisation, explanatory satisfaction, motivated reasoning

Discussions of pseudoscience often offer a multidisciplinary analysis of the long history of popular beliefs that challenge the scientific mainstream, highlighting their harmful implications and enduring popularity. Pseudoscience is usually defined as cognitions involving radical epistemic misconduct and science mimicry (e.g., Fasce, 2020; Hansson, 2009), and scholars have held the position that the adoption of pseudoscientific beliefs is mostly due to a widespread confusion between science and claims that mimic science. I will refer to this position as the “confusion-based conception”. Though popular, this confusion-based account, I will argue, has limited explanatory power, as it does not fit the best current psychological evidence. In contrast, I will develop an alternative account of the endorsement of pseudoscience—the Explanation-Polarisation model (EX-PO). This model aims to explain the endorsement of pseudoscientific conceptions on the basis of a combination of their perceived explanatory appeal and their capacity to foster group polarisation, and it draws on a large corpus of cognitive and social psychological evidence on explanatory satisfaction and intergroup dynamics. It incorporates several psychological phenomena elicited by pseudoscience, explaining the function of science mimicry in psychological terms.

Are Pseudoscientific Beliefs Confusion-Based?

The confusion-based conception of the endorsement of pseudoscience asserts that the appeal of pseudoscience is mostly explained by the existence of a rhetorical pathway aimed at exploiting superficial scientific literacy. Hence, paradoxically, pseudoscience would thrive in environments in which science is regarded as an epistemic authority, in that pseudoscientific doctrines would conceal lay cognition and cognitive biases in the trappings of science (Allchin, 2012). In this section, I will argue that methodological shortcomings, a lack of specificity, and contradictions with experimental results undermine the specific predictions of this confusion-based conception.

Some examples of this conception can be extracted from an influential multi-author book on the topic, Pigliucci and Boudry (2013), in which several prominent philosophers of pseudoscience emphasize an exploitation of the epistemic authority of science when explaining the psychological function of science mimicry:

  • “Pseudoscience can cause so much trouble in part because the public does not appreciate the difference between real science and something that masquerades as science. (…) Pseudoscience thrives because we have not fully come to grips yet with the cognitive, sociological, and epistemological roots of this phenomenon” (pp. 3–4).

  • “Pseudosciences piggyback on the authority science has been endowed with in modern society. The question remains as to why it is so important for pseudosciences to seek that authority, and why they often succeed in attaining it” (p. 373).

  • “Pseudoscientific beliefs are usually dressed up in scientific garb. This does not substantially alter how they interact with human cognitive systems, however. All that it may do is render pseudoscientific beliefs somewhat more attractive in the context of modern cultures that hold scientific knowledge in great regard but have limited actual understanding of it” (p. 392).

  • “Pseudoscientists seek to be taken seriously for the same reason that scientists claim our attention, that the propositions of a rigorous and rational science are more worthy of belief than the common run of opinion” (p. 417).

Some research outcomes on individual differences indicate that people have difficulties distinguishing between science and pseudoscience (Gaze, 2014; Lyddy & Hughes, 2012)1. Nevertheless, the trappings of science do not explain these difficulties, as the same lack of discernment has also been found in relation to non-science-mimicking paranormal beliefs (Brewer, 2013; Garrett & Cutting, 2017) and astrology, a doctrine on the borderline between pseudoscientific and paranormal rhetoric (DeRobertis & Delaney, 2000; Sugarman, Impey, Buxner, & Antonellis, 2011). The negative association between well-oriented scientific literacy and pseudoscientific beliefs, as predicted by the confusion-based conception, is similarly not well-supported. Even though one study has found negative correlations between knowledge of scientific facts and pseudoscientific beliefs (Fasce & Picó, 2019a), again, the effect is not exclusive to pseudoscience, as it has also been found for non-mimicking paranormal beliefs (Aarnio & Lindeman, 2005; Fasce & Picó, 2019a; Vilela & Álvarez, 2004). Moreover, this association seems to be mediated by trust in science, which shows the same negative correlation with paranormal beliefs and conspiracy theories (Fasce & Picó, 2019a; Irwin, Dagnall, & Drinkwater, 2016)2.
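
For quantitatively minded readers, the sketch below illustrates how a mediation claim of this kind (trust in science mediating the literacy–pseudoscience link) is commonly tested with a Baron–Kenny-style regression sequence. The data are simulated, the effect sizes are arbitrary assumptions, and this is not the cited studies’ actual analysis.

```python
# A minimal mediation sketch with simulated data (not the cited studies'
# analysis): X = scientific literacy, M = trust in science,
# Y = pseudoscientific belief. Effect sizes are arbitrary assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500
literacy = rng.normal(size=n)                 # X
trust = 0.5 * literacy + rng.normal(size=n)   # M, partly driven by X
pseudo = -0.6 * trust + rng.normal(size=n)    # Y, driven only by M here

def ols(y, *xs):
    """Fit an OLS regression of y on the given predictors plus an intercept."""
    return sm.OLS(y, sm.add_constant(np.column_stack(xs))).fit()

total = ols(pseudo, literacy)        # path c: total effect of X on Y
a_path = ols(trust, literacy)        # path a: effect of X on M
full = ols(pseudo, literacy, trust)  # paths c' and b: X and M together

print(f"total effect c      = {total.params[1]:+.2f}")
print(f"direct effect c'    = {full.params[1]:+.2f}")  # shrinks if mediated
print(f"indirect effect a*b = {a_path.params[1] * full.params[2]:+.2f}")
```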

In a recent experimental study, researchers concluded that courses directly promoting a motivational state of distrust in pseudoscience reduced those beliefs, whereas general education classes on critical thinking and research methods did not (Dyer & Hall, 2019); additional experimental pre-test/post-test studies also suggest this mediated relationship (Franz & Green, 2013; Morier & Keeports, 1994; Wilson, 2018). Accordingly, contrary to the idea of a backfire effect caused by misguided trust in science, low confidence in science and disregard for the values of scientific inquiry are good predictors of the endorsement of pseudoscience (Lewandowsky & Oberauer, 2021; Omer, Salmon, Orenstein, deHart, & Halsey, 2009).

The confusion-based account also exhibits limitations regarding experimental results on the role of scientists as judgmental shortcuts. Sloman and Rabb (2016) conducted a series of experiments to test how people behave under conditions of division of cognitive labour and epistemic dependence. Their results show that knowing that scientists understand a phenomenon gives individuals the sense that they understand it better themselves, but only when they have ostensible access to the scientists’ explanations and accept them. Thus, these individuals were not blinded by scientists’ discourse and aesthetics; instead, they tended to use experts’ information as an echo chamber for their subjective assessment of scientific content.

Brewer (2013) evaluated the effect of three versions of a news story about paranormal investigators: one framed in terms of traditional supernaturalism, a second with a pseudoscientific rationale, and a third presenting a discrediting scientific critique. Although the pseudoscientific rationale increased the parapsychologists’ perceived credibility, it had no significant effect on the endorsement of paranormal beliefs. Garrett and Cutting (2017) conducted a similar experiment, replicating the lack of differences between the three versions regarding the perceived believability of the paranormal story. Likewise, other studies support that, although science mimicry tends to increase sources’ credibility, it does not promote belief change (Bromme, Scharrer, Stadtler, Hömberg, & Torspecken, 2015; Knobloch-Westerwick, Johnson, Silver, & Westerwick, 2015; Thomm & Bromme, 2012; Zaboski & Therriault, 2020), as the effect of scientific jargon is mediated by its fit with previous beliefs (Scurich & Shniderman, 2014) and is not persuasive by itself (Gruber & Dickerson, 2012; Hook & Farah, 2013; Michael, Newman, Vuorre, Cumming, & Garry, 2013).

An analysis of the controversy concerning so-called “neuromania” (Legrenzi & Umiltà, 2011) helps to better understand these results. Neuroscientific research is particularly fascinating to the general public, so Weisberg, Keil, Goodstein, Rawson, and Gray (2008) conducted an experiment on the seductive allure of explanations containing irrelevant neuroscientific information. As expected, irrelevant information couched in neuroscientific jargon was preferred by participants, regardless of the quality of the underlying logic of the explanation. Other experiments have found the same effect (Fernandez-Duque, Evans, Christian, & Hodges, 2015; Weisberg, Taylor, & Hopkins, 2015), also using neuroimaging (McCabe & Castel, 2008)—although the neuroimaging studies have low reproducibility rates (Schweitzer, Baker, & Risko, 2013). Pseudoscientific neuro-jargon is particularly effective for psychological explanations in comparison with irrelevant social science and natural science jargon (Fernandez-Duque et al., 2015; Weisberg, Hopkins, & Taylor, 2018), perhaps due to the authority ascribed to neuroscience when explaining behaviour (Racine, Waldman, Rosenberg, & Illes, 2010).

Nevertheless, Tabacchi and Cardaci (2016) found that the allure of neuroscientific jargon is mediated by the wording of the question. Previous experiments had asked participants how “satisfactory” they considered the explanations on a 7-point Likert scale, which elicits an aesthetic judgment, whereas in Tabacchi and Cardaci (2016) participants had to choose the correct explanation from two alternatives, a dichotomous measure of truthfulness. In this psychometric context, the allure of explanations with vacuous pseudoscientific jargon was not observed, and, as no additional information was given to the participants, it is unlikely that pseudoscientific jargon fooled their trust in science in prior experiments. The allure of scientific jargon seems to depend on how individuals are asked to assess evidence in different motivational contexts—i.e., on how satisfactory scientific jargon is in psychological terms or on how believable it is in epistemic terms.

In sum, current evidence consistently suggests that uncritical acceptance of pseudoscientific information is mediated by perceived explanatory satisfaction and adjustment to previous beliefs, not by misguided trust in science. Nevertheless, the criticism expressed in this section regarding the confusion-based approach does not fully invalidate it, as confusion could still be a relevant variable within specific groups and contexts. More research is needed to determine whether the two models are complementary rather than antithetical.

The Explanation-Polarisation Model

In this section, I develop an explanatory framework for the endorsement of pseudoscience detached from confusion-based conceptions. EX-PO starts from the usual definition of pseudoscience, which is based on science mimicry, but it does not explain the endorsement of pseudoscience by means of a faulty distinction between science and pseudoscience, thus conferring another role on science mimicry3. The EX-PO model takes pseudoscience as a set of flawed but appealing explanations, adding a relational aspect by including psychological phenomena related to group polarisation.

Rekker (2021) has proposed a relevant distinction between psychological and ideological science rejection. On the one hand, psychological rejection of science takes place implicitly and arises from individuals’ tendency to favour information that maintains their status in an affinity group. On the other hand, ideological rejection (religious, political, etc.) consists of explicit contestation of science through arguments derived from complex doctrines—for example, those of climate change countermovement organizations (McKie, 2019)4. EX-PO constitutes a model for the psychological rejection of science, as the main unit of analysis of the model is the interaction between the individual and the pseudoscientific doctrine, considering both the psychological predispositions and the rhetorical devices involved in that interaction. In this regard, EX-PO explains the endorsement of pseudoscience through the supply of psychologically satisfactory explanations and the demand for profitable ideas that conform to rewarding social norms. There is a general tendency in all individuals to favour mechanisms and categorisations, and individuals also tend to hold desirable, concerning, and useful beliefs. The rhetorical devices of pseudoscience adapt to this psychological framework to gain support.

EX-PO constitutes an explanatory framework for the spread of the two major forms of pseudoscience, pseudo-theory promotion and science denial (Fasce & Picó, 2019b; Hansson, 2017), although each form shows its own characteristics5. The explanatory satisfaction offered by pseudo-theory promotion, due to its greater doctrinal content, should be higher than that of science denialism, whose endorsement, in turn, would be more influenced by ideology-driven group polarisation and direct confrontation with scientific information (Lewandowsky, Pilditch, Madsen, Oreskes, & Risbey, 2019; Medimorec & Pennycook, 2015). For example, as climate change denialists cannot offer satisfactory alternative explanations, they need to simulate coherence through conspiracist discourse reinforced by group behaviour (Lewandowsky, Cook, & Lloyd, 2018).

The Explanatory Allure of Pseudoscience: Empty Labels and Illusory Mechanisms

I will start by elucidating the “explanation” aspect of EX-PO. As already mentioned, interpersonal disagreement about facts is highly influenced by the explanatory allure of pseudoscience, as well as by individual cognitive predispositions toward anti-scientific beliefs such as overreliance on intuitive thinking (Pennycook, Cheyne, Seli, Koehler, & Fugelsang, 2012), ontological confusions6 (Lindeman, Svedholm-Häkkinen, & Lipsanen, 2015), pseudo-profound bullshit receptivity (Pennycook, Cheyne, Barr, Koehler, & Fugelsang, 2015), and causal illusions (Torres, Barberia, & Rodríguez-Ferreiro, 2020).

Explanations matter because they feel intrinsically valuable and serve pragmatic concerns such as prediction and control (Lombrozo, 2011). Nevertheless, the criteria identified in academic publications as explanatory virtues hardly predict positive explanation assessment in naturalistic settings (Lombrozo, 2016; Zemla, Sloman, Bechlivanidis, & Lagnado, 2017). For example, although scholars often consider abstract and simple explanations preferable, people tend to favour less generalisable explanations (Bechlivanidis, Lagnado, Zemla, & Sloman, 2017; Khemlani, Sussman, & Oppenheimer, 2011) and to explain inconsistencies by positing additional causes rather than disputing premises, thus preferring explanations that involve complex causal structures (Khemlani & Johnson-Laird, 2011). In this regard, pseudoscience would exploit several sources of subjective explanatory satisfaction, such as flawed categorisations and mechanistic explanations7.

Categorical language supports particularly strong inferences, leading people to form representations in more essentialist (Gelman, Ware, & Kleinberg, 2010), categorical (Lupyan, 2012), and prototypical terms (Lupyan, 2017). People learn named categories more quickly (Lupyan, Rakison, & McClelland, 2007), are more likely to agree with categorical statements about causes and features (Ahn, Taylor, Kato, Marsh, & Bloom, 2013; Hemmatian & Sloman, 2018), and find explanations that include sharp and easily recognisable category labels significantly more satisfying in psychological terms (Giffin, Wilkenfeld, & Lombrozo, 2017).

There are numerous examples of insubstantial pseudoscientific labels, such as “cell memory”, “energetic blockage”, “vertebral subluxation”, “detoxification”, “qi deficiency”, and “meta-model”. Parapsychology is particularly interesting in this regard, as it expands the categories of folk paranormal beliefs: where a folk paranormal believer experiences a “ghost”, a parapsychologist may see a “poltergeist”, an “apparitional experience”, an “ectoplasm”, a “psychophony”, an “orb”, etc. Of course, scientists also use a large number of complex categories, but these conceptual networks are typically guided by evidence (i.e., scientists tend to reject unfounded categories) and parsimony (i.e., scientists tend to reject unnecessary categories). Categorisations are thus not problematic per se; rather, both scientific and unscientific categorisations share an appeal of easy applicability at the level of the individual recipient.

As outlined above, mechanistic explanations also have a relevant role within EX-PO. People have a strong preference for explanations that invoke causal mechanisms, perhaps driven by a desire to identify as many causes of an effect as possible (Mills, Sands, Rowles, & Campbell, 2019; Zemla et al., 2017). As with categories, mechanistic explanations are also widely used among scientists, although mechanisms in pseudoscience have already been refuted or are construed in a way that makes them untestable. Pseudoscientific doctrines are replete with flawed mechanistic explanations and processes, especially in comparison with other forms of unwarranted beliefs, such as paranormal beliefs and conspiracy theories. Examples include the five “biological laws” that rule the emotional aetiology of disease in German new medicine, kinesiology’s viscerosomatic relationship, improved blood flow and oxygenation in tissues by magnetic stimulation, memories and experiences passing down through generations by DNA or morphic resonance, and homeopathic dynamisation by dilution and succussion.

Research supports that the satisfying effect of neuroscientific explanations, discussed in the previous section, is due to the perception of mechanistic explanations. Rhodes, Rodríguez, and Shah (2014) found that neuroscientific information boosts self-assessed understanding of mechanisms by providing perceived insight into causal chains and categorisations of psychological phenomena. Hopkins, Weisberg, and Taylor (2016) confirmed these findings, extending the allure of mechanistic information to other fields, such as physics, chemistry, and biology. Flawed mechanistic explanations constitute a confirmed cause of the illusion of explanatory depth (IOED; Rozenblit & Keil, 2002), which occurs when people believe they understand a process more deeply than they actually do. This overconfidence effect has been empirically associated with other kinds of unwarranted beliefs, such as conspiracy theories (Vitriol & Marsh, 2018), and pseudoscientific rhetoric may be particularly effective in generating it among supporters—indeed, Scharrer, Rupieper, Stadtler, and Bromme (2017) suggest that even legitimate, although oversimplified, science communication causes overconfidence effects among laypeople.

IOED has been identified only in knowledge areas that involve complex theorisation about causal networks, such as biological or physical processes (e.g., how a zipper, a toaster, the tides, or the digestive system works); it does not take place across other types of knowledge, such as declarative or narrative knowledge (e.g., how to make cookies, or the names of capital cities). Several factors converge to create a strong illusion of depth for mechanistic explanations. First, individuals have less experience in representing, expressing, and testing such explanations in comparison with other kinds of knowledge (Rozenblit & Keil, 2002; Wilson & Keil, 1998). Second, because people usually rely on others’ understanding when assessing mechanistic explanations, they tend to overestimate their own understanding of mechanisms relative to others’ (Fisher, Goddu, & Keil, 2015). Third, explanations are layered, so there are always higher and lower levels of analysis, and people often confuse their superficial insights with a deeper understanding of how mechanisms work (Alter, Oppenheimer, & Zemla, 2010). As a result, IOED is based on a failure to construct accurate mental representations at an appropriate level of construal, thus confusing the metacognitive experience of understanding with the capacity to offer a proper explanation.
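
To make the measurement of this effect concrete, here is a minimal sketch of how IOED is typically scored in the Rozenblit and Keil (2002) paradigm: participants rate their understanding of a device or process, attempt a full mechanistic explanation, and then re-rate; the pre–post drop is the illusion. The ratings below are invented for illustration.

```python
# Scoring IOED as the drop in self-rated understanding (1-7 scale) after
# attempting a full mechanistic explanation. Ratings are invented data.
import statistics

ratings = {
    # item: (self-rating before explaining, self-rating after explaining)
    "zipper": (6, 3),
    "toaster": (5, 3),
    "tides": (5, 2),
    "digestive system": (6, 4),
}

ioed = {item: before - after for item, (before, after) in ratings.items()}
print(ioed)                              # per-item overconfidence
print(statistics.mean(ioed.values()))    # mean IOED for this participant
```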

Although IOED is usually overridden by debunking information (Kowalski & Taylor, 2017) and iterative failure (Mills & Keil, 2004), the current literature describes three variables that explain why this effect can be so persistent:

  • Concern: people overestimate their ability to justify their own positions, an illusion of argument justification that sustains overconfidence on issues that matter to them (Fisher & Keil, 2014).

  • Short-term usefulness: explanations that are perceived as useful in the present context are judged to be of higher quality (Vasilyeva, Wilkenfeld, & Lombrozo, 2017).

  • Social desirability: the illusion of explanatory depth increases for knowledge that is perceived as socially valued (Gaviria, Corredor, & Zuluaga-Rendón, 2017).

Accordingly, a successful pseudoscience should be concerning, useful in the short term, and socially desirable for its supporters. How pseudoscientific doctrines generate this specific motivational state is accounted for by EX-PO’s “polarisation” aspect.

The Social Dimension of Pseudoscience: Ingroup Polarisation and Intergroup Clash of Cognitions

Recent research has highlighted how today’s societies are fractured by partisan identities and feedback loops of accommodating information, identity narratives, and anti-expertise (Kreiss, 2019; Lewandowsky, Ecker, & Cook, 2017; Pariser, 2011). For example, previous results have shown that the information shared on social media is more radical and partisan than that on open websites (Faris et al., 2017), thus facilitating the spread of appealing falsehoods (Vosoughi, Roy, & Aral, 2018). Due to its high prevalence in the public sphere and its alarming implications, the struggle between groups of pseudoscientific believers and critical thinkers is increasingly polarised, establishing an intergroup relationship dominated by distrust, competition, and mutual derogation (e.g., Cano-Orón, 2019).

Polarisation outbreaks occur when the empirical matter in dispute is crucial to the definition of a social identity (Fasce, Adrián-Ventura, Lewandowsky, & van der Linden, 2021; Sloman & Fernbach, 2017). Under these conditions, intergroup threats foster self-uncertainty, and self-uncertainty motivates radical identification with groups that provide distinctive normative beliefs and higher identification-contingent uncertainty reduction (Hogg, Meehan, & Farquharson, 2010; Hogg & Wagoner, 2017). As a consequence, self-uncertain people with a sense of group belongingness tend to increase their endorsement of anti-scientific beliefs (van Prooijen, 2016; van Prooijen & Jostmann, 2013), and the sense of community and perceived existential threats are well-documented root factors of pseudoscientific beliefs and conspiracy theories (Fasce, Adrián-Ventura, et al., 2021; Franks, Bangerter, Bauer, Hall, & Noort, 2017; van Prooijen, 2020). Pseudoscientific beliefs have been shown to be influenced by a tendency to accept beliefs on the basis of short-term interpersonal benefits (Fasce, Adrián-Ventura, & Avendaño, 2020), which leads individuals to assess scientific evidence depending on whether or not scientists agree with their identity and attitudes (Bromme et al., 2015; Giese, Neth, Moussaïd, Betsch, & Gaissmaier, 2020; Kahan, Jenkins‐Smith, & Braman, 2011; Knobloch-Westerwick et al., 2015; Scurich & Shniderman, 2014).

Recent results highlight how social groups constructed around pseudoscientific beliefs are often exploited by populist and authoritarian political movements exhibiting radical ideologies (Fasce et al., 2020; Mede & Schäfer, 2020)8. These forms of anti-scientific politics exploit the tension between short-term ingroup rationality and long-term truth-seeking cognition, in which evidence-recognition is crucial, making individuals exposed to partisan information more polarised than non-exposed ones (Druckman & Bolsen, 2011; Nyhan & Reifler, 2015; Nyhan, Reifler, Richey, & Freed, 2014; Palm, Lewis, & Feng, 2017). As a result, politically charged pseudoscientific doctrines such as climate change denial or the anti-vaccination movement are hardly overridden by literacy, analytical thinking, and information (Kahan, 2013; Kahan et al., 2012; Lewandowsky & Oberauer, 2016; Nyhan & Reifler, 2015; Nyhan et al., 2014).

These ideology-driven phenomena have been explained by the presence of motivated reasoning (Kahan, 2016), although recent research has questioned the validity of this interpretation by highlighting potential confounding factors and the elusive nature of the backfire effect (Druckman & McGrath, 2019; Tappin, Pennycook, & Rand, 2021; Wood & Porter, 2019). This legitimate dispute regarding the underlying mechanism of group polarisation over scientific information does not undermine the fact that group polarisation and perceived social consensus play a pivotal role in the spread of pseudoscience (Bromme et al., 2015; Giese et al., 2020; Kahan et al., 2011; Knobloch-Westerwick et al., 2015; Lewandowsky, Cook, Fay, & Gignac, 2019; Lewandowsky, Gignac, & Vaughan, 2013; Thomm & Bromme, 2012; van der Linden, Leiserowitz, & Maibach, 2018). Accordingly, EX-PO is compatible with any mechanism of group polarisation, including identity-related motivated reasoning, misperception of scientific consensus, and Bayesian belief updating (e.g., Cook & Lewandowsky, 2016).
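
To illustrate why polarisation is compatible even with fully Bayesian updating, the toy sketch below reduces the Bayesian-network account of Cook and Lewandowsky (2016) to a single parameter: two agents with the same prior receive the same assertion but differ in how reliable they believe the source to be, so their posteriors move in opposite directions. The numbers are illustrative assumptions, not fitted values from that paper.

```python
# Toy Bayesian polarisation: same prior, same message, different perceived
# source reliability r, with P(assert H | H) = r and P(assert H | not H) = 1 - r.
def update(prior_h: float, r: float) -> float:
    """Posterior P(H) after the source asserts H, by Bayes' rule."""
    return (r * prior_h) / (r * prior_h + (1 - r) * (1 - prior_h))

prior = 0.5                         # both agents start undecided about H
trusting = update(prior, r=0.9)     # deems the source reliable
distrusting = update(prior, r=0.2)  # deems the source deceptive

print(f"trusting agent:    {trusting:.2f}")     # 0.90: moves toward H
print(f"distrusting agent: {distrusting:.2f}")  # 0.20: moves away from H
```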

Besides polarisation over already internalised pseudoscientific beliefs, ingroup pressure may increase the motivation to adopt beliefs related to prior beliefs by family resemblance. Individuals who endorse one conspiracy theory tend to endorse new ones, even if they are completely fictitious (Swami et al., 2011) or mutually contradictory (Wood, Douglas, & Sutton, 2012), and the same pattern has been observed regarding pseudoscience (Lewandowsky, Cook, & Lloyd, 2018). This can also be seen in the high internal consistencies of psychometric scales: α = 0.92 for paranormal beliefs (Tobacyk, 2004), α = 0.93 for conspiracy beliefs (Brotherton, French, & Pickering, 2013), and α = 0.92 for pseudoscientific beliefs (Fasce, Avendaño, & Adrián-Ventura, 2021).
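
For readers unfamiliar with the statistic, these internal consistencies are Cronbach’s alpha values, computed as α = k/(k−1) · (1 − Σ item variances / variance of total scores). The sketch below implements this formula on an invented response matrix; it is illustrative only and does not reproduce the cited scales.

```python
# Cronbach's alpha on an invented respondents-by-items response matrix.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the sum scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                 # shared belief factor
items = latent + 0.5 * rng.normal(size=(200, 10))  # 10 correlated items
print(f"alpha = {cronbach_alpha(items):.2f}")      # high (~0.97) here
```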

As rejection of a pseudoscience is not the same as acceptance of the relevant science, critical thinking should not be defined as the mere absence of EX-PO dynamics. Critical thinkers also engage in social dynamics that strengthen their conceptions and have their own specific, though moderate, sources of bias. For example, even though critical thinking is positively correlated with analytical reasoning (Svedholm-Häkkinen & Lindeman, 2013), Pennycook et al. (2012) found that around 40% of critical thinkers can be characterised as non-analytical reasoners. Moreover, these non-analytical critical thinkers are prone to endorse the kind of ontological confusions that predispose people toward paranormal beliefs (Lindeman, Svedholm-Häkkinen, & Riekki, 2016). These results are in line with Norenzayan and Gervais (2013): critical thinking may originate not only from individual cognitive styles, but also from a lack of cultural input.

Potential Interventions

Although the two conceptions, the confusion-based account and EX-PO, are not contradictory, their assumptions and implications are not equivalent, particularly regarding how to face the problem of pseudoscience. Current prevention strategies deployed by organisations of critical thinkers are often based on the assumptions of the confusion-based interpretation, so they focus on general dissemination aimed at improving scientific literacy. In view of the current upsurge of anti-scientific groups and politics—e.g., anti-scientific conspiracy ideation around COVID-19—this has proven insufficient. EX-PO does not emphasize direct science dissemination, suggesting instead that the problem of pseudoscience might be better addressed through social interventions intended to reduce the allure of subjective explanatory satisfaction and group polarisation.

Accordingly, efficient interventions to reduce the endorsement of pseudoscience should take into account comprehensive motivational strategies, such as inoculation messages exposing misleading argumentation techniques (Lewandowsky & van der Linden, 2021) as well as worldview and values affirmation (Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012). This is important in order to boost the perception of consensus cues and to decrease the false consensus effect between experts and the public. Another potential intervention could focus on echo chambers—an interesting framework for deploying mechanisms of containment against these detrimental information architectures can be found in Lewandowsky et al. (2017). Even though the degree of ideological segregation in social media usage should not be overestimated—echo chambers and filter bubbles are not larger online than offline (Barberá, Jost, Nagler, Tucker, & Bonneau, 2015; Dubois & Blank, 2018)—their influence on individuals’ receptivity toward attitude-consistent misinformation is supported by evidence (Giese et al., 2020).

Concluding Remarks

To facilitate future research on EX-PO, I conclude this article by offering a summary of the model. Unwarranted beliefs increase their explanatory appeal through science mimicry, on the basis of spurious categorisations and flawed mechanistic explanations; this is the basic role of the trappings of science. The appeal of pseudoscience to isolated individuals therefore comes through two pathways: explanatory satisfaction and more general individual cognitive predispositions. Group polarisation takes place after the aggregation and organisation of believers around pseudoscientific doctrines, and this process reinforces and promotes pseudoscientific beliefs through two pathways: reinforcement of already internalised pseudoscientific beliefs and ingroup pressure to accept new pseudoscientific beliefs resembling those already internalised. Group polarisation boosts the explanatory allure of pseudoscience by helping pseudoscientific doctrines be perceived as concerning, useful in ingroup terms, and socially desirable.

Notes

1) An apparent exception is Bensley, Lilienfeld, and Powell (2014); however, that sample was entirely composed of psychology students, and the questionnaire asked about easily recognisable fields directly related to their field of study, such as psychology itself, cognitive behavioural therapy, neuroscience, phrenology, and parapsychology.

2) There are reasons to think that a lack of generalised interpersonal trust may be a confounding variable explaining this association in the case of conspiracy theories.

3) There is no contradiction between a definition of pseudoscience which includes science mimicry as a necessary characteristic, and EX-PO. The definition of pseudoscience is a philosophical issue, whereas the spread of pseudoscientific beliefs is a socio-psychological matter. Although it is widely accepted that pseudoscience is defined by science mimicry, EX-PO addresses its function. The exploitation of the authority of science is not integral to the conventional definition of pseudoscience: these are mutually independent issues.

4) There is mutual feedback between both types of science rejection: ideologues generate explicit arguments that exploit already existing group identities, whereas group identity constrains ideology’s persuasiveness by determining individuals’ receptivity.

5) “Pseudo-theory promotion” refers to prototypical pseudosciences, which are primarily based on the promotion of a complicated doctrine—e.g., morphic fields, German new medicine, cellular memories, and chiropractic. In contrast, science deniers deploy their rhetorical devices at the level of controversies, casting doubt on well-established scientific theories—for example, denial of the science of climate change, GMOs, and vaccination.

6) An ontological confusion is the attribution of a specific feature of some stratum of reality, such as the psychological, to an entity belonging to a different stratum, such as the physical. For example, “the moon aims to move forward”.

7) Further psychological research could integrate other strategies within EX-PO, such as self-validating belief systems (Boudry & Braeckman, 2012), ad-hoc reasoning (Boudry, 2013), or less aggressive communication styles (König & Jucks, 2019).

8) Despite the fact that the unit of analysis of EX-PO is located at the individual level and, therefore, the model does not directly account for political or sociological variables, macrosociological polarisation can be considered a proxy for perceptions and feelings of polarisation on the individual level.

Funding

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 964728 (JITSUVAX).

Acknowledgments

The author has no additional (i.e., non-financial) support to report.

Competing Interests

The author has declared that no competing interests exist.

References

  • Aarnio, K., & Lindeman, M. (2005). Paranormal beliefs, education, and thinking styles. Personality and Individual Differences, 39(7), 1227-1236. https://doi.org/10.1016/j.paid.2005.04.009

  • Ahn, W.-k., Taylor, E., Kato, D., Marsh, J., & Bloom, P. (2013). Causal essentialism in kinds. Quarterly Journal of Experimental Psychology, 66(6), 1113-1130. https://doi.org/10.1080/17470218.2012.730533

  • Allchin, D. (2012). Science con-artists. The American Biology Teacher, 74(9), 661-666. https://doi.org/10.1525/abt.2012.74.9.13

  • Alter, A., Oppenheimer, D., & Zemla, J. (2010). Missing the trees for the forest: A construal level account of the illusion of explanatory depth. Journal of Personality and Social Psychology, 99(3), 436-451. https://doi.org/10.1037/a0020218

  • Barberá, P., Jost, J., Nagler, J., Tucker, J., & Bonneau, R. (2015). Tweeting from left to right: Is online political communication more than an echo chamber? Psychological Science, 26(10), 1531-1542. https://doi.org/10.1177/0956797615594620

  • Bechlivanidis, C., Lagnado, D., Zemla, J., & Sloman, S. (2017). Concreteness and abstraction in everyday explanation. Psychonomic Bulletin & Review, 24(5), 1451-1464. https://doi.org/10.3758/s13423-017-1299-3

  • Bensley, D., Lilienfeld, S., & Powell, L. (2014). A new measure of psychological misconceptions: Relations with academic background, critical thinking, and acceptance of paranormal and pseudoscientific claims. Learning and Individual Differences, 36(7), 9-18. https://doi.org/10.1016/j.lindif.2014.07.009

  • Boudry, M. (2013). The hypothesis that saves the day: Ad hoc reasoning in pseudoscience. The Log Analyst, 56(223), 245-258.

  • Boudry, M., & Braeckman, J. (2012). How convenient! The epistemic rationale of self-validating belief systems. Philosophical Psychology, 25(3), 341-364. https://doi.org/10.1080/09515089.2011.579420

  • Brewer, P. (2013). The trappings of science: Media messages, scientific authority, and beliefs about paranormal investigators. Science Communication, 35(3), 311-333. https://doi.org/10.1177/1075547012454599

  • Bromme, R., Scharrer, L., Stadtler, M., Hömberg, J., & Torspecken, R. (2015). Is it believable when it’s scientific? How scientific discourse style influences laypeople’s resolution of conflicts. Journal of Research in Science Teaching, 52(1), 36-57. https://doi.org/10.1002/tea.21172

  • Brotherton, R., French, C., & Pickering, A. (2013). Measuring belief in conspiracy theories: The generic conspiracist beliefs scale. Frontiers in Psychology, 4, Article 279. https://doi.org/10.3389/fpsyg.2013.00279

  • Cano-Orón, L. (2019). A Twitter campaign against pseudoscience: The sceptical discourse on complementary therapies in Spain. Public Understanding of Science, 28(6), 679-695. https://doi.org/10.1177/0963662519853228

  • Cook, J., & Lewandowsky, S. (2016). Rational irrationality: Modeling climate change belief polarization using Bayesian networks. Topics in Cognitive Science, 8(1), 160-179. https://doi.org/10.1111/tops.12186

  • DeRobertis, M., & Delaney, P. (2000). A second survey of the attitudes of university students to Astrology and Astronomy. The Journal of the Royal Astronomical Society of Canada, 94, 112-122.

  • Druckman, J. N., & Bolsen, T. (2011). Framing, motivated reasoning, and opinions about emergent technologies. Journal of Communication, 61(4), 659-688. https://doi.org/10.1111/j.1460-2466.2011.01562.x

  • Druckman, J. N., & McGrath, M. (2019). The evidence for motivated reasoning in climate change preference formation. Nature Climate Change, 9, 111-119. https://doi.org/10.1038/s41558-018-0360-1

  • Dubois, E., & Blank, G. (2018). The echo chamber is overstated: The moderating effect of political interest and diverse media. Information Communication and Society, 21(5), 729-745. https://doi.org/10.1080/1369118X.2018.1428656

  • Dyer, K., & Hall, R. (2019). Effect of critical thinking education on epistemically unwarranted beliefs in college students. Research in Higher Education, 60(3), 293-314. https://doi.org/10.1007/s11162-018-9513-3

  • Faris, R., Roberts, H., Etling, B., Bourassa, N., Zuckerman, E., & Benkler, Y. (2017). Partisanship, propaganda, and disinformation: Online media and the 2016 U.S. Presidential Election (Berkman Klein Center for Internet & Society Research Paper). http://nrs.harvard.edu/urn-3:HUL.InstRepos:33759251

  • Fasce, A. (2020). Are pseudosciences like seagulls? A discriminant metacriterion facilitates the solution of the demarcation problem. International Studies in the Philosophy of Science, 32(3-4), 155-175. https://doi.org/10.1080/02698595.2020.1767891

  • Fasce, A., Adrián-Ventura, J., & Avendaño, D. (2020). Do as the Romans do: On the authoritarian roots of pseudoscience. Public Understanding of Science, 29(6), 597-613. https://doi.org/10.1177/0963662520935078

  • Fasce, A., Adrián-Ventura, J., Lewandowsky, S., & van der Linden, S. (2021). Science through a tribal lens: A group-based account of polarization over scientific facts. Group Processes & Intergroup Relations. Advance online publication. https://doi.org/10.1177/13684302211050323

  • Fasce, A., & Picó, A. (2019a). Science as a vaccine: The relation between scientific literacy and unwarranted beliefs. Science & Education, 28(1-2), 109-125. https://doi.org/10.1007/s11191-018-00022-0

  • Fasce, A., & Picó, A. (2019b). Conceptual foundations and validation of the Pseudoscientific Belief Scale. Applied Cognitive Psychology, 33(4), 617-628. https://doi.org/10.1002/acp.3501

  • Fasce, A., Avendaño, D., & Adrián-Ventura, J. (2021). Revised and short versions of the Pseudoscientific Belief Scale. Applied Cognitive Psychology, 35(3), 828-832. https://doi.org/10.1002/acp.3811

  • Fernandez-Duque, D., Evans, J., Christian, C., & Hodges, S. (2015). Superfluous neuroscience information makes explanations of psychological phenomena more appealing. Journal of Cognitive Neuroscience, 27(5), 926-944. https://doi.org/10.1162/jocn_a_00750

  • Fisher, M., Goddu, M. K., & Keil, F. C. (2015). Searching for explanations: How the Internet inflates estimates of internal knowledge. Journal of Experimental Psychology: General, 144(3), 674-687. https://doi.org/10.1037/xge0000070

  • Fisher, M., & Keil, F. (2014). The illusion of argument justification. Journal of Experimental Psychology: General, 143(1), 425-433. https://doi.org/10.1037/a0032234

  • Franks, B., Bangerter, A., Bauer, M. W., Hall, M., & Noort, M. C. (2017). Beyond “monologicality”? Exploring conspiracist worldviews. Frontiers in Psychology, 8, Article 861. https://doi.org/10.3389/fpsyg.2017.00861

  • Franz, T., & Green, K. (2013). The impact of an interdisciplinary learning community course on pseudoscientific reasoning in first-year science students. The Journal of Scholarship of Teaching and Learning, 13(5), 90-105. https://scholarworks.iu.edu/journals/index.php/josotl/article/view/3447

  • Garrett, B., & Cutting, R. (2017). Magical beliefs and discriminating science from pseudoscience in undergraduate professional students. Heliyon, 3(11), Article e00433. https://doi.org/10.1016/j.heliyon.2017.e00433

  • Gaviria, C., Corredor, J., & Zuluaga-Rendón, Z. (2017). “If it matters, I can explain it”: Social desirability of knowledge increases the illusion of explanatory depth. In G. Gunzelmann, A. Howes, T. Tenbrink, & E. Davelaar (Eds.), Proceedings of the 39th Annual Meeting of the Cognitive Science Society (CogSci), London, UK (pp. 2073–2078).

  • Gaze, C. (2014). Popular psychological myths: A comparison of students’ beliefs across the psychology major. The Journal of Scholarship of Teaching and Learning, 14(2), 46-60. https://doi.org/10.14434/josotl.v14i2.3931

  • Gelman, S., Ware, E., & Kleinberg, F. (2010). Effects of generic language on category content and structure. Cognitive Psychology, 61(3), 273-301. https://doi.org/10.1016/j.cogpsych.2010.06.001

  • Giese, H., Neth, H., Moussaïd, M., Betsch, C., & Gaissmaier, W. (2020). The echo in flu-vaccination echo chambers: Selective attention trumps social influence. Vaccine, 38(8), 2070-2076. https://doi.org/10.1016/j.vaccine.2019.11.038

  • Giffin, C., Wilkenfeld, D., & Lombrozo, T. (2017). The explanatory effect of a label: Explanations with named categories are more satisfying. Cognition, 168, 357-369. https://doi.org/10.1016/j.cognition.2017.07.011

  • Gruber, D., & Dickerson, J. (2012). Persuasive images in popular science: Testing judgments of scientific reasoning and credibility. Public Understanding of Science, 21(8), 938-948. https://doi.org/10.1177/0963662512454072

  • Hansson, S. O. (2009). Cutting the Gordian Knot of demarcation. International Studies in the Philosophy of Science, 23(3), 237-243. https://doi.org/10.1080/02698590903196007

  • Hansson, S. O. (2017). Science denial as a form of pseudoscience. Studies in History and Philosophy of Science, 63, 39-47. https://doi.org/10.1016/j.shpsa.2017.05.002

  • Hemmatian, B., & Sloman, S. (2018). Community appeal: Explanation without information. Journal of Experimental Psychology: General, 147(11), 1677-1712. https://doi.org/10.1037/xge0000478

  • Hogg, M., Meehan, C., & Farquharson, J. (2010). The solace of radicalism: Self-uncertainty and group identification in the face of threat. Journal of Experimental Social Psychology, 46(6), 1061-1066. https://doi.org/10.1016/j.jesp.2010.05.005

  • Hogg, M., & Wagoner, J. (2017). Uncertainty-identity theory. In V. Zeigler-Hill & T. Shackelfold (Eds.), Encyclopedia of personality and individual differences (pp. 73–76). Springer.

  • Hook, C., & Farah, M. (2013). Look again: Effects of brain images and mind-brain dualism on lay evaluations of research. Journal of Cognitive Neuroscience, 25(9), 1397-1405. https://doi.org/10.1162/jocn_a_00407

  • Hopkins, E., Weisberg, D., & Taylor, J. (2016). The seductive allure is a reductive allure: People prefer scientific explanations that contain logically irrelevant reductive information. Cognition, 155, 67-76. https://doi.org/10.1016/j.cognition.2016.06.011

  • Irwin, H., Dagnall, N., & Drinkwater, K. (2016). Dispositional scepticism, attitudes to science, and belief in the paranormal. Australian Journal of Parapsychology, 16(2), 117-131.

  • Kahan, D. (2013). Ideology, motivated reasoning, and cognitive reflection: An experimental study. Judgment and Decision Making, 8(4), 407-424. https://journal.sjdm.org/13/13313/jdm13313.pdf

  • Kahan, D. (2016). The politically motivated reasoning paradigm, part 1: What politically motivated reasoning is and how to measure it. In R. Scott & S. Kosslyn (Eds.), Emerging trends in the social and behavioral sciences: An interdisciplinary, searchable, and linkable resource (pp. 1–16). John Wiley & Sons.

  • Kahan, D., Jenkins‐Smith, H., & Braman, D. (2011). Cultural cognition of scientific consensus. Journal of Risk Research, 14(2), 147-174. https://doi.org/10.1080/13669877.2010.511246

  • Kahan, D., Peters, E., Wittlin, M., Slovic, P., Ouellette, L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), 732-735. https://doi.org/10.1038/nclimate1547

  • Khemlani, S., & Johnson-Laird, P. (2011). The need to explain. Quarterly Journal of Experimental Psychology, 64(11), 2276-2288. https://doi.org/10.1080/17470218.2011.592593

  • Khemlani, S., Sussman, A., & Oppenheimer, D. (2011). Harry Potter and the sorcerer’s scope: Latent scope biases in explanatory reasoning. Memory & Cognition, 39(3), 527-535. https://doi.org/10.3758/s13421-010-0028-1

  • Knobloch-Westerwick, S., Johnson, B., Silver, N., & Westerwick, A. (2015). Science exemplars in the eye of the beholder: How exposure to online science information affects attitudes. Science Communication, 37(5), 575-601. https://doi.org/10.1177/1075547015596367

  • König, L., & Jucks, R. (2019). Hot topics in science communication: Aggressive language decreases trustworthiness and credibility in scientific debates. Public Understanding of Science, 28(4), 401-416. https://doi.org/10.1177/0963662519833903

  • Kowalski, P., & Taylor, A. (2017). Reducing students’ misconceptions with refutational teaching: For long-term retention, comprehension matters. Scholarship of Teaching and Learning in Psychology, 3(2), 90-100. https://doi.org/10.1037/stl0000082

  • Kreiss, D. (2019). The fragmenting of the civil sphere: How partisan identity shapes the moral evaluation of candidates and epistemology. In J. Mast & J. Alexander (Eds.), Politics of meaning/meaning of politics: Cultural sociology (pp. 223–241). https://doi.org/10.1007/978-3-319-95945-0_13

  • Legrenzi, P., & Umiltà, C. (2011). Neuromania. Oxford University Press.

  • Lewandowsky, S., Cook, J., Fay, N., & Gignac, G. (2019). Science by social media: Attitudes towards climate change are mediated by perceived social consensus. Memory & Cognition, 47(8), 1445-1456. https://doi.org/10.3758/s13421-019-00948-y

  • Lewandowsky, S., Cook, J., & Lloyd, E. (2018). The ‘Alice in Wonderland’ mechanics of the rejection of (climate) science: Simulating coherence by conspiracism. Synthese, 195, 175-196. https://doi.org/10.1007/s11229-016-1198-6

  • Lewandowsky, S., Ecker, U., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369. https://doi.org/10.1016/j.jarmac.2017.07.008

  • Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131. https://doi.org/10.1177/1529100612451018

  • Lewandowsky, S., Gignac, G., & Vaughan, S. (2013). The pivotal role of perceived scientific consensus in acceptance of science. Nature Climate Change, 3(4), 399-404. https://doi.org/10.1038/nclimate1720

  • Lewandowsky, S., & Oberauer, K. (2016). Motivated rejection of science. Current Directions in Psychological Science, 25(4), 217-222. https://doi.org/10.1177/0963721416654436

  • Lewandowsky, S., & Oberauer, K. (2021). Worldview-motivated rejection of science and the norms of science. Cognition, 215, Article 104820. https://doi.org/10.1016/j.cognition.2021.104820

  • Lewandowsky, S., Pilditch, T., Madsen, J., Oreskes, N., & Risbey, J. (2019). Influence and seepage: An evidence-resistant minority can affect public opinion and scientific belief formation. Cognition, 188, 124-139. https://doi.org/10.1016/j.cognition.2019.01.011

  • Lewandowsky, S., & van der Linden, S. (2021). Countering misinformation and fake news through inoculation and prebunking. European Review of Social Psychology, 32(2), 348-384. https://doi.org/10.1080/10463283.2021.1876983

  • Lindeman, M., Svedholm-Häkkinen, A. M., & Lipsanen, J. (2015). Ontological confusions but not mentalizing abilities predict religious belief, paranormal belief, and belief in supernatural purpose. Cognition, 134, 63-76. https://doi.org/10.1016/j.cognition.2014.09.008

  • Lindeman, M., Svedholm-Häkkinen, A., & Riekki, T. (2016). Skepticism: Genuine unbelief or implicit beliefs in the supernatural? Consciousness and Cognition, 42, 216-228. https://doi.org/10.1016/j.concog.2016.03.019

  • Lombrozo, T. (2011). The instrumental value of explanations. Philosophy Compass, 6(8), 539-551. https://doi.org/10.1111/j.1747-9991.2011.00413.x

  • Lombrozo, T. (2016). Explanatory preferences shape learning and inference. Trends in Cognitive Sciences, 20(10), 748-759. https://doi.org/10.1016/j.tics.2016.08.001

  • Lupyan, G. (2012). Linguistically modulated perception and cognition: The label-feedback hypothesis. Frontiers in Psychology, 3, Article 54. https://doi.org/10.3389/fpsyg.2012.00054

  • Lupyan, G. (2017). The paradox of the universal triangle: Concepts, language, and prototypes. Quarterly Journal of Experimental Psychology, 70(3), 389-412. https://doi.org/10.1080/17470218.2015.1130730

  • Lupyan, G., Rakison, D., & McClelland, J. (2007). Language is not just for talking: Redundant labels facilitate learning of novel categories. Psychological Science, 18(12), 1077-1083. https://doi.org/10.1111/j.1467-9280.2007.02028.x

  • Lyddy, F., & Hughes, S. (2012). Attitudes towards psychology as a science and the persistence of psychological misconceptions in psychology undergraduates. In V. Karandashev & S. McCarthy (Eds.), Teaching psychology around the world (Vol. 3, pp. 330–349). Cambridge Scholars Publishing.

  • McCabe, D., & Castel, A. (2008). Seeing is believing: The effect of brain images on judgments of scientific reasoning. Cognition, 107(1), 343-352. https://doi.org/10.1016/j.cognition.2007.07.017

  • McKie, R. (2019). Climate change counter movement neutralization techniques: A typology to examine the climate change counter movement. Sociological Inquiry, 89(2), 288-316. https://doi.org/10.1111/soin.12246

  • Mede, N., & Schäfer, M. (2020). Science-related populism: Conceptualizing populist demands toward science. Public Understanding of Science, 29(5), 473-491. https://doi.org/10.1177/0963662520924259

  • Medimorec, S., & Pennycook, G. (2015). The language of denial: Text analysis reveals differences in language use between climate change proponents and skeptics. Climatic Change, 133(4), 597-605. https://doi.org/10.1007/s10584-015-1475-2

  • Michael, R., Newman, E., Vuorre, M., Cumming, G., & Garry, M. (2013). On the (non)persuasive power of a brain image. Psychonomic Bulletin & Review, 20(4), 720-725. https://doi.org/10.3758/s13423-013-0391-6

  • Mills, C., & Keil, F. (2004). Knowing the limits of one’s understanding: The development of an awareness of an illusion of explanatory depth. Journal of Experimental Child Psychology, 87(1), 1-32. https://doi.org/10.1016/j.jecp.2003.09.003

  • Mills, C., Sands, K., Rowles, S., & Campbell, I. (2019). “I want to know more!”: Children are sensitive to explanation quality when exploring new information. Cognitive Science, 43(1), Article e12706. https://doi.org/10.1111/cogs.12706

  • Morier, D., & Keeports, D. (1994). Normal science and the paranormal: The effect of a scientific method course on students’ beliefs. Research in Higher Education, 35(4), 443-453. https://doi.org/10.1007/BF02496382

  • Norenzayan, A., & Gervais, W. (2013). The origins of religious disbelief. Trends in Cognitive Sciences, 17(1), 20-25. https://doi.org/10.1016/j.tics.2012.11.006

  • Nyhan, B., & Reifler, J. (2015). Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information. Vaccine, 33(3), 459-464. https://doi.org/10.1016/j.vaccine.2014.11.017

  • Nyhan, B., Reifler, J., Richey, S., & Freed, G. (2014). Effective messages in vaccine promotion: A randomized trial. Pediatrics, 133(4), e835-e842. https://doi.org/10.1542/peds.2013-2365

  • Omer, S., Salmon, D., Orenstein, W., deHart, M., & Halsey, N. (2009). Vaccine refusal, mandatory immunization, and the risks of vaccine-preventable diseases. The New England Journal of Medicine, 360(19), 1981-1988. https://doi.org/10.1056/NEJMsa0806477

  • Palm, R., Lewis, G., & Feng, B. (2017). What causes people to change their opinion about climate change? Annals of the Association of American Geographers, 107(4), 883-896. https://doi.org/10.1080/24694452.2016.1270193

  • Pariser, E. (2011). The filter bubble: What the internet is hiding from you. Penguin.

  • Pennycook, G., Cheyne, J., Barr, N., Koehler, D., & Fugelsang, J. (2015). On the reception and detection of pseudo-profound bullshit. Judgment and Decision Making, 10(6), 549-563. https://journal.sjdm.org/15/15923a/jdm15923a.pdf

  • Pennycook, G., Cheyne, J., Seli, P., Koehler, D., & Fugelsang, J. (2012). Analytic cognitive style predicts religious and paranormal belief. Cognition, 123(3), 335-346. https://doi.org/10.1016/j.cognition.2012.03.003

  • Pigliucci, M., & Boudry, M. (2013). Philosophy of pseudoscience: Reconsidering the demarcation problem. University of Chicago Press.

  • Racine, E., Waldman, S., Rosenberg, J., & Illes, J. (2010). Contemporary neuroscience in the media. Social Science & Medicine, 71(4), 725-733. https://doi.org/10.1016/j.socscimed.2010.05.017

  • Rekker, R. (2021). The nature and origins of political polarization over science. Public Understanding of Science, 30(4), 352-368. https://doi.org/10.1177/0963662521989193

  • Rhodes, R., Rodríguez, F., & Shah, P. (2014). Explaining the alluring influence of neuroscience information on scientific reasoning. Journal of Experimental Psychology: Learning, Memory, and Cognition, 40(5), 1432-1440. https://doi.org/10.1037/a0036844

  • Rozenblit, L., & Keil, F. (2002). The misunderstood limits of folk science: An illusion of explanatory depth. Cognitive Science, 26(5), 521-562. https://doi.org/10.1207/s15516709cog2605_1

  • Scharrer, L., Rupieper, Y., Stadtler, M., & Bromme, R. (2017). When science becomes too easy: Science popularization inclines laypeople to underrate their dependence on experts. Public Understanding of Science, 26(8), 1003-1018. https://doi.org/10.1177/0963662516680311

  • Schweitzer, N., Baker, D., & Risko, E. (2013). Fooled by the brain: Re-examining the influence of neuroimages. Cognition, 129(3), 501-511. https://doi.org/10.1016/j.cognition.2013.08.009

  • Scurich, N., & Shniderman, A. (2014). The selective allure of neuroscientific explanations. PLoS One, 9(9), Article e107529. https://doi.org/10.1371/journal.pone.0107529

  • Sloman, S., & Fernbach, P. (2017). The knowledge illusion: Why we never think alone. Riverhead.

  • Sloman, S. A., & Rabb, N. (2016). Your understanding is my understanding: Evidence for a community of knowledge. Psychological Science, 27(11), 1451-1460. https://doi.org/10.1177/0956797616662271

  • Sugarman, H., Impey, C., Buxner, S., & Antonellis, J. (2011). Astrology beliefs among undergraduate students. Astronomy Education Review, 10(1), Article 010101. https://doi.org/10.3847/AER2010040

  • Svedholm-Häkkinen, A., & Lindeman, M. (2013). The separate roles of the reflective mind and involuntary inhibitory control in gatekeeping paranormal beliefs and the underlying intuitive confusions. British Journal of Psychology, 104(3), 303-319. https://doi.org/10.1111/j.2044-8295.2012.02118.x

  • Swami, V., Coles, R., Stieger, S., Pietschnig, J., Furnham, A., Rehim, S., & Voracek, M. (2011). Conspiracist ideation in Britain and Austria: Evidence of a monological belief system and associations between individual psychological differences and real-world and fictitious conspiracy theories. British Journal of Psychology, 102(3), 443-463. https://doi.org/10.1111/j.2044-8295.2010.02004.x

  • Tabacchi, M. E., & Cardaci, M. (2016). Preferential biases for texts that include neuroscientific jargon. Psychological Reports, 118(3), 793-803. https://doi.org/10.1177/0033294116649000

  • Tappin, B. M., Pennycook, G., & Rand, D. G. (2021). Rethinking the link between cognitive sophistication and politically motivated reasoning. Journal of Experimental Psychology: General, 150(6), 1095-1114. https://doi.org/10.1037/xge0000974

  • Thomm, E., & Bromme, R. (2012). “It should at least seem scientific!” Textual features of “scientificness” and their impact on lay assessments of online information. Science Education, 96(2), 187-211. https://doi.org/10.1002/sce.20480

  • Tobacyk, J. (2004). A revised Paranormal Belief Scale. International Journal of Transpersonal Studies, 23(1), 94-98. https://doi.org/10.24972/ijts.2004.23.1.94

  • Torres, M., Barberia, I., & Rodríguez-Ferreiro, J. (2020). Causal illusion as a cognitive basis of pseudoscientific beliefs. British Journal of Psychology, 111(4), 840-852. https://doi.org/10.1111/bjop.12441

  • van der Linden, S., Leiserowitz, A., & Maibach, E. (2018). Scientific agreement can neutralize politicization of facts. Nature Human Behaviour, 2(1), 2-3. https://doi.org/10.1038/s41562-017-0259-2

  • van Prooijen, J. (2016). Sometimes inclusion breeds suspicion: Self‐uncertainty and belongingness predict belief in conspiracy theories. European Journal of Social Psychology, 46(3), 267-279. https://doi.org/10.1002/ejsp.2157

  • van Prooijen, J. (2020). An existential threat model of conspiracy theories. European Psychologist, 25(1), 16-25. https://doi.org/10.1027/1016-9040/a000381

  • van Prooijen, J., & Jostmann, N. (2013). Belief in conspiracy theories: The influence of uncertainty and perceived morality. European Journal of Social Psychology, 43(1), 109-115. https://doi.org/10.1002/ejsp.1922

  • Vasilyeva, N., Wilkenfeld, D., & Lombrozo, T. (2017). Contextual utility affects the perceived quality of explanations. Psychonomic Bulletin & Review, 24(5), 1436-1450. https://doi.org/10.3758/s13423-017-1275-y

  • Vilela, L., & Álvarez, C. (2004). Differences in paranormal beliefs across fields of study from a Spanish adaptation of Tobacyk’s RPBS. The Journal of Parapsychology, 68(2), 405-422.

  • Vitriol, J., & Marsh, J. (2018). The illusion of explanatory depth and endorsement of conspiracy beliefs. European Journal of Social Psychology, 48(7), 955-969. https://doi.org/10.1002/ejsp.2504

  • Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151. https://doi.org/10.1126/science.aap9559

  • Weisberg, D., Hopkins, E., & Taylor, J. (2018). People’s explanatory preferences for scientific phenomena. Cognitive Research: Principles and Implications, 3(1), Article 44. https://doi.org/10.1186/s41235-018-0135-2

  • Weisberg, D., Keil, F., Goodstein, J., Rawson, E., & Gray, J. (2008). The seductive allure of neuroscience explanations. Journal of Cognitive Neuroscience, 20(3), 470-477. https://doi.org/10.1162/jocn.2008.20040

  • Weisberg, D., Taylor, J., & Hopkins, E. (2015). Deconstructing the seductive allure of neuroscience explanations. Judgment and Decision Making, 10(5), 429-441. https://journal.sjdm.org/15/15731a/jdm15731a.pdf

  • Wilson, J. (2018). Reducing pseudoscientific and paranormal beliefs in university students through a course in science and critical thinking. Science & Education, 27(1–2), 183-210. https://doi.org/10.1007/s11191-018-9956-0

  • Wilson, R., & Keil, F. (1998). The shadows and shallows of explanation. Minds and Machines, 8(1), 137-159. https://doi.org/10.1023/A:1008259020140

  • Wood, M., Douglas, K., & Sutton, R. (2012). Dead and alive: Beliefs in contradictory conspiracy theories. Social Psychological & Personality Science, 3(6), 767-773. https://doi.org/10.1177/1948550611434786

  • Wood, T., & Porter, E. (2019). The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Political Behavior, 41(1), 135-163. https://doi.org/10.1007/s11109-018-9443-y

  • Zaboski, B., & Therriault, D. (2020). Faking science: Scientificness, credibility, and belief in pseudoscience. Educational Psychology, 40(7), 820-837. https://doi.org/10.1080/01443410.2019.1694646

  • Zemla, J., Sloman, S., Bechlivanidis, C., & Lagnado, D. (2017). Evaluating everyday explanations. Psychonomic Bulletin & Review, 24(5), 1488-1500. https://doi.org/10.3758/s13423-017-1258-z