Theoretical Articles

Mistrust and Misinformation: A Two-Component, Socio-Epistemic Model of Belief in Conspiracy Theories

Joseph M. Pierre*a

Journal of Social and Political Psychology, 2020, Vol. 8(2), 617–641, https://doi.org/10.5964/jspp.v8i2.1362

Received: 2019-12-02. Accepted: 2020-05-25. Published (VoR): 2020-10-12.

Handling Editor: Müjde Peker, MEF University, Istanbul, Turkey

*Corresponding author at: West Los Angeles VA Medical Center, 11301 Wilshire Blvd., Building 210, Room 15, Los Angeles, CA, 90073, USA. E-mail: joseph.pierre2@va.gov

This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Although conspiracy theories are endorsed by about half the population and occasionally turn out to be true, they are more typically false beliefs that, by definition, have a paranoid theme. Consequently, psychological research to date has focused on determining whether there are traits that account for belief in conspiracy theories (BCT) within a deficit model. Alternatively, a two-component, socio-epistemic model of BCT is proposed that seeks to account for the ubiquity of conspiracy theories, their variance along a continuum, and the inconsistency of research findings likening them to psychopathology. Within this model, epistemic mistrust is the core component underlying conspiracist ideation that manifests as the rejection of authoritative information, focuses the specificity of conspiracy theory beliefs, and can sometimes be understood as a sociocultural response to breaches of trust, inequities of power, and existing racial prejudices. Once voices of authority are negated due to mistrust, the resulting epistemic vacuum can send individuals “down the rabbit hole” looking for answers where they are vulnerable to the biased processing of information and misinformation within an increasingly “post-truth” world. The two-component, socio-epistemic model of BCT argues for mitigation strategies that address both mistrust and misinformation processing, with interventions for individuals, institutions of authority, and society as a whole.

Keywords: conspiracy theory, conspiracist ideation, epistemic mistrust, misinformation, inter-group conspiracy theories, post-truth

Conspiracy theories reject authoritative accounts of reality in favor of some plot involving a group of people with malevolent intent that is deliberately kept secret from the public. Consistent with this definition, a two-component, socio-epistemic model is proposed whereby belief in conspiracy theories (BCT) can be understood as a result of mistrusting conventional knowledge and authoritative accounts (“epistemic mistrust”) along with a biased appraisal of false counter-narratives (“misinformation processing”). This model provides an overarching framework to better understand BCT more universally as a potentially normal socio-epistemic phenomenon, as well as from an individual standpoint based on a focused specificity of mistrust and exposure to misinformation.

Conspiracy Theories as Psychopathology

Over the past decade, conspiracy theories have breached mainstream awareness with the help of the internet and have garnered increasing attention based on their potential to steer both individual behavior and public policy into irrational directions and to cause harm (Bogart et al., 2010; Jolley & Douglas, 2014a; Jolley & Douglas, 2014b; Jolley et al., 2019; Jolley, Meleady, & Douglas, 2020; Oliver & Wood, 2014b). Consequently, research on conspiracy theories has emerged as a rapidly growing field straddling the psychological, political, and informational sciences with discoveries about etiologies and potential interventions that have significant practical relevance.

Much of the psychological research on conspiracy theories to date has focused on the search for associations between conspiracy theories and various psychological traits or cognitive biases that might account for why some people believe in them. For example, BCT has been correlated with higher levels of certain attribution and perceptual biases (Douglas et al., 2016; van Elk, 2015; van Prooijen, Douglas, & De Inocencio, 2018; Wagner-Egger et al., 2018); conjunction fallacies (Brotherton & French, 2014); need for control, certainty, cognitive closure, and uniqueness (Imhoff & Lamberty, 2017; Lantian et al., 2017; Marchlewska, Cichocka, & Kossowska, 2018; Newheiser, Farias, & Tausch, 2011; van Prooijen & Acker, 2015); “bullshit receptivity” and lack of analytic thinking (Hart & Graether, 2018; Ståhl & van Prooijen, 2018; Swami et al., 2014); and paranoia/schizotypy (Dagnall et al., 2015; Darwin, Neave, & Holmes, 2011; Grzesiak-Feldman & Ejsmont, 2008; Hart & Graether, 2018).

This research approach has several limitations. First, it is grounded in the premise that BCT represents a kind of psychopathology (i.e. the “deficit model”), if not evidence of delusion or formal mental illness per se. However, surveys have found that the majority of respondents believe in at least one conspiracy theory (Goertzel, 1994; Oliver & Wood, 2014a, 2014b), suggesting that BCT, or what has been called generalized “conspiracist ideation” (Swami et al., 2011), is essentially a normal phenomenon. Indeed, many of the cognitive biases and other psychological quirks that have been found to be associated with BCT are universal, continuously-distributed traits varying in quantity as opposed to all-or-none variables or distinct symptoms of mental illness. They are present in those who do not believe in conspiracy theories and some of them, like need for uniqueness or closure, may be valued or adaptive in certain culturally-mediated settings (Cai et al., 2018; Kossowska, Dragon, & Bukowski, 2015). Others, like schizotypy, might be more tautological or redundant with BCT rather than an independent or interactive cause (Hart & Graether, 2018).

Another limitation of research on conspiracy theories to date is that associations between BCT and individual psychological variables have been inconsistent across studies. For example, some studies have not found any association between BCT and cognitive closure (Imhoff & Bruder, 2014; Leman & Cinnirella, 2013) or need for certainty (Moulding et al., 2016). Other studies have reported associations between BCT and specific Big Five personality traits (Swami, Chamorro-Premuzic, & Furnham, 2010), whereas a recent meta-analysis found no such association when effect sizes were aggregated (Goreis & Voracek, 2019). Lower levels of education have been found to be associated with BCT in some studies (Douglas et al., 2016; Federico, Williams, & Vitriol, 2018; Green & Douglas, 2018; Mancosu, Vassallo, & Vezzoni, 2017; Oliver & Wood, 2014a; Richey, 2017; Stempel, Hargrove, & Stempel, 2007), but not others (Bogart et al., 2010; Goertzel, 1994; Simmons & Parsons, 2005), suggesting more complex associations with other variables (van Prooijen, 2017). Overall, such inconsistencies likely stem from the fact that researchers have utilized different surveys and scales to measure BCT while exploring associations with a limited number of individual psychological variables within any single study (Brotherton, French, & Pickering, 2013; Goreis & Voracek, 2019; Hart & Graether, 2018).
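To make the aggregation point concrete, the following minimal sketch shows how small, directionally inconsistent study-level correlations can pool to a near-zero aggregate once effect sizes are combined. The correlations and sample sizes are invented for illustration and are not taken from Goreis and Voracek (2019) or any other cited study.

```python
# Illustrative fixed-effect pooling of study-level correlations via
# Fisher's z-transform; the r and n values below are invented for
# demonstration and are not drawn from any cited meta-analysis.
import math

studies = [  # (correlation r between a trait and BCT, sample size n)
    (0.18, 120),   # small positive association
    (-0.10, 200),  # small negative association
    (0.05, 350),   # near-zero association
    (-0.02, 500),
]

# Fisher z-transform each r; weight by n - 3 (the inverse variance of z)
num = sum((n - 3) * math.atanh(r) for r, n in studies)
den = sum(n - 3 for _, n in studies)
pooled_r = math.tanh(num / den)

print(f"pooled r = {pooled_r:.3f}")  # near zero despite nonzero study-level rs
```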

The heterogeneity of cognitive peculiarities associated with BCT might also be explained by the fact that, with some notable exceptions, research on conspiracy theories often examines BCT in general rather than within specific or individual conspiracy theories. This broad approach has presumably been rationalized based on findings that belief in one conspiracy theory predicts belief in others (Goertzel, 1994; Lewandowsky, Gignac, & Oberauer, 2013), even when they are contradictory (Wood, Douglas, & Sutton, 2012) or contrived by researchers (Swami et al., 2011). However, the idea that BCT reflects a “monological belief system” or worldview whereby conspiracy theories are justified by other conspiracy theories (Goertzel, 1994) may be short-sighted (Douglas, Uscinski, et al., 2019; Franks et al., 2017; Klein, Clutton, & Polito, 2018) and in any case steers research away from the possibility that some psychological traits might be more likely to account for beliefs in some specific conspiracies, but not others. For example, need for closure and certainty might help to explain conspiracy theories emerging in the wake of “crisis situations” such as the assassination of JFK or the death of Princess Diana (van Prooijen & Douglas, 2017), but are less obviously relevant to beliefs in a flat Earth, “chemtrails,” or “Pizzagate.” These are testable hypotheses that have for the most part not been explored in research to date, suggesting the need to examine individual conspiracy theory beliefs more closely in order to avoid unsubstantiated generalizations (Enders & Smallpage, 2018; Uscinski, Klofstad, & Atkinson, 2016).

The diversity of conspiracy theories deserves greater attention in research (Franks, Bangerter, & Bauer, 2013; Franks et al., 2017; Klein, Clutton, & Polito, 2018). For example, the heterogeneity of BCT may be best understood by examining not only different conspiracy theory themes, but also other cognitive dimensions. Research on delusional thinking suggests that different delusions have distinct pathophysiologies with some common underlying features (Corlett et al., 2010) and that individuals with thematically similar delusions vary quantitatively based on different cognitive dimensions such as conviction, preoccupation, distress, or functional impact (Peters, Joseph, & Garety, 1999). This is likely true of normal beliefs and misbeliefs as well, including those related to religion, politics, and conspiracy theories (Barnby et al., 2019; Franks, Bangerter, & Bauer, 2013; Pierre, 2001). Indeed, factor analytic studies have provided evidence for distinct “facets” of conspiracy theories, suggesting that they are more multidimensional than is generally appreciated and that mapping out these dimensions could help to disentangle the inconsistent findings of existing research (Brotherton, French, & Pickering, 2013; Castanho Silva, Vegetti, & Littvay, 2017). For example, quantifying BCT along the dimension of conviction could be especially useful in distinguishing those who question authority and search for answers through online “research” (e.g. “fence sitters” with vaccine hesitancy) and those with full-blown conspiracy theory belief (e.g. “flat-Earthers” convinced of an international cover-up). Such quantitative dimensionalization may help to elucidate differences between general conspiracist ideation, more specific conspiracy theory endorsement, and points in between (Federico, Williams, & Vitriol, 2018; Franks et al., 2017).

Raab, Ortlieb, et al. (2013, p. 2) have noted that existing conspiracy theory questionnaires may draw “an artificial red line between believers and nonbelievers” of conspiracy theories, whereas BCT actually represents a narrative with “thirty shades of truth.” In keeping with this view, Stojanov and Halberstadt (2019) have recently developed a scale that captures both irrational “conspiracy theory ideation” (as measured by statements like “some things that everyone accepts as true are in fact hoaxes created by people in power”) and rational “skepticism” (e.g. “some things are not as they seem”) as separable components of BCT. In their validation study, skepticism was associated with belief in plausible accounts of generic corruption, while only conspiracist ideation predicted belief in specific implausible conspiracy theories. These findings support a model in which BCT exists on a continuum and suggest that understanding the dimensional factors that determine where a belief lies on that continuum is an essential direction for future research.
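To illustrate how such a two-component instrument might separate these constructs in practice, the following sketch scores the two components as simple subscale means. The 1-5 Likert format, the item counts, and all item wordings other than the two quoted above are assumptions for illustration rather than features of the published scale.

```python
# Minimal sketch: scoring the two separable components of a scale like
# Stojanov and Halberstadt's (2019) as subscale means. The 1-5 Likert
# format and item counts are assumptions, not the published instrument.
from statistics import mean

# Ratings on conspiracist-ideation items, e.g. "some things that everyone
# accepts as true are in fact hoaxes created by people in power"
ideation_ratings = [2, 1, 2, 2]

# Ratings on skepticism items, e.g. "some things are not as they seem"
skepticism_ratings = [5, 4, 4, 5]

scores = {
    "conspiracist_ideation": mean(ideation_ratings),
    "skepticism": mean(skepticism_ratings),
}
print(scores)
# High skepticism with low ideation: a rational skeptic rather than a
# believer in implausible conspiracy theories, per the validation study.
```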

A Two-Component, Socio-Epistemic Model of Belief in Conspiracy Theories

A “two-factor model” of delusional thinking has been proposed to explain 1) why an implausible idea comes to mind in the first place and 2) why the belief is subsequently adopted rather than rejected (Davies et al., 2001; Langdon, 2011). Although the two-factor model remains unvalidated and is not without its critics (Corlett, 2019), it provides an analogous framework to explain other types of misbelief (Connors & Halligan, 2015; McKay & Dennett, 2009) including conspiracy theories. It has been noted that BCT scales used in research are designed to measure both “positive” beliefs that “malevolent groups are conspiring” and “negative” beliefs that “official accounts are false” (Douglas, Uscinski, et al., 2019, p. 6). Accordingly, a two-component, socio-epistemic model of conspiracy theories is proposed that includes distinct elements of belief negation (why one goes looking for conspiracy theories in the first place) and affirmation (why one comes to believe a particular conspiracy theory).
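Although the model is proposed here conceptually rather than quantitatively, its logic can be conveyed with a purely illustrative toy formalization in which the two components interact, so that neither mistrust without a counter-narrative nor a counter-narrative without mistrust is sufficient for belief. The functional form and all parameter values below are arbitrary assumptions, not part of the model as proposed.

```python
# Toy formalization of the two-component model (an illustration added here,
# not a quantitative model proposed in the article): belief in a specific
# conspiracy theory is driven by the *interaction* of the two components.
# The weight and bias values are arbitrary.
import math

def p_belief(mistrust, misinfo_receptivity, weight=9.0, bias=-4.0):
    """Both inputs in 0-1; returns a probability-like score in (0, 1).

    mistrust: epistemic mistrust of the relevant authority (component 1)
    misinfo_receptivity: biased processing of, and exposure to, the
        counter-narrative (component 2)
    """
    # Multiplicative interaction: both components must be present.
    return 1 / (1 + math.exp(-(bias + weight * mistrust * misinfo_receptivity)))

print(round(p_belief(0.9, 0.1), 2))  # mistrust alone: low (~0.04)
print(round(p_belief(0.1, 0.9), 2))  # exposure alone: low (~0.04)
print(round(p_belief(0.9, 0.9), 2))  # both together: high (~0.96)
```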

Component 1: Epistemic Mistrust

As noted above, previous research has suggested that generalized conspiracist ideation is a unifying force underlying all BCT (Swami et al., 2011). Conspiracist ideation has been characterized as the attribution of events to hidden, intentional forces; “a natural attraction” to “Manichean narratives” related to universal struggles between good and evil (Oliver & Wood, 2014a, p. 954); or as belief in the “essential malevolence” and deceptiveness of “officialdom” (Wood, Douglas, & Sutton, 2012, p. 772). Indeed, some researchers have developed a single-item BCT questionnaire that describes the possibility that social and political events “have been planned and secretly prepared by a covert alliance of powerful individuals or organizations” and quantifies agreement with the single statement, “I think that the official version of the events given by the authorities very often hides the truth” (Lantian et al., 2016, p. 10).

As an explanation for BCT, general conspiracist ideation has two notable shortcomings. First, it is limited by tautology (i.e. it suggests that “conspiracy theorists” prefer conspiracy theories because they find conspiracy theories appealing). Second, it implies that BCT is understandable as an individual psychological phenomenon divorced from any socio-cultural context (Klein, Clutton, & Dunn, 2019; Stempel, Hargrove, & Stempel, 2007). The two-component, socio-epistemic model of BCT proposed here instead argues that conspiracist ideation is better conceptualized as a result of epistemic mistrust. As a universal first component of BCT, epistemic mistrust refers to mistrust of knowledge or, framed within its proper socio-cultural context, mistrust of authoritative informational accounts. Epistemic mistrust is often rooted in interpersonal mistrust and can vary on a continuum from generic skepticism or epistemic vigilance on one end (Sperber et al., 2010), to a middle-ground of suspiciousness including the kind of subclinical paranoid ideation seen in schizotypy (Darwin, Neave, & Holmes, 2011; Grzesiak-Feldman & Ejsmont, 2008; Imhoff & Lamberty, 2018; Wood, 2017), to a more specific and potentially pathological focus on allegedly interconnected institutions of authority or epistemic nihilism at the extreme. According to this view, BCT does not represent a primary attraction to conspiracist narratives so much as a rejection of authoritative accounts, accepted explanations, and conventional wisdom.

The idea that mistrust lies at the root of BCT is not new (Hofstadter, 1964; Basham, 2001) and is supported by the few studies that have included it, under varying definitions, as a correlational variable (Abalakina-Paap et al., 1999; Goertzel, 1994; Golec de Zavala & Federico, 2018; Green & Douglas, 2018; Parsons et al., 1999). Wood and Douglas (2013, p. 7) likewise found that for 9/11 conspiracy theory adherents, “the search for truth consists mostly of finding ways in which the official story cannot be true… [with] much less of a focus on defending coherent explanations that can better account for the available evidence.” Despite these associations between mistrust and BCT, however, the significance of epistemic mistrust may be underappreciated due to its limited use as an inconsistently defined variable as well as methodological assumptions that it represents a generalized individual trait like paranoia rather than something specific to its social context. For example, generalized interpersonal mistrust is significantly associated with some types of conspiracy beliefs, but not others (Oliver & Wood, 2014a). Mistrust in government has been found to be associated with general conspiracist ideation and political conspiracy theories (Imhoff & Lamberty, 2018; Richey, 2017), but not conspiracy theories related to other themes (van Prooijen & Acker, 2015). Other studies have found BCT to be specifically associated with mistrust of authority or “high power” groups (Imhoff & Bruder, 2014; Imhoff, Lamberty, & Klein, 2018; Swami, Chamorro-Premuzic, & Furnham, 2010; Swami et al., 2011). These inconsistencies suggest that while general mistrust is associated with BCT, endorsement of specific conspiracy theories may depend on a continuum of mistrust that can be narrowly focused on individual institutions of epistemic authority.

An “ascending typology” has been proposed that charts different phases of BCT progression from stages of skeptically questioning official narratives to full-blown embrace of collective conspiracy theories and the belief that all reality is an illusion (Franks et al., 2017). Mistrust runs through these narrative accounts as a central component, supporting the view that BCT represents a continuum. According to this perspective, general conspiracist ideation that conforms with a “monological belief system” may represent an extreme that can account for the endorsement of logically contradictory conspiracy theories (Wood, Douglas, & Sutton, 2012). In contrast, epistemic mistrust can be much more focused on specific informational sources or institutions of epistemic authority including individual governments or political parties, scientific organizations, or corporations. This potentially narrow specificity of mistrust helps to account for the diversity of conspiracy theory themes and the fact that while belief in one conspiracy might predict belief in another, it does not predict belief in all others (Goertzel, 1994; Klein, Clutton, & Polito, 2018; Mancosu, Vassallo, & Vezzoni, 2017; Oliver & Wood, 2014a, 2014b; Stojanov & Halberstadt, 2019).

Indeed, contrary to the monological belief system model, BCT tends not to include ideologically inconsistent conspiracy theories that cross political party lines (Miller, Saunders, & Farhart, 2016; Oliver & Wood, 2014a). Smallpage and colleagues (2017, p. 4) reported evidence from online survey data to support the conclusion that “although conspiracy theories are often attributed to cognitive hiccups, psychological traits, or psychopathologies, they actually follow the contours of more familiar partisan battles in the age of polarization.” Studies have revealed that while “general” conspiracy theories may be equally endorsed among liberals and conservatives, the endorsement of politically-relevant “ideological” conspiracies tends to be aligned with self-reported political orientation (Enders & Smallpage, 2018; Enders, Smallpage, & Lupton, 2018; Miller, Saunders, & Farhart, 2016; Oliver & Wood, 2014a; Smallpage, Enders, & Uscinski, 2017; Stempel, Hargrove, & Stempel, 2007). Others have found evidence of associations between BCT and populism (Castanho Silva, Vegetti, & Littvay, 2017) or with extremism on both sides of the political fence (Krouwel et al., 2017; van Prooijen, Krouwel, & Polet, 2015) rather than any unidirectional left/right bias. Collectively, these findings support the argument that political conspiracy theories are best understood as a form of ideologically motivated reasoning, with trust demonstrated to be an important mitigating factor that can moderate motivated reasoning within BCT (Miller, Saunders, & Farhart, 2016; Saunders, 2017).

Cultural Mistrust, Trust Violations, and Conspiracy Theories

If BCT is rooted in mistrust, the inverse corollary is that trust underlies much of our belief in conventional wisdom, voices of epistemic authority, or scientific consensus. Although some epistemologists have claimed that trust and knowledge are antithetical, with trust implying ignorance, it has been argued that on the contrary, “trust in the testimony of others is necessary to ground much of our knowledge, and that this trust involves trust in the character of testifiers” (Hardwig, 1991, p. 702). It has been further noted that “trust relationships require an active choice on behalf of the trusting party” (Larson et al., 2018, p. 1599) involving “leaps of faith” (Brownlie & Howson, 2005). With particular relevance to medical conspiracy theory beliefs, Whyte and Crease (2010, p. 412) proposed that “trust means deferring with comfort and confidence to others, about something beyond our knowledge or power, in ways that can potentially hurt us.”

Although BCT is often modeled as pathological, with mistrust all but equated with paranoia, an important, if sometimes neglected, counterexample is well-illustrated by numerous studies showing that BCT is common within the African American population (Goertzel, 1994; Stempel, Hargrove, & Stempel, 2007). Among the most common conspiracy theories in this population are those related to themes of “benign neglect,” which are predicted by generalized mistrust and by having been a victim of police harassment, without correlation to political alignment (Parsons et al., 1999; Simmons & Parsons, 2005). The intersection of epistemic mistrust and BCT within the African American community has been explored more specifically within Human Immunodeficiency Virus (HIV)/Acquired Immunodeficiency Syndrome (AIDS)-related conspiracy theories such as beliefs that HIV is a man-made virus created and spread by the Central Intelligence Agency or that treatments are either withheld from the African American community or cause rather than treat AIDS (Ball, Lawson, & Alim, 2013). A survey of 500 US African Americans found that over half believed that AIDS information is withheld from the public and that a cure for AIDS exists but is being withheld from the poor, with nearly half believing that HIV is a man-made virus and that those taking new antiviral medications are “human guinea pigs for the government” (Bogart & Thorburn, 2005). HIV-related conspiracy beliefs have been found to be more prevalent among African Americans compared to Whites (Clark et al., 2008; Westergaard et al., 2014) and belief in at least one HIV-related conspiracy theory has been reported in over 60% of HIV-positive African Americans (Bogart et al., 2010; Bogart et al., 2016). Such conspiracy theories have important implications for healthcare based on associations between their belief and lower rates of condom use (Bogart & Thorburn, 2005), preexposure HIV prophylaxis (Brooks et al., 2018), and antiviral treatment adherence for those who are HIV positive (Bogart et al., 2010; Bogart et al., 2016).

Medical mistrust and HIV-related conspiracy beliefs within the African American community have been attributed to misinformation transmitted by word of mouth through social networks (Bogart et al., 2016; Parsons et al., 1999) as “narratives and contemporary legends” rooted in historical experiences and healthcare disparities that have engendered mistrust (Heller, 2015). Indeed, independent of BCT, it has been found that African American mistrust of white people is negatively correlated with knowledge about HIV transmission (Klonoff & Landrine, 1997) and that general medical mistrust is associated with lower adherence to HIV medications (Dale et al., 2016). Although mistrust within the African American community has been historically labeled “cultural paranoia” as if a symptom of pathology, the community’s modern mistrust in medicine may stem from lived experiences such as the 1932-1972 Tuskegee Syphilis Study if not others that predate it (Ball, Lawson, & Alim, 2013; Heller, 2015). “Cultural paranoia” has therefore been reframed as “cultural mistrust” (Whaley, 2001), defined as a “racism reaction” characterized by “a tendency to distrust Whites based upon a legacy of direct or vicarious exposure to racism or unfair treatment by Whites” (Thompson et al., 2004, p. 210). Although mistrust involves a socio-cognitive appraisal that is potentially prone to error on a continuum with paranoia, both generalized medical mistrust as well as more specific HIV-related conspiracy theories within the African American community highlight that it can also arise from lived experience including violations of trust and chronic social devaluation (Davis, Wetherell, & Henry, 2018). In other words, epistemic mistrust is often associated with interpersonal and cultural mistrust that need not be pathological and on the contrary is, like trust, often earned.

Societal Threat, Intergroup Conspiracies, and Racism

In addition to institutional neglect or abuse, known instances of corruption and occasions when conspiracy theories turn out to be true provide epistemic justification for mistrust and BCT. In recognition of the sociopolitical forces driving conspiracy theories, some authors have normalized them as “general political attitudes” (Imhoff & Bruder, 2014) or merely “another type of political discourse” (Oliver & Wood, 2014a). Indeed, since conspiracy theories are essentially a form of “counter-discourse” that opposes official explanations (Sapountzis & Condor, 2013) and “speaks (un)truth to power” (Imhoff & Bruder, 2014), it could be argued that all conspiracy theories arise out of power inequities such that they are often related to a political theme.

Some authors have suggested that more specific mistrust of those with political power is a defining feature of conspiracy theories (Imhoff & Bruder, 2014; Imhoff & Lamberty, 2018), with studies reporting associations between BCT and political cynicism, feelings of powerlessness, and anomie (Abalakina-Paap et al., 1999; Goertzel, 1994; Swami, Chamorro-Premuzic, & Furnham, 2010; Swami et al., 2011). However, conspiracy theories are hardly the exclusive domain of those lacking in political power, as evidenced by their embrace by some populist dictators and political parties whether in power or not (Bergmann, 2018; Castanho Silva, Vegetti, & Littvay, 2017; YouGov, 2018). Additionally, it is often minority groups lacking in power that find themselves the “targets” rather than the “perpetrators” of intergroup conspiracy theories.

It has been observed that conspiracy theories frequently emerge in the setting of political upheaval and societal crises such that they are inevitable in the wake of pivotal traumas such as the death of JFK or 9/11 (van Prooijen & Douglas, 2017). Such an association suggests that cognitive and emotional needs for closure, control, or security may be especially relevant to BCT at such times (Newheiser, Farias, & Tausch, 2011; van Prooijen & Douglas, 2017). This may be especially true when official explanations for such events are found to be wanting, either because they are still being formulated or are lacking altogether (Marchlewska, Cichocka, & Kossowska, 2018) or when there is a mismatch between the magnitude of the event (e.g. JFK’s assassination) and the available explanation (e.g. a lone gunman). The two-component, socio-epistemic model of BCT further hypothesizes that uncertainty and loss of control should be understood in relation to loss of trust in the institutions that are expected to keep people safe. This dynamic may have particular relevance to intergroup conspiracies that have been linked to both a need for control and a need to blame others (Chayinska & Minescu, 2018; Imhoff & Bruder, 2014; Kofta, Soral, & Bilewicz, 2020).

Swami (2012, p. 7) found that within a Malaysian sample, belief in a conspiracy theory about Jews was not so much related to belief in other conspiracy theories or general conspiracist ideation as it was to “ideological needs” and “specific political expressions within a particular geopolitical context.” Within an intergroup setting, such ideological needs have been identified as needs for control or to blame, along with ingroup social identification and collective narcissism (Cichocka et al., 2016; Federico, Williams, & Vitriol, 2018; Golec de Zavala & Federico, 2018; Mashuri & Zaduqisti, 2014a; Sapountzis & Condor, 2013). However, while ingroup narcissism has been found to predict BCT about outgroups, it appears to confer a kind of immunity against BCT related to one’s own ingroup (Cichocka et al., 2016). Instead, intergroup mistrust may serve as a powerful mediator between loss of control and BCT where blame is redirected from ingroup targets to mistrusted outgroups (Mashuri & Zaduqisti, 2014b).

The relevance of mistrust to intergroup conspiracy theories becomes rather more complicated when examining diverse groups operating within larger societies. For example, contrary to expectations, mistrust of one’s own government as measured by belief in political corruption has not been found to necessarily increase BCT about one’s own ingroup (Chayinska & Minescu, 2018). Instead, it appears that “system identity threats” – fears that an ingroup society is coming undone – can increase BCT about the malevolent intentions of minority groups within one’s own society through a self-protective mechanism (Jolley, Douglas, & Sutton, 2018). Such dynamics have been used to argue that intergroup conspiracy theories play a scapegoating role in the service of maintaining a sense of control, displacing epistemic mistrust in the face of political upheaval to a minority group falsely imbued with omnipotence and nefarious intentions (Federico, Williams, & Vitriol, 2018; Imhoff & Bruder, 2014; Jolley, Douglas, & Sutton, 2018; Kofta, Soral, & Bilewicz, 2020). In this fashion, epistemic mistrust is often redirected at pre-existing targets of interpersonal and cultural mistrust in the service of protecting narcissistic needs. In other words, mistrust based on “conspiracy stereotypes” (Kofta & Sedek, 2005) can determine who becomes a scapegoat within and across societies. This intergroup dynamic is further supported by research demonstrating that at the level of the individual, the attribution of exaggerated influence and power to personal and political enemies serves to compensate for threats to control (Sullivan, Landau, & Rothschild, 2010).

Such false attributions of power and malevolent intent contrast with epistemic mistrust on the part of minority groups that stems from actual trust violations and racial oppression discussed in the previous section. When minority groups are the targets of conspiracy theories, a fine line separates mechanisms designed to maintain feelings of ingroup control or narcissism from frank racism. There can also be a slippery slope where BCT takes the form of more pervasive xenophobia, extending beyond a single racial group to other mistrusted minorities or to immigrants more generally through a “secondary transfer effect” (Jolley, Meleady, & Douglas, 2020; Kofta & Sedek, 2005; Swami, 2012).

Component 2: Misinformation Processing

It has been argued that BCT is fundamentally rooted in ignorance or “denialism” (Kalichman, 2014). For example, Sunstein and Vermeule (2009, p. 211) claimed that people who come to believe in conspiracy theories suffer from “crippled epistemologies” whereby they “know very few things, and what they know is wrong.” The two-component, socio-epistemic model instead contends that BCT is a more active process whereby loss of trust in traditional institutions of authority results in an epistemic vacuum that can send individuals “down the rabbit hole” in search of alternative explanations. What remains to be explained is why, when individuals go looking for answers, some gravitate towards conspiracy theories. As noted previously, general conspiracist ideation by itself offers an explanation too close to tautology. When epistemic mistrust is instead recognized as a crucial first component of BCT, conspiracy theories are better understood as direct counter-narratives stemming from contested epistemic authority (Harambam & Aupers, 2015). In that sense, conspiracy theories are appealing to those with epistemic mistrust because they represent the antithesis of authoritative accounts.

The two-component, socio-epistemic model of BCT further highlights that the term “conspiracy theory” is something of a misnomer. In contrast to delusions, conspiracy theories are typically shared beliefs that lack a self-referential component (Pierre, 2020). Indeed, when individuals who mistrust authoritative accounts search for alternative answers, conspiracy theories do not arise de novo; they are already “out there,” lying in wait. In this sense, BCT often does not involve “theorizing” so much as sifting through information, deciding what to believe and what to disregard. Although some “conspiracy theorists” may be genuinely theorizing, most are crafting a narrative based on the synthesis of available information (Landrum, Olshansky, & Richards, 2019; Raab, Ortlieb, et al., 2013) and might be more appropriately described as “conspiracy theists.”

Modeling conspiracy theories along these lines, it is proposed that the second component of BCT involves vulnerability to misinformation and a biased search for narratives that counter epistemic authority. Although some authors have characterized conspiracy theorists as “naïve scientists” (Kareklas, Muehling, & Weber, 2015) or citizen scientists (Goldenberg, 2016) looking to fill “knowledge gaps” (Brownlie & Howson, 2005), empirical scientific research should not be conflated with searching for answers on the internet where misinformation and “fake news” abound. Although there is disagreement about the extent to which gullibility contributes to BCT (Douglas, Sutton, & Cichocka, 2019), the “gullibility conspiracism hypothesis” is supported by an association between BCT and other implausible beliefs, bullshit receptivity, and cognitive biases such as conjunction fallacies, illusory pattern perception, and hypersensitive agency detection (van Prooijen, 2019). Like BCT, belief in fake news has also been found to be associated with delusion-proneness, bullshit receptivity, and reduced analytic thinking (Bronstein et al., 2019; Pennycook & Rand, 2019, 2020). These findings support the view that mistrust results in an epistemic vacuum that is easily filled by misinformation.

Specific information processing biases may help to explain the diversity and specificity of BCT. For example, recent research has applied the “differential susceptibility to media effects model” (Valkenburg & Peter, 2013) to explore why some individuals are susceptible to “viral deception” and misinformation about science-related conspiracy theories (Landrum, Olshansky, & Richards, 2019). In a series of studies, susceptibility to misinformation about a flat Earth and other scientific conspiracy theories was found to be associated with not only high levels of conspiracist ideation, but also low scientific literacy and in some instances, religiosity (Landrum & Olshansky, 2019; Landrum, Olshansky, & Richards, 2019). In a similar fashion, it is recognized that some conspiracy narratives may be appealing because they follow existing patterns of cultural mistrust as well as racial prejudices and “othering” behaviors within an intergroup context, as discussed previously. According to this perspective, antisemitic conspiracy theories are common because antisemitism is common, with Jewish conspiracy stereotypes an unfortunate part of a “collective heritage of the past” (Kofta & Sedek, 2005) that are utilized as a longstanding “universal explanatory device” (Kofta, Soral, & Bilewicz, 2020).

Exposure to misinformation in the form of conspiracy theories can strengthen BCT through a variety of mechanisms. For example, it has been demonstrated that repeated exposure to misinformation increases belief conviction based on the “illusory truth effect” (Dechêne et al., 2010; Hasher, Goldstein, & Toppino, 1977; Pennycook, Cannon, & Rand, 2018), even when preexposure knowledge is accurate (Fazio et al., 2015). In addition, Raab, Auer, et al. (2013) reported evidence that exposure to extreme or absurd statements contained in conspiracy theories can shift the window of what is regarded as plausible and reduce the salience of official accounts. Several studies have also reported that exposure to conspiracy theories and even discourse about conspiracy theories independent of endorsement can lead to greater mistrust of official views (Einstein & Glick, 2015; Imhoff, Lamberty, & Klein, 2018), suggesting a reciprocal relationship between mistrust and BCT.
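The logic of the illusory truth effect can be conveyed with a minimal simulation in which each repetition of a claim nudges its judged plausibility upward with diminishing returns. The update rule and parameter values are invented for illustration and are not drawn from the cited experiments.

```python
# Toy simulation of the illusory truth effect: repeated exposure to the
# same claim nudges a fluency-based plausibility rating upward. The update
# rule and parameters are invented and do not come from the cited studies.
def plausibility(initial, exposures, gain=0.15):
    """Each repetition closes a fixed fraction of the gap toward full belief."""
    p = initial
    for _ in range(exposures):
        p += gain * (1 - p)  # diminishing returns as p approaches 1
    return p

for n in [0, 1, 3, 10]:
    print(n, round(plausibility(0.2, n), 2))
# 0 0.2 / 1 0.32 / 3 0.51 / 10 0.84 -- conviction grows with repetition alone
```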

Collectively, these findings support the premise that a variety of factors related to individual and social psychology can account for specific conspiracy theory beliefs by impacting the appeal of and vulnerability to specific types of misinformation. Within this socio-epistemic framework, it is therefore necessary to integrate the psychological “problem of misinformation in the head” with the science of misinformation processing as it relates to the “problem of misinformation in the world, where inconsistent information exists across individuals, cultures, and societies” (Seifert, 2017, p. 399).

Conspiracy Theories as Online Misinformation

Although conspiracy theories were common long before the internet, with limited evidence that BCT has since become more prevalent (Douglas, Uscinski, et al., 2019), there is little doubt that the internet represents a “petri dish” where conspiracy theories can flourish. Additionally, whereas it was relatively easy to distinguish between reliable information and misinformation in the era of print journalism when tabloids were relegated to the grocery store check-out line, reliable and fact-checked information now co-exists online alongside biased editorial opinions and tabloid journalism, all provided free of charge. The hope of this online democratization of knowledge was that the truth would “rise to the top,” but that has not occurred, and many consumers lack the skills or knowledge to determine the reliability of informational sources (Benegal, 2018; Kareklas, Muehling, & Weber, 2015; Lewandowsky, Ecker, & Cook, 2017; Raab, Ortlieb, et al., 2013).

A 2007 study found that belief in 9/11 conspiracy theories was significantly associated with being a consumer of grocery store tabloids and online blogs as opposed to more “legitimate media sources” (Stempel, Hargrove, & Stempel, 2007). Recent research has clarified that when individuals who are looking for answers online gravitate towards conspiracy theories, it is often an active process of self-selection rather than a passive effect (Klein, Clutton, & Dunn, 2019). Consistent with the two-component, socio-epistemic model of BCT, this active search for conspiracy narratives is steered by individual biases related to epistemic mistrust along with other factors such as need for certainty and closure, intergroup prejudices, and lack of analytic thinking. In the presence of such preexisting intuitions and beliefs, confirmation bias appears to figure heavily into the search for “evidence” to support conspiracy theories (van Prooijen & Douglas, 2018).

At the same time however, it has been demonstrated that BCT is more likely when conspiracy theories are “made salient” through exposure, especially among those with a high need for cognitive closure (Marchlewska, Cichocka, & Kossowska, 2018). It has likewise been shown that online misinformation and false or “fake” news tend to travel “farther, faster, deeper, and more broadly” than truth (Vosoughi, Roy, & Aral, 2018, p. 1147). Conspiracy theories in particular seem to offer novel and provocative online content that elicits higher than average engagement on platforms like YouTube where related content is viewed in turn due to automated recommendations (Faddoul, Chaslot, & Farid, 2020). These findings validate the view that conspiracy theories can be understood as self-perpetuating internet “memes” (Pierre, 2019; Varis, 2019). It therefore appears that BCT is not only related to searches for information biased by individual consumer factors, but that those searches are further biased by the availability and salience of conspiracy theories within the architecture of the internet. A modern understanding of how misinformation processing contributes to BCT therefore requires an evolving appreciation of digital forces such as “echo chambers” and “filter bubbles” (Bakshy, Messing, & Adamic, 2015; Del Vicario et al., 2016; Flaxman et al., 2016) that can result in a kind of “confirmation bias on steroids” (Pierre, 2019).
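The compounding character of such engagement-driven recommendation can be conveyed with a toy model; this sketch is not the algorithm studied by Faddoul, Chaslot, and Farid (2020), and the assumed 5% engagement edge for conspiracy content is an arbitrary parameter.

```python
# Toy model of an engagement-driven recommendation loop (an illustration,
# not any platform's actual algorithm): content that draws slightly more
# engagement gets recommended more, which earns it still more engagement.
engagement = {"mainstream": 1.00, "conspiracy": 1.05}  # assumed per-view appeal
share = {"mainstream": 0.5, "conspiracy": 0.5}         # initial feed share

for step in range(20):
    # Views are proportional to current share times engagement; the platform
    # then re-allocates recommendations toward whatever was engaged with.
    views = {k: share[k] * engagement[k] for k in share}
    total = sum(views.values())
    share = {k: v / total for k, v in views.items()}

print({k: round(v, 2) for k, v in share.items()})
# A 5% engagement edge compounds into ~73% of the recommended feed.
```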

Misinformation Processing in a Post-Truth World

According to the “source credibility model,” the trustworthiness of information depends on the perceived expertise and trustworthiness of its communicator (Kareklas, Muehling, & Weber, 2015). But since trust serves as a “heuristic for competence” (Benegal, 2018), evaluations of expertise and trustworthiness are often one and the same such that without trust, there are no acknowledged experts. Nichols (2014; Nichols & Smith, 2017) has characterized the modern democratization of knowledge that has been facilitated by the internet, whereby uninformed opinions are treated with false equivalence to voices of epistemic authority, as rooted in mistrust that has resulted in the “death of expertise.”
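On one illustrative reading of this point, trust acts as a gate on expertise, so that perceived credibility collapses when trust does, however expert the source. The multiplicative form below is an assumption made for illustration, not a formula taken from the cited source credibility model.

```python
# Illustrative only: a multiplicative reading of "trust as a heuristic for
# competence," in which trust gates how much acknowledged expertise counts.
# The functional form is an assumption, not the cited model's formula.
def perceived_credibility(expertise, trust):
    """Both inputs in 0-1; perceived credibility is bounded by trust."""
    return expertise * trust

print(perceived_credibility(0.95, 0.90))  # trusted expert: ~0.86
print(perceived_credibility(0.95, 0.05))  # mistrusted expert: ~0.05
```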

Recently, it has been suggested that “conspiracy” and “theory” have become decoupled within a “new conspiracism” in which conspiracy theories are not so much attempts to explain anything (Rosenblum & Muirhead, 2019) as they are deceptions intent on creating a “post-truth world” (Lewandowsky, Ecker, & Cook, 2017). According to this worldview, “facts no longer matter or are not even acknowledged to exist” and have been replaced by an “alternative epistemology that does not conform to conventional standards of evidentiary support” (Lewandowsky, Ecker, & Cook, 2017, pp. 356, 361). When epistemic mistrust becomes so pervasive as to reject all informational sources and abandon the very concept of objective truth, BCT more closely resembles the monological belief system extreme where logically contradictory conspiracy theories can coexist.

Numerous authors have highlighted the role of mass media and the internet in creating a world where it can be a significant challenge to distinguish between reliable information and misinformation (Benegal, 2018; Kareklas, Muehling, & Weber, 2015; Lewandowsky, Ecker, & Cook, 2017; Raab, Ortlieb, et al., 2013), with users who are “prosumers” that simultaneously consume and produce information (Rosselli, Martini, & Bragazzi, 2016). Some conspiracy theories are started or fueled by online “trolls” and “conspiracy theory entrepreneurs” who may have low degrees of actual belief conviction themselves (Bessi et al., 2015; Sunstein & Vermeule, 2009), but are motivated to “erode trust in facts and reality” (Lewandowsky, Ecker, & Cook, 2017, p. 361), whether to sow chaos for its own sake or for profit and political propagandizing. As noted previously, the propagation of conspiracy theories has become a tool of populist movements with vested interests in discrediting “elites” (Bergmann, 2018; Castanho Silva, Vegetti, & Littvay, 2017; Rosenblum & Muirhead, 2019) while foreign governments have weaponized online mechanisms to promote conspiracy theories as a form of cyberterrorism intended to foment discord within democracies (Broniatowski et al., 2018).

Integration and Implications

As with other types of delusion-like beliefs, a broader understanding of conspiracy theories may be achieved through the integration of multiple perspectives straddling the psychological, sociological, information, and political sciences (Pierre, 2019). The two-component, socio-epistemic model of BCT proposed here draws from these perspectives in order to account for the high rate of occurrence of conspiracy theories in the general population, their variance along a continuum of belief conviction, and the inconsistency of research findings attempting to understand them as a kind of individual psychopathology according to the psychological deficit model. In doing so, it provides an overarching framework that integrates both individual psychological and interactive sociocultural factors with implications for mitigation. Figure 1 schematizes a proposed relationship between epistemic mistrust, misinformation processing, and other interactional factors that vary across different conspiracy theories and social circumstances.

Figure 1

The two-component, socio-epistemic model of belief in conspiracy theories.

The utility of the two-component, socio-epistemic model of BCT is well-illustrated by considering the case example of vaccine-related conspiracy theories. The beliefs of so-called “anti-vaxxers” are varied, but at their core involve the conviction that government agencies worldwide promote and mandate vaccination despite awareness of significant vaccine harms such as causing autism. Several authors have proposed that conspiracy theories related to vaccines, ranging from the more general phenomenon of “vaccine hesitancy” to more extreme conspiracy theories, are best understood as expressions of epistemic mistrust (Goldenberg, 2016; Larson, 2018). While vaccine-related conspiracy theories have been traditionally attributed to an uninformed public according to the “knowledge deficit model,” Goldenberg has noted that they first emerged at a time of simmering public mistrust in the UK’s handling of the bovine spongiform encephalopathy outbreak of the late 1980s (Goldenberg, 2016, in press). Against this pre-existing backdrop, a 1998 case series purporting a link between the trivalent measles, mumps, and rubella (MMR) vaccine and autism was published in a well-respected medical journal, was promoted by an academic institution, and took over a decade to be discredited and fully retracted from publication. To date, belief in a link between the MMR vaccine and autism persists not for lack of exposure to corrective information or the public’s poor understanding of science, but because parents refuse to take the leaps of faith in trusting the medical community and scientific institutions necessary to allay their vaccine fears (Goldenberg, 2016, 2019, in press). Larson et al. (2018) have noted that such mistrust exists at multiple layers including in relation to vaccines and their makers, healthcare professionals, and public healthcare systems.

With epistemic mistrust of authoritative sources attributed to perceived trust violations and corruption, attempts to dispel myths about vaccines by providing factual counterinformation have not been found to be a particularly effective strategy under experimental conditions, especially in terms of increasing actual intentions to vaccinate where there may be a backfire effect (Jolley & Douglas, 2014b; Nyhan et al., 2014; Nyhan & Reifler, 2015). Vaccine-related conspiracy theories likewise persist in the real world despite subsequent studies repeatedly demonstrating no evidence of any increased risk of autism with vaccination (Hviid et al., 2019; Jain et al., 2015; Taylor, Swerdfeger, & Eslick, 2014). Remarkably, those who believe in vaccine-related conspiracy theories reject such evidence in favor of placing trust in the fraudulent claims of a lone physician whose original paper was retracted due to falsified data and financial conflicts of interest such that he lost his medical license (Deer, 2011).

BCT related to vaccines has been further bolstered by widely available online misinformation such that it has been claimed that we are currently living in a “golden age of anti-vaccine conspiracies” that are “endemic among anti-vaccination groups” (Stein, 2017, p. 168). A 2010 study found that 71% of the top 10 “hits” from Google searches using the keyword “vaccination” linked to anti-vaccination sites (Kata, 2010). Although measures have since been implemented by social media platforms to deprioritize such misinformation appearing in online searches, conspiracy theories related to vaccines and other topics continue to flourish nonetheless (Elkin, Pullon, & Stubbe, 2020; Faddoul, Chaslot, & Farid, 2020). A wide variety of internet and social media sites promote pseudo-scientific tropes about vaccines, elevate anecdotal accounts of harm over objective research, and breed mistrust in authoritative sources of vaccine-related information (Evrony & Caplan, 2017; Kata, 2012). As of 2019, anti-vaccine groups on Facebook outnumbered pro-vaccine groups by more than 2:1 and were growing much faster through interaction with other groups (Johnson et al., 2020). In addition, vaccine misinformation is often either generated or recirculated by internet “bots” and Russian “trolls” with an apparent goal of amplifying debate and further undermining authoritative accounts (Broniatowski et al., 2018; Walter, Ophir, & Jamieson, 2020). The net result is that conspiracy theories about vaccines are ubiquitous on the internet, especially for those looking to fill the informational vacuum created by epistemic mistrust.

Exposure to both explicit conspiracy theories and implicit conspiracy cues has been linked to BCT as well as anti-vaccination attitudes including reduced intentions to vaccinate (Hornsey, Harris, & Fielding, 2018; Jolley & Douglas, 2014b; Lyons, Merola, & Reifler, 2019) such that we are now witnessing the re-emergence of diseases like measles that were previously eliminated in some parts of the world. Vaccine-related conspiracy theories therefore provide a clear example of how epistemic mistrust and exposure to misinformation work synergistically to create BCT, with important practical consequences that underscore the need for evidence-based mitigation strategies. According to the two-component, socio-epistemic model of BCT, interventions must target both mistrust and misinformation.

With respect to reducing epistemic mistrust, it has been argued that trustworthiness must come before trust (O’Neill, 2020) such that mitigating BCT related to vaccines requires that institutions of authority take steps to regain their “social capital” (Lewandowsky, Cook, & Ecker, 2017; Ozawa, Paina, & Qiu, 2016) as “trust builders” (Brownlie & Howson, 2005) before attempting to dispel misinformation. Some evidence suggests that this could be facilitated by institutions of authority being more transparent in disclosing areas of epistemic uncertainty and ceding power to the public by inviting it “in the door” to participate interactively (Goldenberg, in press; Stilgoe, 2016) and by fostering “political efficacy” (Parsons et al., 1999). Others have suggested that the most effective interventions to reduce vaccine hesitancy may depend on one-on-one engagement, listening, and dialogue between healthcare providers and their patients (Goldstein et al., 2015; Larson, 2013, 2018). Gust et al. (2008) found that parents who resolved their vaccine hesitancy by changing their minds in favor of vaccination most frequently did so due to information or assurances from their healthcare provider. A meta-analysis of strategies to address vaccine hesitancy likewise found the most robust support for dialogue-based interventions including the use of social media (Jarrett et al., 2015). Of relevance to such dialogues, it appears that many with BCT find it objectionable to be categorized as “conspiracy theorists” (Wood & Douglas, 2013), suggesting that socially-stigmatizing labels like “anti-vaxxers” or “flat-Earthers” might have the unintended effect of causing individuals with BCT to dig in their heels more deeply, increasing conviction in order to defend against perceived attacks on one’s identity (Wood, 2016).

In terms of correcting misinformation, education about the dangers of infectious disease was found to be effective in reducing vaccine hesitancy in one experimental study (Horne et al., 2015), but the effect was limited to “fence-sitters” who were neither strongly for nor against vaccines (Betsch, Korn, & Holtmann, 2015). This suggests that the effectiveness of mitigation strategies may vary according to belief conviction, supporting anecdotal observations that “full blown” BCT, like delusional beliefs, might be more resistant to efforts to either educate or to win back trust. To date, the few studies of mitigation strategies against other types of BCT suggest that “inoculation strategies” that offer pre-emptive warnings about exposure to misinformation and conspiracy theories are among the most evidence-based interventions (Banas & Miller, 2013; Jolley & Douglas, 2017). But like vaccines themselves, the effectiveness of inoculation strategies requires that the intervention “beat misinformation to the punch.” In addition, reliable information and inoculation warnings can be thwarted by counter-inoculation strategies (e.g. inoculations against inoculations) such that efforts to change “hearts and minds” run the risk of being reduced to recursive battles over epistemic trust, with one side endlessly arguing that the other is an unreliable source of information.

Conclusion

In summary, the two-component, socio-epistemic model of BCT offers a potentially normalizing account of conspiracist ideation based on a reciprocal relationship between mistrust and belief in misinformation. Unlike paranoia, epistemic mistrust can be more broadly understood as a psychosocial phenomenon with both individual cognitive and interactional sociocultural determinants that can be both highly specific as well as potentially appropriate to a situation. In order to gain a broader understanding of conspiracy theories, future research exploring the underpinnings of BCT would benefit from consistently measuring epistemic mistrust as a key variable (Larson et al., 2018) while devoting greater attention to the social antecedents of fractured trust relationships and potentially racist origins of intergroup conspiracy theories. While epistemic mistrust and needs for closure or certainty may lead to searches for information that are biased in favor of Manichean narratives reflecting general conspiracist ideation, this preference must be understood in the context of online information processing. Conspiracy theory research would therefore benefit from continuing movement out of the psychology lab into the online spaces where BCT arises.

Although the two-component, socio-epistemic model emphasizes a normalizing account of BCT, it acknowledges the harm that can accompany belief in misinformation. Accordingly, it emphasizes that effective mitigation requires attention on the part of institutions of epistemic authority to cultivate trust, taking care not to put the “cart” of trust before the “horse” of trustworthiness. This approach mirrors that of cognitive behavioral therapy which requires a therapeutic alliance in order to modify cognitive distortions and misbeliefs by collaboratively analyzing evidence (Cameron, Rodgers, & Dagnan, 2018). Attempting to modify BCT within individuals should begin with questions that ask, “Who do you trust or mistrust and why?” and “How do you decide what to believe?” However, given the ubiquity of conspiracy theories and the fact that most individuals with BCT are not “help-seeking,” effective mitigation strategies may necessitate wholesale approaches that 1) confer resistance against BCT by utilizing inoculation strategies that counter misinformation where it occurs (e.g. online), 2) teach analytic thinking within educational systems at an early age (Marsh & Yang, 2017; Swami et al., 2014), and 3) restructure or otherwise impose restrictions on the digital architectures that distribute information in order to label or curb misinformation and promote “technocognition” (Lewandowsky, Ecker, & Cook, 2017). These are daunting challenges that warrant considerable resources to support interventional research going forward.

Funding

The author has no funding to report.

Competing Interests

The author has declared that no competing interests exist.

Acknowledgments

The author has no support to report.

References

  • Abalakina-Paap, M., Stephan, W. G., Craig, T., & Gregory, W. L. (1999). Beliefs in conspiracy theories. Political Psychology, 20, 637-647. https://doi.org/10.1111/0162-895X.00160

  • Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348, 1130-1132. https://doi.org/10.1126/science.aaa1160

  • Ball, K., Lawson, W., & Alim, T. (2013). Medical mistrust, conspiracy beliefs and HIV-related behavior among African Americans. Journal of Psychology and Behavioral Science, 1, 1-7. Retrieved from http://jpbsnet.com/vol-1-no-1-december-2013-abstract-1-jpbs

  • Banas, J. A., & Miller, G. (2013). Inducing resistance to conspiracy theory propaganda: Testing inoculation and metainoculation strategies. Human Communication Research, 39, 184-207. https://doi.org/10.1111/hcre.12000

  • Barnby, J. M., Bell, V., Rains, L. S., Mehta, M. A., & Deeley, Q. (2019). Beliefs are multidimensional and vary in stability over time – Psychometric properties of the Beliefs and Values Inventory (BVI). PeerJ, 7, Article e6819. https://doi.org/10.7717/peerj.6819

  • Basham, L. (2001). Living with the conspiracy. The Philosophical Forum, 32, 265-280. https://doi.org/10.1111/0031-806X.00065

  • Benegal, S. (2018). Overconfidence and the discounting of expertise: A commentary. Social Science & Medicine, 213, 95-97. https://doi.org/10.1016/j.socscimed.2018.07.039

  • Bergmann, E. (2018). Conspiracy & populism: The politics of misinformation. Cham, Switzerland: Palgrave Macmillan. https://doi.org/10.1007/978-3-319-90359-0

  • Bessi, A., Coletto, M., Davidescu, G. A., Scala, A., Caldarelli, G., & Quattrociocchi, W. (2015). Science vs conspiracy: Collective narratives in the age of misinformation. PLoS One, 10(2), Article e0118093. https://doi.org/10.1371/journal.pone.0118093

  • Betsch, C., Korn, L., & Holtmann, C. (2015). Don’t try to convert the antivaccinators, instead target the fence-sitters. Proceedings of the National Academy of Sciences of the United States of America, 112, E6725-E6726. https://doi.org/10.1073/pnas.1516350112

  • Bogart, L. M., & Thorburn, S. (2005). Are HIV/AIDS conspiracy beliefs a barrier to HIV prevention among African Americans? Journal of Acquired Immune Deficiency Syndrome, 38, 213-218. https://doi.org/10.1097/00126334-200502010-00014

  • Bogart, L. M., Wagner, G., Galvan, F. H., & Banks, D. (2010). Conspiracy beliefs about HIV are related to antiretroviral treatment nonadherence among African American men with HIV. Journal of Acquired Immune Deficiency Syndromes, 53, 648-655. https://doi.org/10.1097/QAI.0b013e3181c57dbc

  • Bogart, L. M., Wagner, G. J., Green, H. D., Mutchler, M. G., Klein, D. J., McDavitt, B., . . . Hilliard, C. L., (2016). Medical mistrust among social network members may contribute to antiretroviral treatment nonadherence in African Americans living with HIV. Social Science & Medicine, 164, 133-140. https://doi.org/10.1016/j.socscimed.2016.03.028

  • Broniatowski, D. A., Jamison, A. M., Qi, S., AlKulaib, L., Chen, T., Benton, A., . . . Dredze, M., (2018). Weaponized health communication: Twitter bots and Russian trolls amplify the vaccine debate. American Journal of Public Health, 108, 1378-1384. https://doi.org/10.2105/AJPH.2018.304567

  • Bronstein, M. V., Pennycook, G., Bear, A., Rand, D. G., & Cannon, T. D. (2019). Belief in fake news is associated with delusionality, dogmatism, religious fundamentalism, and reduced analytic thinking. Journal of Applied Research in Memory and Cognition, 8, 108-117. https://doi.org/10.1016/j.jarmac.2018.09.005

  • Brooks, R. A., Allen, V. C., Jr., Regan, R., Mutchler, M. G., Cervantes-Tadeo, R., & Lee, S.-J. (2018). HIV/AIDS conspiracy beliefs and intention to adopt pre-exposure prophylaxis among black men who have sex with men in Los Angeles. International Journal of STD & AIDS, 29, 375-381. https://doi.org/10.1177/0956462417727691

  • Brotherton, R., & French, C. C. (2014). Belief in conspiracy theories and susceptibility to the conjunction fallacy. Applied Cognitive Psychology, 28, 238-248. https://doi.org/10.1002/acp.2995

  • Brotherton, R., French, C. C., & Pickering, A. D. (2013). Measuring belief in conspiracy theories: The generic conspiracist beliefs scale. Frontiers in Psychology, 4, Article 279. https://doi.org/10.3389/fpsyg.2013.00279

  • Brownlie, J., & Howson, A. (2005). ‘Leaps of faith’ and MMR: An empirical study of trust. Sociology, 39, 221-239. https://doi.org/10.1177/0038038505050536

  • Cai, H., Zou, X., Feng, Y., Liu, Y., & Jing, Y. (2018). Increasing need for uniqueness in contemporary China: Empirical evidence. Frontiers in Psychology, 9, Article 554. https://doi.org/10.3389/fpsyg.2018.00554

  • Cameron, S. K., Rodgers, J., & Dagnan, D. (2018). The relationship between the therapeutic alliance and clinical outcomes in cognitive behaviour therapy for adults with depression: A meta-analytic review. Clinical Psychology & Psychotherapy, 25, 446-456. https://doi.org/10.1002/cpp.2180

  • Castanho Silva, B., Vegetti, F., & Littvay, L. (2017). The elite is up to something: Exploring the relation between populism and belief in conspiracy theories. Schweizerische Zeitschrift für Politikwissenschaft, 23, 423-443. https://doi.org/10.1111/spsr.12270

  • Chayinska, M., & Minescu, A. (2018). “They’ve conspired against us”: Understanding the role of social identification and conspiracy beliefs in justification of ingroup collective behavior. European Journal of Social Psychology, 48, 990-998. https://doi.org/10.1002/ejsp.2511

  • Cichocka, A., Marchlewska, M., Golec de Zavala, A., & Olechowski, M. (2016). “They will not control us”: Ingroup positivity and belief in intergroup conspiracies. British Journal of Psychology, 107, 556-576. https://doi.org/10.1111/bjop.12158

  • Clark, A., Mayben, J. K., Hartman, C., Kallen, M. A., & Giordano, T. P. (2008). Conspiracy beliefs about HIV infection are common but not associated with delayed diagnosis or adherence to care. AIDS Patient Care and STDs, 22, 753-759. https://doi.org/10.1089/apc.2007.0249

  • Connors, M. H., & Halligan, P. W. (2015). A cognitive account of belief: A tentative road map. Frontiers in Psychology, 5, Article 1588. https://doi.org/10.3389/fpsyg.2014.01588

  • Corlett, P. R. (2019). Factor one, familiarity and frontal cortex: A challenge to the two-factor theory of delusions. Cognitive Neuropsychiatry, 24, 165-177. https://doi.org/10.1080/13546805.2019.1606706

  • Corlett, P. R., Taylor, J. R., Wang, X.-J., Fletcher, P. C., & Krystal, J. H. (2010). Toward a neurobiology of delusions. Progress in Neurobiology, 92, 345-369. https://doi.org/10.1016/j.pneurobio.2010.06.007

  • Dagnall, N., Drinkwater, K., Parker, A., Denovan, A., & Parton, M. (2015). Conspiracy theory and cognitive style: A worldview. Frontiers in Psychology, 6, Article 206. https://doi.org/10.3389/fpsyg.2015.00206

  • Dale, S. K., Bogart, L. M., Wagner, G. J., Galvan, F. H., & Klein, D. J. (2016). Medical mistrust is related to lower longitudinal medication adherence among African-American males with HIV. Journal of Health Psychology, 21, 1311-1321. https://doi.org/10.1177/1359105314551950

  • Darwin, H., Neave, N., & Holmes, J. (2011). Belief in conspiracy theories: The role of paranormal belief, paranoid ideation, and schizotypy. Personality and Individual Differences, 50, 1289-1293. https://doi.org/10.1016/j.paid.2011.02.027

  • Davies, M., Coltheart, M., Langdon, R., & Breen, N. (2001). Monothematic delusions: Towards a two-factor account. Philosophy, Psychiatry, & Psychology, 8, 133-158. https://doi.org/10.1353/ppp.2001.0007

  • Davis, J., Wetherell, G., & Henry, P. J. (2018). Social devaluation of African Americans and race-related conspiracy theories. European Journal of Social Psychology, 48, 999-1010. https://doi.org/10.1002/ejsp.2531

  • Dechêne, A., Stahl, C., Hansen, J., & Wänke, M. (2010). The truth about the truth: A meta-analytic review of the truth effect. Personality and Social Psychology Review, 14, 238-257. https://doi.org/10.1177/1088868309352251

  • Deer, B. (2011). How the vaccine crisis was meant to make money. BMJ, 342, Article c5258. https://doi.org/10.1136/bmj.c5258

  • Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., . . . Quattrociocchi, W., (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences of the United States of America, 113, 554-559. https://doi.org/10.1073/pnas.1517441113

  • Douglas, K. M., Sutton, R. M., Callan, M. J., Dawtry, R. J., & Harvey, A. J. (2016). Someone is pulling the strings: Hypersensitive agency detection and belief in conspiracy theories. Thinking & Reasoning, 22, 57-77. https://doi.org/10.1080/13546783.2015.1051586

  • Douglas, K. M., Sutton, R. M., & Cichocka, A. (2019). Belief in conspiracy theories: Looking beyond gullibility. In J. P. Forgas & R. F. Baumeister (Eds.), The social psychology of gullibility: Fake news, conspiracy theories, and irrational beliefs (pp. 61-76). Abingdon, United Kingdom: Routledge.

  • Douglas, K. M., Uscinski, J. E., Sutton, R. M., Cichocka, A., Nefes, T., Ang, C. S., & Deravi, F. (2019). Understanding conspiracy theories. Political Psychology, 40(Suppl. 1), 3-35. https://doi.org/10.1111/pops.12568

  • Einstein, K. L., & Glick, D. M. (2015). Do I think BLS data are BS? The consequences of conspiracy theories. Political Behavior, 37, 679-701. https://doi.org/10.1007/s11109-014-9287-z

  • Elkin, L. E., Pullon, S. R. H., & Stubbe, M. H. (2020). “Should I vaccinate my child?” Comparing the displayed stances of vaccine information retrieved from Google, Facebook and YouTube. Vaccine, 38, 2771-2778. https://doi.org/10.1016/j.vaccine.2020.02.041

  • Enders, A. M., & Smallpage, S. M. (2018). On the measurement of conspiracy beliefs. Research & Politics, 5(1), 1-4. https://doi.org/10.1177/2053168018763596

  • Enders, A. M., Smallpage, S. M., & Lupton, R. N. (2018). Are all ‘birthers’ conspiracy theorists? On the relationship between conspiratorial thinking and political orientations. British Journal of Political Science, 50(3), 849-866. https://doi.org/10.1017/S0007123417000837

  • Evrony, A., & Caplan, A. (2017). The overlooked dangers of anti-vaccination groups’ social media presence. Human Vaccines & Immunotherapeutics, 13, 1475-1476. https://doi.org/10.1080/21645515.2017.1283467

  • Faddoul, M., Chaslot, G., & Farid, H. (2020). A longitudinal analysis of YouTube’s promotion of conspiracy videos. arXiv:2003.03318

  • Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144, 993-1002. https://doi.org/10.1037/xge0000098

  • Federico, C. M., Williams, A. L., & Vitriol, J. A. (2018). The role of system identity threat in conspiracy theory endorsement. European Journal of Social Psychology, 48, 927-938. https://doi.org/10.1002/ejsp.2495

  • Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter bubbles, echo chambers, and online news consumption. Public Opinion Quarterly, 80, 298-320. https://doi.org/10.1093/poq/nfw006

  • Franks, B., Bangerter, A., & Bauer, M. W. (2013). Conspiracy theories as quasi-religious mentality: An integrated account from cognitive science, social representations theory, and frame theory. Frontiers in Psychology, 4, Article 424. https://doi.org/10.3389/fpsyg.2013.00424

  • Franks, B., Bangerter, A., Bauer, M. W., Hall, M., & Noort, M. C. (2017). Beyond “monologicality”? Exploring conspiracist worldviews. Frontiers in Psychology, 8, Article 861. https://doi.org/10.3389/fpsyg.2017.00861

  • Goertzel, T. (1994). Belief in conspiracy theories. Political Psychology, 15, 731-742. https://doi.org/10.2307/3791630

  • Goldenberg, M. J. (2016). Public misunderstanding of science? Reframing the problem of vaccine hesitancy. Perspectives on Science, 24, 552-581. https://doi.org/10.1162/POSC_a_00223

  • Goldenberg, M. J. (2019). Vaccines, values and science. Canadian Medical Association Journal, 191, E397-E398. https://doi.org/10.1503/cmaj.181635

  • Goldenberg, M. J. (in press). An intractable problem? Public trust and vaccine hesitancy. Pittsburgh, PA, USA: University of Pittsburgh Press.

  • Goldstein, S., MacDonald, N. E., Guirguis, S., & the SAGE Working Group on Vaccine Hesitancy. (2015). Health communication and vaccine hesitancy. Vaccine, 33, 4212-4214. https://doi.org/10.1016/j.vaccine.2015.04.042

  • Golec de Zavala, A., & Federico, C. M. (2018). Collective narcissism and growth of conspiracy thinking over the course of the 2016 United States presidential election: A longitudinal analysis. European Journal of Social Psychology, 48, 1011-1018. https://doi.org/10.1002/ejsp.2496

  • Goreis, A., & Voracek, M. (2019). A systematic review and meta-analysis of psychological research on conspiracy beliefs: Field characteristics, measurement instruments, and associations with personality traits. Frontiers in Psychology, 10, Article 205. https://doi.org/10.3389/fpsyg.2019.00205

  • Green, R., & Douglas, K. M. (2018). Anxious attachment and belief in conspiracy theories. Personality and Individual Differences, 125, 30-37. https://doi.org/10.1016/j.paid.2017.12.023

  • Grzesiak-Feldman, M., & Ejsmont, A. (2008). Paranoia and conspiracy thinking of Jews, Arabs, Germans, and Russians in a Polish sample. Psychological Reports, 102, 884-886. https://doi.org/10.2466/pr0.102.3.884-886

  • Gust, D. A., Darling, N., Kennedy, A., & Schwartz, B. (2008). Parents with doubts about vaccines: Which vaccines and reasons why. Pediatrics, 122, 718-725. https://doi.org/10.1542/peds.2007-0538

  • Harambam, J., & Aupers, S. (2015). Contesting epistemic authority: Conspiracy theories on the boundaries of science. Public Understanding of Science, 24, 466-480. https://doi.org/10.1177/0963662514559891

  • Hardwig, J. (1991). The role of trust in knowledge. Journal of Philosophy, 88, 693-708. https://doi.org/10.2307/2027007

  • Hart, J., & Graether, M. (2018). Something’s going on here: Psychological predictors of belief in conspiracy theories. Journal of Individual Differences, 39, 229-237. https://doi.org/10.1027/1614-0001/a000268

  • Hasher, L., Goldstein, D., & Toppino, T. (1977). Frequency and the conference of referential validity. Journal of Verbal Learning and Verbal Behavior, 16, 107-112. https://doi.org/10.1016/S0022-5371(77)80012-1

  • Heller, J. (2015). Rumors and realities: Making sense of HIV/AIDS conspiracy narratives and contemporary legends. American Journal of Public Health, 105, e43-e50. https://doi.org/10.2105/AJPH.2014.302284

  • Hofstadter, R. (1964, November). The paranoid style in American politics. Harper’s Magazine, 77-86. https://harpers.org/archive/1964/11/the-paranoid-style-in-american-politics

  • Horne, Z., Powell, D., Hummel, J. E., & Holyoak, K. J. (2015). Countering antivaccination attitudes. Proceedings of the National Academy of Sciences of the United States of America, 112, 10321-10324. https://doi.org/10.1073/pnas.1504019112

  • Hornsey, M. J., Harris, E. A., & Fielding, K. S. (2018). The psychological roots of anti-vaccination attitudes: A 24-national investigation. Health Psychology, 37, 307-315. https://doi.org/10.1037/hea0000586

  • Hviid, A., Hansen, J. V., Frisch, M., & Melbye, M. (2019). Measles, mumps, rubella vaccination and autism: A national cohort study. Annals of Internal Medicine, 170, 513-520. https://doi.org/10.7326/M18-2101

  • Imhoff, R., & Bruder, M. (2014). Speaking (un-)truth to power: Conspiracy mentality as a generalised political attitude. European Journal of Personality, 28, 25-43. https://doi.org/10.1002/per.1930

  • Imhoff, R., & Lamberty, P. K. (2017). Too special to be duped: Need for uniqueness motivates conspiracy beliefs. European Journal of Social Psychology, 47, 724-734. https://doi.org/10.1002/ejsp.2265

  • Imhoff, R., & Lamberty, P. K. (2018). How paranoid are conspiracy believers? Toward a more fine-grained understanding of the connect and disconnect between paranoia and belief in conspiracy theories. European Journal of Social Psychology, 48, 909-926. https://doi.org/10.1002/ejsp.2494

  • Imhoff, R., Lamberty, P., & Klein, O. (2018). Using power as a negative cue: How conspiracy mentality affects epistemic trust in sources of historical knowledge. Personality and Social Psychology Bulletin, 44, 1364-1379. https://doi.org/10.1177/0146167218768779

  • Jain, A., Marshall, J., Buikema, A., Bancroft, T., Kelly, J. P., & Newschaffer, C. J. (2015). Autism occurrence by MMR vaccine status among US children with older siblings with and without autism. Journal of the American Medical Association, 313, 1534-1540. https://doi.org/10.1001/jama.2015.3077

  • Jarrett, C., Wilson, R., O’Leary, M., Eckersberger, E., Larson, H. J., & the SAGE Working Group on Vaccine Hesitancy. (2015). Strategies for addressing vaccine hesitancy – A systematic review. Vaccine, 33, 4180-4190. https://doi.org/10.1016/j.vaccine.2015.04.040

  • Johnson, N. F., Velásquez, N., Restrepo, N. J., Leahy, R., Gabriel, N., El Oud, S., . . . Lupu, Y., (2020). The online competition between pro- and anti-vaccination views. Nature, 582, 230-234. https://doi.org/10.1038/s41586-020-2281-1

  • Jolley, D., & Douglas, K. M. (2014a). The social consequences of conspiracism: Exposure to conspiracy theories decreases intentions to engage in politics and to reduce one’s carbon footprint. British Journal of Psychology, 105, 35-56. https://doi.org/10.1111/bjop.12018

  • Jolley, D., & Douglas, K. M. (2014b). The effects of anti-vaccine conspiracy theories on vaccination intentions. PLoS One, 9(2), Article e89177. https://doi.org/10.1371/journal.pone.0089177

  • Jolley, D., & Douglas, K. M. (2017). Prevention is better than cure: Addressing anti-vaccine conspiracy theories. Journal of Applied Social Psychology, 47, 459-469. https://doi.org/10.1111/jasp.12453

  • Jolley, D., Douglas, K. M., Leite, A. C., & Schrader, T. (2019). Belief in conspiracy theories and intentions to engage in everyday crime. British Journal of Social Psychology, 58, 534-549. https://doi.org/10.1111/bjso.12311

  • Jolley, D., Douglas, K. M., & Sutton, R. M. (2018). Blaming a few bad apples to save a threatened barrel: The system-justifying function of conspiracy theories. Political Psychology, 39, 465-478. https://doi.org/10.1111/pops.12404

  • Jolley, D., Meleady, R., & Douglas, K. M. (2020). Exposure to intergroup conspiracy theories promotes prejudice which spreads across groups. British Journal of Psychology, 111, 17-35. https://doi.org/10.1111/bjop.12385

  • Kalichman, S. C. (2014). The psychology of AIDS denialism: Pseudoscience, conspiracy thinking, and medical mistrust. European Psychologist, 19, 13-22. https://doi.org/10.1027/1016-9040/a000175

  • Kareklas, I., Muehling, D. D., & Weber, T. J. (2015). Reexamining health messages in the Digital Age: A fresh look at source credibility effects. Journal of Advertising, 44, 88-104. https://doi.org/10.1080/00913367.2015.1018461

  • Kata, A. (2010). A postmodern Pandora’s box: Anti-vaccination misinformation on the Internet. Vaccine, 28, 1709-1716. https://doi.org/10.1016/j.vaccine.2009.12.022

  • Kata, A. (2012). Anti-vaccine activists, Web 2.0, and the postmodern paradigm – An overview of tactics and tropes used online by the anti-vaccination movement. Vaccine, 30, 3778-3789. https://doi.org/10.1016/j.vaccine.2011.11.112

  • Klein, C., Clutton, P., & Dunn, A. G. (2019). Pathways to conspiracy: The social and linguistic precursors of involvement in Reddit’s conspiracy theory forum. PLoS One, 14(11), Article e0225098. https://doi.org/10.1371/journal.pone.0225098

  • Klein, C., Clutton, P., & Polito, V. (2018). Topic modeling reveals distinct interests within an online conspiracy forum. Frontiers in Psychology, 9, Article 189. https://doi.org/10.3389/fpsyg.2018.00189

  • Klonoff, E. A., & Landrine, H. (1997). Distrust of Whites, acculturation, and AIDS knowledge among African Americans. Journal of Black Psychology, 23, 50-57. https://doi.org/10.1177/00957984970231005

  • Kofta, M., & Sedek, G. (2005). Conspiracy stereotypes of Jews during systemic transformation in Poland. International Journal of Sociology, 35, 40-64. https://doi.org/10.1080/00207659.2005.11043142

  • Kofta, M., Soral, W., & Bilewicz, M. (2020). What breeds conspiracy antisemitism? The role of political uncontrollability and uncertainty in belief in Jewish conspiracy. Journal of Personality and Social Psychology, 118, 900-918. https://doi.org/10.1037/pspa0000183

  • Kossowska, M., Dragon, P., & Bukowski, M. (2015). When need for closure leads to positive attitudes towards a negatively stereotyped outgroup. Motivation and Emotion, 39, 88-98. https://doi.org/10.1007/s11031-014-9414-5

  • Krouwel, A., Kutiyski, Y., van Prooijen, J.-W., Martinsson, J., & Markstedt, E. (2017). Does extreme political ideology predict conspiracy beliefs, economic evaluations and political trust? Evidence from Sweden. Journal of Social and Political Psychology, 5, 435-462. https://doi.org/10.5964/jspp.v5i2.745

  • Landrum, A. R., & Olshansky, A. (2019). The role of conspiracy mentality in denial of science and susceptibility to viral deception about science. Politics and the Life Sciences, 38, 193-209. https://doi.org/10.1017/pls.2019.9

  • Landrum, A. R., Olshansky, A., & Richards, O. (2019). Differential susceptibility to misleading flat earth arguments on YouTube. Media Psychology. Advance online publication. https://doi.org/10.1080/15213269.2019.1669461

  • Langdon, R. (2011). The cognitive neuropsychiatry of delusional belief. Wiley Interdisciplinary Reviews: Cognitive Science, 2, 449-460. https://doi.org/10.1002/wcs.121

  • Lantian, A., Muller, D., Nurra, C., & Douglas, K. M. (2016). Measuring belief in conspiracy theories: Validation of a French and English single-item scale. International Review of Social Psychology, 29, 1-14. https://doi.org/10.5334/irsp.8

  • Lantian, A., Muller, D., Nurra, C., & Douglas, K. M. (2017). “I know things they don’t know!” The role of need for uniqueness in belief in conspiracy theories. Social Psychology, 48, 160-173. https://doi.org/10.1027/1864-9335/a000306

  • Larson, H. J. (2013). Negotiating vaccine acceptance in an era of reluctance. Human Vaccines & Immunotherapeutics, 9, 1779-1781. https://doi.org/10.4161/hv.25932

  • Larson, H. J. (2018). The biggest pandemic risk? Viral misinformation. Nature, 562, Article 309. https://doi.org/10.1038/d41586-018-07034-4

  • Larson, H. J., Clarke, R. M., Jarrett, C., Eckersberger, E., Levine, Z., Schulz, W. S., & Paterson, P. (2018). Measuring trust in vaccination: A systematic review. Human Vaccines & Immunotherapeutics, 14, 1599-1609. https://doi.org/10.1080/21645515.2018.1459252

  • Leman, P. J., & Cinnirella, M. (2013). Beliefs in conspiracy theories and the need for cognitive closure. Frontiers in Psychology, 4, Article 378. https://doi.org/10.3389/fpsyg.2013.00378

  • Lewandowsky, S., Cook, J., & Ecker, U. K. H. (2017). Letting the gorilla emerge from the mist: Getting past post-truth. Journal of Applied Research in Memory and Cognition, 6, 418-424. https://doi.org/10.1016/j.jarmac.2017.11.002

  • Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6, 353-369. https://doi.org/10.1016/j.jarmac.2017.07.008

  • Lewandowsky, S., Gignac, G. E., & Oberauer, K. (2013). The role of conspiracist ideation and worldviews in predicting rejection of science. PLoS One, 8(10), Article e75637. https://doi.org/10.1371/journal.pone.0075637

  • Lyons, B., Merola, V., & Reifler, J. (2019). Not just asking questions: Effects of implicit and explicit conspiracy information about vaccines and genetic modification. Health Communication, 34, 1741-1750. https://doi.org/10.1080/10410236.2018.1530526

  • Mancosu, M., Vassallo, S., & Vezzoni, C. (2017). Believing in conspiracy theories: Evidence from an exploratory analysis of Italian survey data. South European Society and Politics, 22, 327-344. https://doi.org/10.1080/13608746.2017.1359894

  • Marchlewska, M., Cichocka, A., & Kossowska, M. (2018). Addicted to answers: Need for cognitive closure and the endorsement of conspiracy beliefs. European Journal of Social Psychology, 48, 109-117. https://doi.org/10.1002/ejsp.2308

  • Marsh, E. J., & Yang, B. W. (2017). A call to think broadly about information literacy. Journal of Applied Research in Memory and Cognition, 6, 401-404. https://doi.org/10.1016/j.jarmac.2017.09.012

  • Mashuri, A., & Zaduqisti, E. (2014a). The role of social identification, intergroup threat, and out-group derogation in explaining belief in conspiracy theory about terrorism in Indonesia. International Journal of Research Studies in Psychology, 3, 35-50. https://doi.org/10.5861/ijrsp.2013.446

  • Mashuri, A., & Zaduqisti, E. (2014b). We believe in our conspiracy if we distrust you: The role of intergroup distrust in structuring the effect of Islamic identification, competitive victimhood, and group incompatibility on belief in a conspiracy theory. Journal of Tropical Psychology, 4, Article e11. https://doi.org/10.1017/jtp.2014.11

  • McKay, R. T., & Dennett, D. C. (2009). The evolution of misbelief. Behavioral and Brain Sciences, 32, 493-510. https://doi.org/10.1017/S0140525X09990975

  • Miller, J. M., Saunders, K. L., & Farhart, C. E. (2016). Conspiracy endorsement as motivated reasoning: The moderating roles of political knowledge and trust. American Journal of Political Science, 60, 824-844. https://doi.org/10.1111/ajps.12234

  • Moulding, R., Nix-Carnell, S., Schnabel, A., Nedeljkovic, M., Burnside, E. E., Lentini, A. F., & Mehzabin, N. (2016). Better the devil you know than a world you don’t? Intolerance of uncertainty and worldview explanations for belief in conspiracy theories. Personality and Individual Differences, 98, 345-354. https://doi.org/10.1016/j.paid.2016.04.060

  • Newheiser, A.-K., Farias, M., & Tausch, N. (2011). The functional nature of conspiracy beliefs: Examining the underpinnings of belief in the Da Vinci Code conspiracy. Personality and Individual Differences, 51, 1007-1011. https://doi.org/10.1016/j.paid.2011.08.011

  • Nichols, T. (2014, January 17). The death of expertise. The Federalist. Retrieved from https://thefederalist.com/2014/01/17/the-death-of-expertise

  • Nichols, T., & Smith, J. K. A. (2017, Spring). The death of expertise as a decline of trust. Comment. Retrieved from https://www.cardus.ca/comment/article/the-death-of-expertise-as-a-decline-of-trust

  • Nyhan, B., & Reifler, J. (2015). Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information. Vaccine, 33, 459-464. https://doi.org/10.1016/j.vaccine.2014.11.017

  • Nyhan, B., Reifler, J., Richey, S., & Freed, G. L. (2014). Effective messages in vaccine promotion: A randomized trial. Pediatrics, 133, e835-e842. https://doi.org/10.1542/peds.2013-2365

  • Oliver, J. E., & Wood, T. J. (2014a). Conspiracy theories and the paranoid style(s) of mass opinion. American Journal of Political Science, 58, 952-966. https://doi.org/10.1111/ajps.12084

  • Oliver, J. E., & Wood, T. (2014b). Medical conspiracy theories and health behaviors in the United States. JAMA Internal Medicine, 174, 817-818. https://doi.org/10.1001/jamainternmed.2014.190

  • O’Neill, O. (2020). Trust and accountability in a digital age. Philosophy, 95, 3-17. https://doi.org/10.1017/S0031819119000457

  • Ozawa, S., Paina, L., & Qiu, M. (2016). Exploring pathways for building trust in vaccination and strengthening health system resilience. BMC Health Services Research, 16(Suppl. 7), Article 639. https://doi.org/10.1186/s12913-016-1867-7

  • Parsons, S., Simmons, W., Shinhoster, F., & Kilburn, J. (1999). A test of the grapevine: An empirical examination of conspiracy theories among African Americans. Sociological Spectrum, 19, 201-222. https://doi.org/10.1080/027321799280235

  • Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General, 147, 1865-1880. https://doi.org/10.1037/xge0000465

  • Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39-50. https://doi.org/10.1016/j.cognition.2018.06.011

  • Pennycook, G., & Rand, D. G. (2020). Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking. Journal of Personality, 88, 185-200. https://doi.org/10.1111/jopy.12476

  • Peters, E. R., Joseph, S. A., & Garety, P. A. (1999). Measurement of delusional ideation in the normal population: Introducing the PDI (Peters et al. Delusions Inventory). Schizophrenia Bulletin, 25, 553-576. https://doi.org/10.1093/oxfordjournals.schbul.a033401

  • Pierre, J. M. (2001). Faith or delusion: At the crossroads of religion and psychosis. Journal of Psychiatric Practice, 7, 163-172. https://doi.org/10.1097/00131746-200105000-00004

  • Pierre, J. M. (2019). Integrating non-psychiatric models of delusion-like beliefs into forensic psychiatric assessment. The Journal of the American Academy of Psychiatry and the Law, 47, 171-179. https://doi.org/10.29158/JAAPL.003833-19

  • Pierre, J. M. (2020). Forensic psychiatry versus the varieties of delusion-like belief. The Journal of the American Academy of Psychiatry and the Law, 48(3), 1-8. https://doi.org/10.29158/JAAPL.200013-20

  • Raab, M. H., Auer, N., Ortlieb, S. A., & Carbon, C.-C. (2013). The Sarrazin effect: The presence of absurd statements in conspiracy theories makes canonical information less plausible. Frontiers in Psychology, 4, Article 453. https://doi.org/10.3389/fpsyg.2013.00453

  • Raab, M. H., Ortlieb, S. A., Auer, N., Guthmann, K., & Carbon, C.-C. (2013). Thirty shades of truth: Conspiracy theories as stories of individuation, not of pathological delusion. Frontiers in Psychology, 4, Article 406. https://doi.org/10.3389/fpsyg.2013.00406

  • Richey, S. (2017). A birther and a truther: The influence of the authoritarian personality on conspiracy beliefs. Politics & Policy, 45, 465-485. https://doi.org/10.1111/polp.12206

  • Rosenblum, N. L., & Muirhead, R. (2019). A lot of people are saying: The new conspiracism and the assault on democracy. Princeton, NJ, USA: Princeton University Press.

  • Rosselli, R., Martini, M., & Bragazzi, N. L. (2016). The old and the new: Vaccine hesitancy in the era of the Web 2.0. Challenges and opportunities. Journal of Preventive Medicine and Hygiene, 57, E47-E50. https://doi.org/10.15167/2421-4248/jpmh2016.57.1.572

  • Sapountzis, A., & Condor, S. (2013). Conspiracy accounts as intergroup theories: Challenging dominant understandings of social power and political legitimacy. Political Psychology, 34, 731-752. https://doi.org/10.1111/pops.12015

  • Saunders, K. L. (2017). The impact of elite frames and motivated reasoning on beliefs in a global warming conspiracy: The promise and limits of trust. Research & Politics, 4(3), 1-9. https://doi.org/10.1177/2053168017717602

  • Seifert, C. M. (2017). The distributed influence of misinformation. Journal of Applied Research in Memory and Cognition, 6, 397-400. https://doi.org/10.1016/j.jarmac.2017.09.003

  • Simmons, W. P., & Parsons, S. (2005). Beliefs in conspiracy theories among African Americans: A comparison of elites and masses. Social Science Quarterly, 86, 582-598. https://doi.org/10.1111/j.0038-4941.2005.00319.x

  • Smallpage, S. M., Enders, A. M., & Uscinski, J. E. (2017). The partisan contours of conspiracy theory beliefs. Research & Politics, 4(4), 1-7. https://doi.org/10.1177/2053168017746554

  • Sperber, D., Clément, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., & Wilson, D. (2010). Epistemic vigilance. Mind & Language, 25, 359-393. https://doi.org/10.1111/j.1468-0017.2010.01394.x

  • Ståhl, T., & van Prooijen, J.-W. (2018). Epistemic rationality: Skepticism toward unfounded beliefs requires sufficient cognitive ability and motivation to be rational. Personality and Individual Differences, 122, 155-163. https://doi.org/10.1016/j.paid.2017.10.026

  • Stein, R. A. (2017). The golden age of anti-vaccine conspiracies. Germs, 7, 168-170. https://doi.org/10.18683/germs.2017.1122

  • Stempel, C., Hargrove, T., & Stempel, G. H., III. (2007). Media use, social structure, and belief in 9/11 conspiracy theories. Journalism & Mass Communication Quarterly, 84, 353-372. https://doi.org/10.1177/107769900708400210

  • Stilgoe, J. (2016). Scientific advice on the move: The UK mobile phone risk issue as a public experiment. Palgrave Communications, 2, Article 16028. https://doi.org/10.1057/palcomms.2016.28

  • Stojanov, A., & Halberstadt, J. (2019). The conspiracy mentality scale: Distinguishing between irrational and rational suspicion. Social Psychology, 50(4), 215-232. https://doi.org/10.1027/1864-9335/a000381

  • Sullivan, D., Landau, M. J., & Rothschild, Z. K. (2010). An existential function of enemyship: Evidence that people attribute influence to personal and political enemies to compensate for threats to control. Journal of Personality and Social Psychology, 98(3), 434-449. https://doi.org/10.1037/a0017457

  • Sunstein, C. R., & Vermeule, A. (2009). Conspiracy theories: Causes and cures. Journal of Political Philosophy, 17, 202-227. https://doi.org/10.1111/j.1467-9760.2008.00325.x

  • Swami, V. (2012). Social psychological origins of conspiracy theories: The case of the Jewish conspiracy theory in Malaysia. Frontiers in Psychology, 3, Article 280. https://doi.org/10.3389/fpsyg.2012.00280

  • Swami, V., Chamorro-Premuzic, T., & Furnham, A. (2010). Unanswered questions: A preliminary investigation of personality and individual difference predictors of 9/11 conspiracist beliefs. Applied Cognitive Psychology, 24, 749-761. https://doi.org/10.1002/acp.1583

  • Swami, V., Coles, R., Stieger, S., Pietschnig, J., Furnham, A., Rehim, S., & Voracek, M. (2011). Conspiracist ideation in Britain and Austria: Evidence of a monological belief system and associations between individual psychological differences and real-world and fictitious conspiracy theories. British Journal of Psychology, 102, 443-463. https://doi.org/10.1111/j.2044-8295.2010.02004.x

  • Swami, V., Voracek, M., Stieger, S., Tran, U. S., & Furnham, A. (2014). Analytic thinking reduces belief in conspiracy theories. Cognition, 133, 572-585. https://doi.org/10.1016/j.cognition.2014.08.006

  • Taylor, L. E., Swerdfeger, A. L., & Eslick, G. D. (2014). Vaccines are not associated with autism: An evidence-based meta-analysis of case-control and cohort studies. Vaccine, 32, 3623-3629. https://doi.org/10.1016/j.vaccine.2014.04.085

  • Thompson, H. S., Valdimarsdottir, H. B., Winkel, G., Jandorf, L., & Redd, W. (2004). The group-based Medical Mistrust Scale: Psychometric properties and association with breast cancer screening. Preventive Medicine, 38, 209-218. https://doi.org/10.1016/j.ypmed.2003.09.041

  • Uscinski, J. E., Klofstad, C., & Atkinson, M. D. (2016). What drives conspiratorial beliefs? The role of informational cues and predispositions. Political Research Quarterly, 69, 57-71. https://doi.org/10.1177/1065912915621621

  • Valkenburg, P. M., & Peter, J. (2013). The differential susceptibility to media effects model. Journal of Communication, 63, 221-243. https://doi.org/10.1111/jcom.12024

  • van Elk, M. (2015). Perceptual biases in relation to paranormal and conspiracy beliefs. PLoS One, 10, Article e0130422. https://doi.org/10.1371/journal.pone.0130422

  • van Prooijen, J.-W. (2017). Why education predicts decreased belief in conspiracy theories. Applied Cognitive Psychology, 31, 50-58. https://doi.org/10.1002/acp.3301

  • van Prooijen, J.-W. (2019). Belief in conspiracy theories: Gullibility or rational skepticism? In J. P. Forgas & R. F. Baumeister (Eds.), The social psychology of gullibility: Fake news, conspiracy theories, and irrational beliefs (pp. 312-322). Abingdon, United Kingdom: Routledge.

  • van Prooijen, J.-W., & Acker, M. (2015). The influence of control on belief in conspiracy theories: Conceptual and applied extensions. Applied Cognitive Psychology, 29, 753-761. https://doi.org/10.1002/acp.3161

  • van Prooijen, J.-W., & Douglas, K. M. (2017). Conspiracy theories as part of history: The role of societal crisis situations. Memory Studies, 10, 323-333. https://doi.org/10.1177/1750698017701615

  • van Prooijen, J.-W., & Douglas, K. M. (2018). Belief in conspiracy theories: Basic principles of an emerging research domain. European Journal of Social Psychology, 48, 897-908. https://doi.org/10.1002/ejsp.2530

  • van Prooijen, J.-W., Douglas, K. M., & De Inocencio, C. (2018). Connecting the dots: Illusory pattern perception predicts belief in conspiracies and the supernatural. European Journal of Social Psychology, 48, 320-335. https://doi.org/10.1002/ejsp.2331

  • van Prooijen, J.-W., Krouwel, A. P. M., & Polet, T. V. (2015). Political extremism predicts belief in conspiracy theories. Social Psychological and Personality Science, 6, 570-578. https://doi.org/10.1177/1948550614567356

  • Varis, P. (2019). Conspiracy theorising online: Memes as a conspiracy theory genre (Tilburg Papers in Culture Studies, No. 238). Retrieved from https://research.tilburguniversity.edu/en/publications/conspiracy-theorising-online-memes-as-a-conspiracy-theory-genre

  • Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359, 1146-1151. https://doi.org/10.1126/science.aap9559

  • Wagner-Egger, P., Delouvée, S., Gauvrit, N., & Dieguez, S. (2018). Creationism and conspiracism share a common teleologic bias. Current Biology, 28, R867-R868. https://doi.org/10.1016/j.cub.2018.06.072

  • Walter, D., Ophir, Y., & Jamieson, K. H. (2020). Russian Twitter accounts and the partisan polarization of vaccine discourse, 2015-2017. American Journal of Public Health, 110, 718-724. https://doi.org/10.2105/AJPH.2019.305564

  • Westergaard, R. P., Beach, M. C., Saha, S., & Jacobs, E. A. (2014). Racial/ethnic differences in trust in health care: HIV conspiracy beliefs and vaccine research participation. Journal of General Internal Medicine, 29, 140-146. https://doi.org/10.1007/s11606-013-2554-6

  • Whaley, A. L. (2001). Cultural mistrust: An important psychological construct for diagnosis and treatment of African Americans. Professional Psychology: Research and Practice, 32, 555-562. https://doi.org/10.1037/0735-7028.32.6.555

  • Whyte, K. P., & Crease, R. P. (2010). Trust, expertise, and the philosophy of science. Synthese, 177, 411-425. https://doi.org/10.1007/s11229-010-9786-3

  • Wood, M. J. (2016). Some dare call it conspiracy: Labeling something a conspiracy theory does not reduce belief in it. Political Psychology, 37, 695-705. https://doi.org/10.1111/pops.12285

  • Wood, M. J. (2017). Conspiracy suspicions as a proxy for beliefs in conspiracy theories: Implications for theory and measurement. British Journal of Psychology, 108, 507-527. https://doi.org/10.1111/bjop.12231

  • Wood, M. J., & Douglas, K. M. (2013). “What about building 7?” A social psychological study of online discussion of 9/11 conspiracy theories. Frontiers in Psychology, 4, Article 409. https://doi.org/10.3389/fpsyg.2013.00409

  • Wood, M. J., Douglas, K. M., & Sutton, R. M. (2012). Dead and alive: Beliefs in contradictory conspiracy theories. Social Psychological and Personality Science, 3, 767-773. https://doi.org/10.1177/1948550611434786

  • YouGov. (2018, December 14). Brexit and Trump voters are more likely to believe in conspiracy theories. Retrieved from https://yougov.co.uk/topics/international/articles-reports/2018/12/14/brexit-and-trump-voters-are-more-likely-believe-co