Conspiracy theories reject authoritative accounts of reality in favor of some plot involving a group of people with malevolent intent that is deliberately kept secret from the public. Consistent with this definition, a two-component, socio-epistemic model is proposed whereby belief in conspiracy theories (BCT) can be understood as a result of mistrusting conventional knowledge and authoritative accounts (“epistemic mistrust”) along with a biased appraisal of false counter-narratives (“misinformation processing”). This model provides an overarching framework for understanding BCT both universally, as a potentially normal socio-epistemic phenomenon, and at the individual level, based on the specific focus of one’s mistrust and exposure to misinformation.
Conspiracy Theories as Psychopathology
Over the past decade, conspiracy theories have entered mainstream awareness with the help of the internet and have garnered increasing attention based on their potential to steer both individual behavior and public policy in irrational directions and to cause harm (Bogart et al., 2010; Jolley & Douglas, 2014a; Jolley & Douglas, 2014b; Jolley et al., 2019; Jolley, Meleady, & Douglas, 2020; Oliver & Wood, 2014b). Consequently, research on conspiracy theories has emerged as a rapidly growing field straddling the psychological, political, and informational sciences, with discoveries about etiologies and potential interventions that have significant practical relevance.
Much of the psychological research on conspiracy theories to date has focused on the search for associations between conspiracy theories and various psychological traits or cognitive biases that might account for why some people believe in them. For example, BCT has been correlated with higher levels of certain attribution and perceptual biases (Douglas et al., 2016; van Elk, 2015; van Prooijen, Douglas, & De Inocencio, 2018; Wagner-Egger et al., 2018); conjunction fallacies (Brotherton & French, 2014); need for control, certainty, cognitive closure, and uniqueness (Imhoff & Lamberty, 2017; Lantian et al., 2017; Marchlewska, Cichocka, & Kossowska, 2018; Newheiser, Farias, & Tausch, 2011; van Prooijen & Acker, 2015); “bullshit receptivity” and lack of analytic thinking (Hart & Graether, 2018; Ståhl & van Prooijen, 2018; Swami et al., 2014); and paranoia/schizotypy (Dagnall et al., 2015; Darwin, Neave, & Holmes, 2011; Grzesiak-Feldman & Ejsmont, 2008; Hart & Graether, 2018).
This research approach has several limitations. First, it is grounded in the premise that BCT represents a kind of psychopathology (e.g. the “deficit model”), if not evidence of delusion or formal mental illness per se. However, surveys have found that the majority of respondents believe in at least one conspiracy theory (Goertzel, 1994; Oliver & Wood, 2014a, 2014b), suggesting that BCT, or what has been called generalized “conspiracist ideation” (Swami et al., 2011), is essentially a normal phenomenon. Indeed, many of the cognitive biases and other psychological quirks that have been found to be associated with BCT are universal, continuously distributed traits varying in quantity as opposed to all-or-none variables or distinct symptoms of mental illness. They are present in those who do not believe in conspiracy theories, and some of them, like need for uniqueness or closure, may be valued or adaptive in certain culturally mediated settings (Cai et al., 2018; Kossowska, Dragon, & Bukowski, 2015). Others, like schizotypy, might be tautological or redundant with respect to BCT rather than an independent or interactive cause (Hart & Graether, 2018).
Another limitation of research on conspiracy theories to date is that associations between BCT and individual psychological variables have been inconsistent across studies. For example, some studies have not found any association between BCT and cognitive closure (Imhoff & Bruder, 2014; Leman & Cinnirella, 2013) or need for certainty (Moulding et al., 2016). Other studies have reported associations between BCT and specific Big Five personality traits (Swami, Chamarro-Premuzic, & Furnham, 2010), whereas a recent meta-analysis found no such association when effect sizes were aggregated (Goreis & Voracek, 2019). Lower levels of education have been found to be associated with BCT in some studies (Douglas et al., 2016; Federico, Williams, & Vitriol, 2018; Green & Douglas, 2018; Mancosu, Vassallo, & Vezzoni, 2017; Oliver & Wood, 2014a; Richey, 2017; Stempel, Hargrove, & Stempel, 2007), but not others (Bogart et al., 2010; Goertzel, 1994; Simmons & Parsons, 2005), suggesting more complex associations with other variables (van Prooijen, 2017). Overall, such inconsistencies likely stem from the fact that researchers have utilized different surveys and scales to measure BCT while exploring associations with a limited number of individual psychological variables within any single study (Brotherton, French, & Pickering, 2013; Goreis & Voracek, 2019; Hart & Graether, 2018).
The heterogeneity of cognitive peculiarities associated with BCT might also be explained by the fact that, with some notable exceptions, research on conspiracy theories often examines BCT in general rather than within specific or individual conspiracy theories. This broad approach has presumably been rationalized based on findings that belief in one conspiracy theory predicts belief in others (Goertzel, 1994; Lewandowsky, Gignac, & Oberauer, 2013), even when they are contradictory (Wood, Douglas, & Sutton, 2012) or contrived by researchers (Swami et al., 2011). However, the idea that BCT reflects a “monological belief system” or worldview whereby conspiracy theories are justified by other conspiracy theories (Goertzel, 1994) may be short-sighted (Douglas, Uscinski, et al., 2019; Franks et al., 2017; Klein, Clutton, & Polito, 2018) and in any case steers research away from the possibility that some psychological traits might be more likely to account for beliefs in some specific conspiracies, but not others. For example, need for closure and certainty might help to explain conspiracy theories emerging in the wake of “crisis situations” such as the assassination of JFK or the death of Princess Diana (van Prooijen & Douglas, 2017), but are less obviously relevant to beliefs in a flat Earth, “chemtrails,” or “pizzagate.” These are testable hypotheses that have for the most part not been explored in research to date, suggesting the need to examine individual conspiracy theory beliefs more closely in order to avoid unsubstantiated generalizations (Enders & Smallpage, 2018; Uscinski, Klofstad, & Atkinson, 2016).
The diversity of conspiracy theories deserves greater attention in research (Franks, Bangerter, & Bauer, 2013; Franks et al., 2017; Klein, Clutton, & Polito, 2018). For example, the heterogeneity of BCT may be best understood by examining not only different conspiracy theory themes, but also other cognitive dimensions. Research on delusional thinking suggests that different delusions have distinct pathophysiologies with some common underlying features (Corlett et al., 2010) and that individuals with thematically similar delusions vary quantitatively based on different cognitive dimensions such as conviction, preoccupation, distress, or functional impact (Peters, Joseph, & Garety, 1999). This is likely true of normal beliefs and misbeliefs including those related to religion, politics, and conspiracy theories as well (Barnby et al., 2019; Franks, Bangerter, & Bauer, 2013; Pierre, 2001). Indeed, factor analytic studies have provided evidence for distinct “facets” of conspiracy theories, suggesting that they are more multidimensional than is generally appreciated and that mapping out these dimensions could help to disentangle the inconsistent findings of existing research (Brotherton, French, & Pickering, 2013; Castanho Silva, Vegetti, & Littvay, 2017). For example, quantifying BCT along the dimension of conviction could be especially useful in distinguishing between those who question authority and search for answers through online “research” (e.g. “fence sitters” with vaccine hesitancy) and those with full-blown conspiracy theory belief (e.g. “flat-Earthers” convinced of an international cover-up). Such quantitative dimensionalization may help to elucidate differences between general conspiracist ideation, more specific conspiracy theory endorsement, and points in between (Federico, Williams, & Vitriol, 2018; Franks et al., 2017).
Raab, Ortlieb, et al. (2013, p. 2) have noted that existing conspiracy theory questionnaires may draw “an artificial red line between believers and nonbelievers” of conspiracy theories, whereas BCT actually represents a narrative with “thirty shades of truth.” In keeping with this view, Stojanov and Halberstadt (2019) have recently developed a scale that captures both irrational “conspiracy theory ideation” (as measured by statements like “some things that everyone accepts as true are in fact hoaxes created by people in power”) and rational “skepticism” (e.g. “some things are not as they seem”) as separable components of BCT. In their validation study, skepticism was associated with belief in plausible accounts of generic corruption, while only conspiracist ideation predicted belief in specific implausible conspiracy theories. These findings support a model in which BCT exists on a continuum and suggest that understanding the dimensional factors that determine where a belief lies on that continuum is an essential direction for future research.
A Two-Component, Socio-Epistemic Model of Belief in Conspiracy Theories
A “two-factor model” of delusional thinking has been proposed to explain 1) why an implausible idea comes to mind in the first place and 2) why the belief is subsequently adopted rather than rejected (Davies et al., 2001; Langdon, 2011). Although the two-factor model remains unvalidated and is not without its critics (Corlett, 2019), it provides an analogous framework to explain other types of misbelief (Connors & Halligan, 2015; McKay & Dennett, 2009) including conspiracy theories. It has been noted that BCT scales used in research are designed to measure both “positive” beliefs that “malevolent groups are conspiring” and “negative” beliefs that “official accounts are false” (Douglas, Uscinski, et al., 2019, p. 6). Accordingly, a two-component, socio-epistemic model of conspiracy theories is proposed that includes distinct elements of belief negation (why one goes looking for conspiracy theories in the first place) and affirmation (why one comes to believe a particular conspiracy theory).
Component 1: Epistemic Mistrust
As noted above, previous research has suggested that generalized conspiracist ideation is a unifying force underlying all BCT (Swami et al., 2011). Conspiracist ideation has been characterized as the attribution of events to hidden, intentional forces; “a natural attraction” to “Manichean narratives” related to universal struggles between good and evil (Oliver & Wood, 2014a, p. 954); or as belief in the “essential malevolence” and deceptiveness of “officialdom” (Wood, Douglas, & Sutton, 2012, p. 772). Indeed, some researchers have developed a single-item BCT questionnaire that describes the possibility that social and political events “have been planned and secretly prepared by a covert alliance of powerful individuals or organizations” and quantifies agreement with the single statement, “I think that the official version of the events given by the authorities very often hides the truth” (Lantian et al., 2016, p. 10).
As an explanation for BCT, general conspiracist ideation has two notable shortcomings. First, it is limited by tautology (i.e. it suggests that “conspiracy theorists” prefer conspiracy theories because they find conspiracy theories appealing). Second, it implies that BCT is understandable as an individual psychological phenomenon divorced from any socio-cultural context (Klein, Clutton, & Dunn, 2019; Stempel, Hargrove, & Stempel, 2007). The two-component, socio-epistemic model of BCT proposed here instead argues that conspiracist ideation is better conceptualized as a result of epistemic mistrust. As a universal first component of BCT, epistemic mistrust refers to mistrust of knowledge or, framed within its proper socio-cultural context, mistrust of authoritative informational accounts. Epistemic mistrust is often rooted in interpersonal mistrust and can vary on a continuum from generic skepticism or epistemic vigilance on one end (Sperber et al., 2010), to a middle-ground of suspiciousness including the kind of subclinical paranoid ideation seen in schizotypy (Darwin, Neave, & Holmes, 2011; Grzesiak-Feldman & Ejsmont, 2008; Imhoff & Lamberty, 2018; Wood, 2017), to a more specific and potentially pathological focus on allegedly interconnected institutions of authority or epistemic nihilism at the extreme. According to this view, BCT does not represent a primary attraction to conspiracist narratives so much as a rejection of authoritative accounts, accepted explanations, and conventional wisdom.
The idea that mistrust lies at the root of BCT is not new (Basham, 2001; Hofstadter, 1964) and is supported by the few studies that have examined it, under varying definitions, as a correlational variable (Abalakina-Paap et al., 1999; Goertzel, 1994; Golec de Zavala & Federico, 2018; Green & Douglas, 2018; Parsons et al., 1999). Wood and Douglas (2013, p. 7) likewise found that for 9/11 conspiracy theory adherents, “the search for truth consists mostly of finding ways in which the official story cannot be true… [with] much less of a focus on defending coherent explanations that can better account for the available evidence.” Despite these associations between mistrust and BCT, however, the significance of epistemic mistrust may be underappreciated due to its limited use as an inconsistently defined variable, as well as methodological assumptions that it represents a generalized individual trait like paranoia rather than something specific to its social context. For example, generalized interpersonal mistrust is significantly associated with some types of conspiracy beliefs, but not others (Oliver & Wood, 2014a). Mistrust in government has been found to be associated with general conspiracist ideation and political conspiracy theories (Imhoff & Lamberty, 2018; Richey, 2017), but not conspiracy theories related to other themes (van Prooijen & Acker, 2015). Other studies have found BCT to be specifically associated with mistrust of authority or “high power” groups (Imhoff & Bruder, 2014; Imhoff, Lamberty, & Klein, 2018; Swami, Chamarro-Premuzic, & Furnham, 2010; Swami et al., 2011). These inconsistencies suggest that while general mistrust is associated with BCT, endorsement of specific conspiracy theories may depend on a continuum of mistrust that can be narrowly focused on individual institutions of epistemic authority.
An “ascending typology” has been proposed that charts different phases of BCT progression from stages of skeptically questioning official narratives to full-blown embrace of collective conspiracy theories and the belief that all reality is an illusion (Franks et al., 2017). Mistrust runs through these narrative accounts as a central component, supporting the view that BCT represents a continuum. According to this perspective, general conspiracist ideation that conforms with a “monological belief system” may represent an extreme that can account for the endorsement of logically contradictory conspiracy theories (Wood, Douglas, & Sutton, 2012). In contrast, epistemic mistrust can be much more focused on specific informational sources or institutions of epistemic authority including individual governments or political parties, scientific organizations, or corporations. This potentially narrow specificity of mistrust helps to account for the diversity of conspiracy theory themes and the fact that while belief in one conspiracy might predict belief in another, it does not predict belief in all others (Goertzel, 1994; Klein, Clutton, & Polito, 2018; Mancosu, Vassallo, & Vezzoni, 2017; Oliver & Wood, 2014a, 2014b; Stojanov & Halberstadt, 2019).
Indeed, contrary to the monological belief system model, BCT tends not to include ideologically inconsistent conspiracy theories that cross political party lines (Miller, Saunders, & Farhart, 2016; Oliver & Wood, 2014a). Smallpage and colleagues (2017, p. 4) reported evidence from online survey data to support the conclusion that “although conspiracy theories are often attributed to cognitive hiccups, psychological traits, or psychopathologies, they actually follow the contours of more familiar partisan battles in the age of polarization.” Studies have revealed that while “general” conspiracy theories may be equally endorsed among liberals and conservatives, the endorsement of politically relevant “ideological” conspiracies tends to be aligned with self-reported political orientation (Enders & Smallpage, 2018; Enders, Smallpage, & Lupton, 2018; Miller, Saunders, & Farhart, 2016; Oliver & Wood, 2014a; Smallpage, Enders, & Uscinski, 2017; Stempel, Hargrove, & Stempel, 2007). Others have found evidence of associations between BCT and populism (Castanho Silva, Vegetti, & Littvay, 2017) or extremism on both sides of the political fence (Krouwel et al., 2017; van Prooijen, Krouwel, & Polet, 2015) rather than any unidirectional left/right bias. Collectively, these findings support the argument that political conspiracy theories are best understood as a form of ideologically motivated reasoning, with trust demonstrated to be an important factor that can moderate motivated reasoning within BCT (Miller, Saunders, & Farhart, 2016; Saunders, 2017).
Cultural Mistrust, Trust Violations, and Conspiracy Theories
If BCT is rooted in mistrust, the inverse corollary is that trust underlies much of our belief in conventional wisdom, voices of epistemic authority, or scientific consensus. Although some epistemologists have claimed that trust and knowledge are antithetical, with trust implying ignorance, it has been argued that on the contrary, “trust in the testimony of others is necessary to ground much of our knowledge, and that this trust involves trust in the character of testifiers” (Hardwig, 1991, p. 702). It has been further noted that “trust relationships require an active choice on behalf of the trusting party” (Larson et al., 2018, p. 1599) involving “leaps of faith” (Brownlie & Howson, 2005). With particular relevance to medical conspiracy theory beliefs, Whyte and Crease (2010, p. 412) proposed that “trust means deferring with comfort and confidence to others, about something beyond our knowledge or power, in ways that can potentially hurt us.”
Although BCT is often modeled as pathological, with mistrust all but equated with paranoia, an important, if sometimes neglected, counterexample is well-illustrated by numerous studies showing that BCT is common within the African American population (Goertzel, 1994; Stempel, Hargrove, & Stempel, 2007). Among the most common conspiracy theories in this population are those related to themes of “benign neglect,” which are predicted by generalized mistrust and having been a victim of police harassment, without correlation to political alignment (Parsons et al., 1999; Simmons & Parsons, 2005). The intersection of epistemic mistrust and BCT within the African American community has been explored more specifically within Human Immunodeficiency Virus (HIV)/Acquired Immunodeficiency Syndrome (AIDS)-related conspiracy theories such as beliefs that HIV is a man-made virus created and spread by the Central Intelligence Agency or that treatments are either withheld from the African American community or cause rather than treat AIDS (Ball, Lawson, & Alim, 2013). A survey of 500 US African Americans found that over half believed that AIDS information is withheld from the public and that a cure for AIDS exists but is being withheld from the poor, with nearly half believing that HIV is a man-made virus and that those taking new antiviral medications are “human guinea pigs for the government” (Bogart & Thorburn, 2005). HIV-related conspiracy beliefs have been found to be more prevalent among African Americans compared to Whites (Clark et al., 2008; Westergaard et al., 2014) and belief in at least one HIV-related conspiracy theory has been reported in over 60% of HIV positive African Americans (Bogart et al., 2010; Bogart et al., 2016). Such conspiracy theories have important implications for healthcare based on associations between their belief and lower rates of condom use (Bogart & Thorburn, 2005), preexposure HIV prophylaxis (Brooks et al., 2018), and antiviral treatment adherence for those who are HIV positive (Bogart et al., 2010; Bogart et al., 2016).
Medical mistrust and HIV-related conspiracy beliefs within the African American community have been attributed to misinformation transmitted by word of mouth through social networks (Bogart et al., 2016; Parsons et al., 1999) as “narratives and contemporary legends” rooted in historical experiences and healthcare disparities that have engendered mistrust (Heller, 2015). Indeed, independent of BCT, it has been found that African American mistrust of white people is negatively correlated with knowledge about HIV transmission (Klonoff & Landrine, 1997) and that general medical mistrust is associated with lower adherence to HIV medications (Dale et al., 2016). Although mistrust within the African American community has been historically labeled “cultural paranoia” as if a symptom of pathology, the community’s modern mistrust in medicine may stem from lived experiences such as the 1932-1972 Tuskegee Syphilis Study, if not others that predate it (Ball, Lawson, & Alim, 2013; Heller, 2015). “Cultural paranoia” has therefore been reframed as “cultural mistrust” (Whaley, 2001), defined as a “racism reaction” characterized by “a tendency to distrust Whites based upon a legacy of direct or vicarious exposure to racism or unfair treatment by Whites” (Thompson et al., 2004, p. 210). Although mistrust involves a socio-cognitive appraisal that is potentially prone to error on a continuum with paranoia, both generalized medical mistrust and more specific HIV-related conspiracy theories within the African American community highlight that mistrust can also arise from lived experience, including violations of trust and chronic social devaluation (Davis, Wetherell, & Henry, 2018). In other words, epistemic mistrust is often associated with interpersonal and cultural mistrust that need not be pathological and on the contrary is, like trust, often earned.
Societal Threat, Intergroup Conspiracies, and Racism
In addition to institutional neglect or abuse, known instances of corruption and occasions when conspiracy theories turn out to be true provide epistemic justification for mistrust and BCT. In recognition of the sociopolitical forces driving conspiracy theories, some authors have normalized them as “general political attitudes” (Imhoff & Bruder, 2014) or merely “another type of political discourse” (Oliver & Wood, 2014a). Indeed, since conspiracy theories are essentially a form of “counter-discourse” that opposes official explanations (Sapountzis & Condor, 2013) and “speaks (un)truth to power” (Imhoff & Bruder, 2014), it could be argued that all conspiracy theories arise out of power inequities, which explains why they so often carry a political theme.
Some authors have suggested that more specific mistrust of those with political power is a defining feature of conspiracy theories (Imhoff & Bruder, 2014; Imhoff & Lamberty, 2018), with studies reporting associations between BCT and political cynicism, feelings of powerlessness, and anomie (Abalakina-Paap et al., 1999; Goertzel, 1994; Swami, Chamarro-Premuzic, & Furnham, 2010; Swami et al., 2011). However, conspiracy theories are hardly the exclusive domain of those lacking in political power, as evidenced by their embrace by some populist dictators and political parties whether in power or not (Bergmann, 2018; Castanho Silva, Vegetti, & Littvay, 2017; YouGov, 2018). Additionally, it is often minority groups lacking in power that find themselves the “targets” rather than the “perpetrators” of intergroup conspiracy theories.
It has been observed that conspiracy theories frequently emerge in the setting of political upheaval and societal crises such that they are inevitable in the wake of pivotal traumas such as the death of JFK or 9/11 (van Prooijen & Douglas, 2017). Such an association suggests that cognitive and emotional needs for closure, control, or security may be especially relevant to BCT at such times (Newheiser, Farias, & Tausch, 2011; van Prooijen & Douglas, 2017). This may be especially true when official explanations for such events are found to be wanting, either because they are still being formulated or are lacking altogether (Marchlewska, Cichocka, & Kossowska, 2018) or when there is a mismatch between the magnitude of the event (e.g. JFK’s assassination) and the available explanation (e.g. a lone gunman). The two-component, socio-epistemic model of BCT further hypothesizes that uncertainty and loss of control should be understood in relation to loss of trust in the institutions that are expected to keep people safe. This dynamic may have particular relevance to intergroup conspiracies that have been linked to both a need for control and a need to blame others (Chayinska & Minescu, 2018; Imhoff & Bruder, 2014; Kofta, Soral, & Bilewicz, 2020).
Swami (2012, p. 7) found that within a Malaysian sample, belief in a conspiracy theory about Jews was not so much related to belief in other conspiracy theories or general conspiracist ideation as it was to “ideological needs” and “specific political expressions within a particular geopolitical context.” Within an intergroup setting, such ideological needs have been identified as needs for control or to blame, along with ingroup social identification and collective narcissism (Cichocka et al., 2016; Federico, Williams, & Vitriol, 2018; Golec de Zavala & Federico, 2018; Mashuri & Zaduqisti, 2014a; Sapountzis & Condor, 2013). However, while ingroup narcissism has been found to predict BCT about outgroups, it appears to confer a kind of immunity against BCT related to one’s own ingroup (Cichocka et al., 2016). Instead, intergroup mistrust may serve as a powerful mediator between loss of control and BCT where blame is redirected from ingroup targets to mistrusted outgroups (Mashuri & Zaduqisti, 2014b).
The relevance of mistrust to intergroup conspiracy theories becomes rather more complicated when examining diverse groups operating within larger societies. For example, contrary to expectations, mistrust of one’s own government as measured by belief in political corruption has not been found to necessarily increase BCT about one’s own ingroup (Chayinska & Minescu, 2018). Instead, it appears that “system identity threats” – fears that an ingroup society is coming undone – can increase BCT about the malevolent intentions of minority ingroup members through a self-protective mechanism (Jolley, Douglas, & Sutton, 2018). Such dynamics have been used to argue that intergroup conspiracy theories play a scapegoating role in the service of maintaining a sense of control, displacing epistemic mistrust in the face of political upheaval to a minority group falsely imbued with omnipotence and nefarious intentions (Federico, Williams, & Vitriol, 2018; Imhoff & Bruder, 2014; Jolley, Douglas, & Sutton, 2018; Kofta, Soral, & Bilewicz, 2020). In this fashion, epistemic mistrust is often redirected at pre-existing targets of interpersonal and cultural mistrust in the service of protecting narcissistic needs. In other words, mistrust based on “conspiracy stereotypes” (Kofta & Sedek, 2005) can determine who becomes a scapegoat within and across societies. This intergroup dynamic is further supported by research demonstrating that at the level of the individual, the attribution of exaggerated influence and power to personal and political enemies serves to compensate for threats to control (Sullivan, Landau, & Rothschild, 2010).
Such false attributions of power and malevolent intent contrast with epistemic mistrust on the part of minority groups that stems from actual trust violations and the racial oppression discussed in the previous section. When minority groups are the targets of conspiracy theories, a fine line separates mechanisms designed to maintain feelings of ingroup control or narcissism from frank racism. There can also be a slippery slope where BCT takes the form of more pervasive xenophobia, extending beyond a single racial group to other mistrusted minorities or to immigrants more generally through a “secondary transfer effect” (Jolley, Meleady, & Douglas, 2020; Kofta & Sedek, 2005; Swami, 2012).
Component 2: Misinformation Processing
It has been argued that BCT is fundamentally rooted in ignorance or “denialism” (Kalichman, 2014). For example, Sunstein and Vermeule (2009, p. 211) claimed that people who come to believe in conspiracy theories suffer from “crippled epistemologies” whereby they “know very few things, and what they know is wrong.” The two-component, socio-epistemic model instead contends that BCT is a more active process whereby loss of trust in traditional institutions of authority results in an epistemic vacuum that can send individuals “down the rabbit hole” in search of alternative explanations. What remains to be explained is why, when individuals go looking for answers, some gravitate towards conspiracy theories. As noted previously, general conspiracist ideation by itself offers an explanation too close to tautology. When epistemic mistrust is instead recognized as a crucial first component of BCT, conspiracy theories are better understood as direct counter-narratives stemming from contested epistemic authority (Harambam & Aupers, 2015). In that sense, conspiracy theories are appealing to those with epistemic mistrust because they represent the antithesis of authoritative accounts.
The two-component, socio-epistemic model of BCT further highlights that the term “conspiracy theory” is something of a misnomer. In contrast to delusions, conspiracy theories are typically shared beliefs that lack a self-referential component (Pierre, 2020). Indeed, when individuals who mistrust authoritative accounts search for alternative answers, conspiracy theories do not arise de novo; they are already “out there,” lying in wait. In this sense, BCT often does not involve “theorizing” so much as sifting through information, deciding what to believe and what to disregard. Although some “conspiracy theorists” may be genuinely theorizing, most are crafting a narrative based on the synthesis of available information (Landrum, Olshansky, & Richards, 2019; Raab, Ortlieb, et al., 2013) and might be more appropriately described as “conspiracy theists.”
Modeling conspiracy theories along these lines, it is proposed that the second component of BCT involves vulnerability to misinformation and a biased search for narratives that counter epistemic authority. Although some authors have characterized conspiracy theorists as “naïve scientists” (Kareklas, Muehling, & Weber, 2015) or citizen scientists (Goldenberg, 2016) looking to fill “knowledge gaps” (Brownlie & Howson, 2005), empirical scientific research should not be conflated with searching for answers on the internet, where misinformation and “fake news” abound. Although there is disagreement about the extent to which gullibility contributes to BCT (Douglas, Sutton, & Cichocka, 2019), the “gullibility conspiracism hypothesis” is supported by an association between BCT and other implausible beliefs, bullshit receptivity, and cognitive biases such as conjunction fallacies, illusory pattern perception, and hypersensitive agency detection (van Prooijen, 2019). Like BCT, belief in fake news has also been found to be associated with delusion-proneness, bullshit receptivity, and reduced analytic thinking (Bronstein et al., 2019; Pennycook & Rand, 2019, 2020). These findings support the view that mistrust results in an epistemic vacuum that is easily filled by misinformation.
Specific information processing biases may help to explain the diversity and specificity of BCT. For example, recent research has applied the “differential susceptibility to media effects model” (Valkenburg & Peter, 2013) to explore why some individuals are susceptible to “viral deception” and misinformation about science-related conspiracy theories (Landrum, Olshansky, & Richards, 2019). In a series of studies, susceptibility to misinformation about a flat Earth and other scientific conspiracy theories was found to be associated not only with high levels of conspiracist ideation, but also with low scientific literacy and, in some instances, religiosity (Landrum & Olshansky, 2019; Landrum, Olshansky, & Richards, 2019). In a similar fashion, it is recognized that some conspiracy narratives may be appealing because they follow existing patterns of cultural mistrust as well as racial prejudices and “othering” behaviors within an intergroup context, as discussed previously. According to this perspective, antisemitic conspiracy theories are common because antisemitism is common, with Jewish conspiracy stereotypes forming an unfortunate part of a “collective heritage of the past” (Kofta & Sedek, 2005) that is utilized as a longstanding “universal explanatory device” (Kofta, Soral, & Bilewicz, 2020).
Exposure to misinformation in the form of conspiracy theories can strengthen BCT through a variety of mechanisms. For example, it has been demonstrated that repeated exposure to misinformation increases belief conviction based on the “illusory truth effect” (Dechêne et al., 2010; Hasher, Goldstein, & Toppino, 1977; Pennycook, Cannon, & Rand, 2018), even when preexposure knowledge is accurate (Fazio et al., 2015). In addition, Raab, Auer, et al. (2013) reported evidence that exposure to extreme or absurd statements contained in conspiracy theories can shift the window of what is regarded as plausible and reduce the salience of official accounts. Several studies have also reported that exposure to conspiracy theories and even discourse about conspiracy theories independent of endorsement can lead to greater mistrust of official views (Einstein & Glick, 2015; Imhoff, Lamberty, & Klein, 2018), suggesting a reciprocal relationship between mistrust and BCT.
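The illusory truth dynamic can be made concrete with a toy simulation. The sketch below is purely illustrative: the familiarity-driven update rule and its parameters are assumptions adopted here for demonstration, not estimates taken from the studies cited above.

```python
def rated_truth_after_exposures(prior: float, n_exposures: int,
                                familiarity_weight: float = 0.15) -> float:
    """Toy illusory-truth dynamic: each exposure nudges the rated truth of a
    statement toward 1.0 in proportion to a hypothetical familiarity weight.
    `prior` is the initial truth rating on a 0-1 scale."""
    rating = prior
    for _ in range(n_exposures):
        # Repetition increases processing fluency, which is misread as
        # evidence of truth (the illusory truth effect).
        rating += familiarity_weight * (1.0 - rating)
    return rating

# Even a statement initially rated as probably false (0.2) drifts upward:
for n in (0, 1, 3, 10):
    print(n, round(rated_truth_after_exposures(0.2, n), 2))  # 0.2, 0.32, 0.51, 0.84
```

Under this assumed update rule, conviction rises monotonically with repetition regardless of the starting rating, mirroring the finding that even accurate preexposure knowledge offers incomplete protection against repeated misinformation (Fazio et al., 2015).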
Collectively, these findings support the premise that a variety of factors related to individual and social psychology can account for specific conspiracy theory beliefs by impacting the appeal of and vulnerability to specific types of misinformation. Within this socio-epistemic framework, it is therefore necessary to integrate the psychological “problem of misinformation in the head” with the science of misinformation processing as it relates to the “problem of misinformation in the world, where inconsistent information exists across individuals, cultures, and societies” (Seifert, 2017, p. 399).
Conspiracy Theories as Online Misinformation
Although conspiracy theories were common long before the internet, with limited evidence that BCT has become more prevalent since (Douglas, Uscinski, et al., 2019), there is little doubt that the internet represents a “petri dish” where conspiracy theories can flourish. Additionally, whereas it was relatively easy to distinguish between reliable information and misinformation in the era of print journalism, when tabloids were relegated to the grocery store check-out line, reliable and fact-checked information now co-exists online alongside biased editorial opinions and tabloid journalism, all provided free of charge. The hope of this online democratization of knowledge was that the truth would “rise to the top,” but that has not occurred, and many consumers lack the skills or knowledge to determine the reliability of informational sources (Benegal, 2018; Kareklas, Muehling, & Weber, 2015; Lewandowsky, Ecker, & Cook, 2017; Raab, Ortlieb, et al., 2013).
A 2007 study found that belief in 9/11 conspiracy theories was significantly associated with being a consumer of grocery store tabloids and online blogs as opposed to more “legitimate media sources” (Stempel, Hargrove, & Stempel, 2007). Recent research has clarified that when individuals who are looking for answers online gravitate towards conspiracy theories, it is often an active process of self-selection rather than a passive effect (Klein, Clutton, & Dunn, 2019). Consistent with the two-component, socio-epistemic model of BCT, this active search for conspiracy narratives is steered by individual biases related to epistemic mistrust along with other factors such as need for certainty and closure, intergroup prejudices, and lack of analytic thinking. In the presence of such preexisting intuitions and beliefs, confirmation bias appears to figure heavily into the search for “evidence” to support conspiracy theories (van Prooijen & Douglas, 2018).
At the same time, however, it has been demonstrated that BCT is more likely when conspiracy theories are “made salient” through exposure, especially among those with a high need for cognitive closure (Marchlewska, Cichocka, & Kossowska, 2018). It has likewise been shown that online misinformation and false or “fake” news tends to travel “farther, faster, deeper, and more broadly” than truth (Vosoughi, Roy, & Aral, 2018, p. 1147). Conspiracy theories in particular seem to offer novel and provocative online content that elicits higher than average engagement on platforms like YouTube where related content is viewed in turn due to automated recommendations (Faddoul, Chaslot, & Farid, 2020). These findings validate the view that conspiracy theories can be understood as self-perpetuating internet “memes” (Pierre, 2019; Varis, 2019). It therefore appears that BCT is not only related to searches for information biased by individual consumer factors, but that those searches are further biased by the availability and salience of conspiracy theories within the architecture of the internet. A modern understanding of how misinformation processing contributes to BCT therefore requires an evolving appreciation of digital forces such as “echo chambers” and “filter bubbles” (Bakshy, Messing, & Adamic, 2015; Del Vicario et al., 2016; Flaxman et al., 2016) that can result in a kind of “confirmation bias on steroids” (Pierre, 2019).
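The compounding of individual bias and platform architecture can be sketched as a minimal feedback loop. The simulation below is a deliberately crude caricature (the single preference parameter and two-category recommender are assumptions introduced here; it does not model any real platform’s algorithm), but it illustrates how recommendation and engagement can amplify an initially mild lean in the spirit of “confirmation bias on steroids.”

```python
import random

random.seed(42)  # reproducible toy run

def filter_bubble_sim(steps: int = 50, preference: float = 0.55,
                      reinforcement: float = 0.1) -> float:
    """Toy recommender feedback loop. `preference` is the hypothetical
    probability that the user engages with conspiratorial content; the
    recommender serves such content in proportion to past engagement."""
    for _ in range(steps):
        # The platform over-serves whichever category was engaged with before.
        served_conspiratorial = random.random() < preference
        # Engagement depends on the current preference and then strengthens
        # it, biasing the next round of recommendations in turn.
        if served_conspiratorial and random.random() < preference:
            preference += reinforcement * (1.0 - preference)
    return preference

print(round(filter_bubble_sim(), 2))  # a mild initial lean (0.55) drifts toward 1.0
```

Making engagement both a function of preference and a driver of it captures the reciprocity described above: biased searches shape what the architecture serves, and what is served further biases the search.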
Misinformation Processing in a Post-Truth World
According to the “source credibility model,” the perceived credibility of information depends on the perceived expertise and trustworthiness of its communicator (Kareklas, Muehling, & Weber, 2015). But since trust serves as a “heuristic for competence” (Benegal, 2018), evaluations of expertise and trustworthiness are often one and the same, such that without trust, there are no acknowledged experts. Nichols (2014; Nichols & Smith, 2017) has characterized the modern democratization of knowledge that has been facilitated by the internet, whereby uninformed opinions are treated with false equivalence to voices of epistemic authority, as rooted in a mistrust that has resulted in the “death of expertise.”
Recently, it has been suggested that “conspiracy” and “theory” have become decoupled within a “new conspiracism” in which conspiracy theories are not so much attempts to explain anything (Rosenblum & Muirhead, 2019) as they are deceptions intent on creating a “post-truth world” (Lewandowsky, Ecker, & Cook, 2017). According to this worldview, “facts no longer matter or are not even acknowledged to exist” and have been replaced by an “alternative epistemology that does not conform to conventional standards of evidentiary support” (Lewandowsky, Ecker, & Cook, 2017, p. 356, 361). When epistemic mistrust becomes so pervasive as to reject all informational sources and abandon the very concept of objective truth, BCT more closely resembles the monological belief system extreme where logically contradictory conspiracy theories can coexist.
Numerous authors have highlighted the role of mass media and the internet in creating a world where it can be a significant challenge to distinguish between reliable information and misinformation (Benegal, 2018; Kareklas, Muehling, & Weber, 2015; Lewandowsky, Ecker, & Cook, 2017; Raab, Ortlieb, et al., 2013), with users acting as “prosumers” who simultaneously consume and produce information (Rosselli, Martini, & Bragazzi, 2016). Some conspiracy theories are started or fueled by online “trolls” and “conspiracy theory entrepreneurs” who may have low degrees of actual belief conviction themselves (Bessi et al., 2015; Sunstein & Vermeule, 2009), but are motivated to “erode trust in facts and reality” (Lewandowsky, Ecker, & Cook, 2017, p. 361), whether to sow chaos for its own sake or for profit and political propagandizing. As noted previously, the propagation of conspiracy theories has become a tool of populist movements with vested interests in discrediting “elites” (Bergmann, 2018; Castanho Silva, Vegetti, & Littvay, 2017; Rosenblum & Muirhead, 2019), while foreign governments have weaponized online mechanisms to promote conspiracy theories as a form of cyberterrorism intended to foment discord within democracies (Broniatowski et al., 2018).
Integration and Implications
As with other types of delusion-like beliefs, a broader understanding of conspiracy theories may be achieved through the integration of multiple perspectives straddling the psychological, sociological, information, and political sciences (Pierre, 2019). The two-component, socio-epistemic model of BCT proposed here draws from these perspectives in order to account for the high rate of occurrence of conspiracy theories in the general population, their variance along a continuum of belief conviction, and the inconsistency of research findings attempting to understand them as a kind of individual psychopathology according to the psychological deficit model. In doing so, it provides an overarching framework that integrates both individual psychological and interactive sociocultural factors with implications for mitigation. Figure 1 schematizes a proposed relationship between epistemic mistrust, misinformation processing, and other interactional factors that vary across different conspiracy theories and social circumstances.
Figure 1. Proposed relationships between epistemic mistrust, misinformation processing, and other interactional factors contributing to BCT.
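As a purely illustrative sketch of the interaction that Figure 1 depicts, the two components can be expressed as a multiplicative relationship in which neither focused mistrust nor misinformation exposure and biased processing alone suffices for full belief. The verbal model does not specify a functional form; the function, the 0-1 scales, and the multiplicative assumption below are hypothetical conventions adopted for illustration only.

```python
def bct_propensity(mistrust: float, exposure: float, biased_processing: float) -> float:
    """Hypothetical toy operationalization of the two-component model.

    mistrust          -- mistrust focused on the relevant epistemic authority
                         (component 1: epistemic mistrust)
    exposure          -- degree of contact with a matching counter-narrative
    biased_processing -- biased appraisal of that misinformation
                         (component 2: misinformation processing)

    All inputs and the returned belief-conviction score are on 0-1 scales.
    """
    for name, value in (("mistrust", mistrust), ("exposure", exposure),
                        ("biased_processing", biased_processing)):
        if not 0.0 <= value <= 1.0:
            raise ValueError(f"{name} must be between 0 and 1")
    # Multiplicative interaction: a vigilant skeptic with little exposure to a
    # counter-narrative, or a heavy media consumer who trusts authoritative
    # accounts, both score low; full-blown BCT requires both components.
    return mistrust * exposure * biased_processing

# A skeptic who questions authority but has little contact with the narrative:
print(round(bct_propensity(0.9, 0.1, 0.5), 3))  # 0.045
# Focused mistrust plus heavy exposure and strongly biased processing:
print(round(bct_propensity(0.9, 0.9, 0.9), 3))  # 0.729
```

This multiplicative framing also echoes the mitigation claim developed below: because conviction depends on both components, interventions that reduce either mistrust or misinformation exposure should lower it.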
The utility of the two-component, socio-epistemic model of BCT is well illustrated by considering the case example of vaccine-related conspiracy theories. The beliefs of so-called “anti-vaxxers” are varied, but at their core involve the conviction that government agencies worldwide promote and mandate vaccination despite awareness of significant vaccine harms such as causing autism. Several authors have proposed that conspiracy theories related to vaccines, ranging from the more general phenomenon of “vaccine hesitancy” to more extreme conspiracy theories, are best understood as expressions of epistemic mistrust (Goldenberg, 2016; Larson, 2018). While vaccine-related conspiracy theories have been traditionally attributed to an uninformed public according to the “knowledge deficit model,” Goldenberg has noted that they first emerged at a time of simmering public mistrust in the UK’s handling of the bovine spongiform encephalopathy outbreak of the late 1980s (Goldenberg, 2016, in press). Against this pre-existing backdrop, a 1998 case series purporting to show a link between the trivalent measles, mumps, and rubella (MMR) vaccine and autism was published in a well-respected medical journal, was promoted by an academic institution, and took over a decade to be discredited and fully retracted from publication. To date, belief in a link between the MMR vaccine and autism persists not because of a lack of exposure to corrective information or a poor public understanding of science, but because parents refuse to take the leaps of faith in trusting the medical community and scientific institutions that are necessary to allay their vaccine fears (Goldenberg, 2016, 2019, in press). Larson et al. (2018) have noted that such mistrust exists at multiple layers including in relation to vaccines and their makers, healthcare professionals, and the public healthcare systems.
With epistemic mistrust of authoritative sources attributed to perceived trust violations and corruption, attempts to dispel myths about vaccines by providing factual counterinformation have not been found to be a particularly effective strategy under experimental conditions, especially in terms of increasing actual intentions to vaccinate, where there may be a backfire effect (Jolley & Douglas, 2014b; Nyhan et al., 2014; Nyhan & Reifler, 2015). Vaccine-related conspiracy theories likewise persist in the real world despite subsequent studies repeatedly demonstrating no evidence of any increased risk of autism with vaccination (Hviid et al., 2019; Jain et al., 2015; Taylor, Swerdfeger, & Eslick, 2014). Remarkably, those who believe in vaccine-related conspiracy theories reject such evidence in favor of placing trust in the fraudulent claims of a lone physician whose original paper was retracted due to falsified data and financial conflicts of interest and who lost his medical license as a result (Deer, 2011).
BCT related to vaccines has been further bolstered by widely available online misinformation such that it has been claimed that we are currently living in a “golden age of anti-vaccine conspiracies” that are “endemic among anti-vaccination groups” (Stein, 2017, p. 168). A 2010 study found that 71% of the top 10 “hits” from Google searches using the keyword “vaccination” linked to anti-vaccination sites (Kata, 2010). Although measures have since been implemented by social media platforms to deprioritize such misinformation appearing in online searches, conspiracy theories related to vaccines and other topics continue to flourish (Elkin, Pullon, & Stubbe, 2020; Faddoul, Chaslot, & Farid, 2020). A wide variety of internet and social media sites promote pseudo-scientific tropes about vaccines, elevate anecdotal accounts of harm over objective research, and breed mistrust in authoritative sources of vaccine-related information (Evrony & Caplan, 2017; Kata, 2012). As of 2019, anti-vaccine groups on Facebook outnumbered pro-vaccine groups by more than 2:1 and were growing much faster through interaction with other groups (Johnson et al., 2020). In addition, vaccine misinformation is often either generated or recirculated by internet “bots” and Russian “trolls” with an apparent goal of amplifying debate and further undermining authoritative accounts (Broniatowski et al., 2018; Walter, Ophir, & Jamieson, 2020). The net result is that conspiracy theories about vaccines are ubiquitous on the internet, especially for those looking to fill the informational vacuum created by epistemic mistrust.
Exposure to both explicit conspiracy theories and implicit conspiracy cues has been linked to BCT as well as to anti-vaccination attitudes, including reduced intentions to vaccinate (Hornsey, Harris, & Fielding, 2018; Jolley & Douglas, 2014b; Lyons, Merola, & Reifler, 2019), such that we are now witnessing the re-emergence of diseases like measles that had previously been eliminated in some parts of the world. Vaccine-related conspiracy theories therefore provide a clear example of how epistemic mistrust and exposure to misinformation work synergistically to create BCT, with important practical consequences that underscore the need for evidence-based mitigation strategies. According to the two-component, socio-epistemic model of BCT, interventions must target both mistrust and misinformation.
With respect to reducing epistemic mistrust, it has been argued that trustworthiness must come before trust (O’Neill, 2020) such that mitigating BCT related to vaccines requires that institutions of authority take steps to regain their “social capital” (Lewandowsky, Cook, & Ecker, 2017; Ozawa, Paina, & Qiu, 2016) as “trust builders” (Brownlie & Howson, 2005) before attempting to dispel misinformation. Some evidence suggests that this could be facilitated by institutions of authority being more transparent in disclosing areas of epistemic uncertainty and ceding power to the public by inviting it “in the door” to participate interactively (Goldenberg, in press; Stilgoe, 2016) and by fostering “political efficacy” (Parsons et al., 1999). Others have suggested that the most effective interventions to reduce vaccine hesitancy may depend on one-on-one engagement, listening, and dialogue between healthcare providers and their patients (Goldstein et al., 2015; Larson, 2013, 2018). Gust et al. (2008) found that parents who resolved their vaccine hesitancy by changing their minds in favor of vaccination most frequently did so due to information or assurances from their healthcare provider. A meta-analysis of strategies to address vaccine hesitancy likewise found the most robust support for dialogue-based interventions including the use of social media (Jarrett et al., 2015). Of relevance to such dialogues, it appears that many with BCT find it objectionable to be categorized as “conspiracy theorists” (Wood & Douglas, 2013), suggesting that socially stigmatizing labels like “anti-vaxxers” or “flat-Earthers” might have the unintended effect of causing individuals with BCT to dig in their heels more deeply, increasing conviction in order to defend against perceived attacks on their identity (Wood, 2016).
In terms of correcting misinformation, education about the dangers of infectious disease was found to be effective in reducing vaccine hesitancy in one experimental study (Horne et al., 2015), but the effect was limited to “fence-sitters” who were neither strongly for nor against vaccines (Betsch, Korn, & Holtmann, 2015). This suggests that the effectiveness of mitigation strategies may vary according to belief conviction, supporting anecdotal observations that “full blown” BCT, like delusional beliefs, might be more resistant to efforts either to educate or to win back trust. To date, the few studies of mitigation strategies against other types of BCT suggest that “inoculation strategies” that offer pre-emptive warnings about exposure to misinformation and conspiracy theories are among the most evidence-based interventions (Banas & Miller, 2013; Jolley & Douglas, 2017). But like vaccines themselves, inoculation strategies are effective only when the intervention “beats misinformation to the punch.” In addition, reliable information and inoculation warnings can be thwarted by counter-inoculation strategies (e.g. inoculations against inoculations) such that efforts to change “hearts and minds” run the risk of being reduced to recursive battles over epistemic trust, with one side endlessly arguing that the other is an unreliable source of information.
Conclusion
In summary, the two-component, socio-epistemic model of BCT offers a potentially normalizing account of conspiracist ideation based on a reciprocal relationship between mistrust and belief in misinformation. Unlike paranoia, epistemic mistrust can be more broadly understood as a psychosocial phenomenon with both individual cognitive and interactional sociocultural determinants, one that can be highly specific as well as potentially appropriate to a situation. In order to gain a broader understanding of conspiracy theories, future research exploring the underpinnings of BCT would benefit from consistently measuring epistemic mistrust as a key variable (Larson et al., 2018) while devoting greater attention to the social antecedents of fractured trust relationships and the potentially racist origins of intergroup conspiracy theories. While epistemic mistrust and needs for closure or certainty may lead to searches for information that are biased in favor of Manichean narratives reflecting general conspiracist ideation, this preference must be understood in the context of online information processing. Conspiracy theory research would therefore benefit from continuing to move out of the psychology lab and into the online spaces where BCT arises.
Although the two-component, socio-epistemic model emphasizes a normalizing account of BCT, it acknowledges the harm that can accompany belief in misinformation. Accordingly, it stresses that effective mitigation requires attention on the part of institutions of epistemic authority to cultivate trust, taking care not to put the “cart” of trust before the “horse” of trustworthiness. This approach mirrors that of cognitive behavioral therapy, which requires a therapeutic alliance in order to modify cognitive distortions and misbeliefs by collaboratively analyzing evidence (Cameron, Rodgers, & Dagnan, 2018). Attempting to modify BCT within individuals should begin with questions such as “Who do you trust or mistrust, and why?” and “How do you decide what to believe?” However, given the ubiquity of conspiracy theories and the fact that most individuals with BCT are not “help-seeking,” effective mitigation strategies may necessitate wholesale approaches that 1) confer resistance against BCT by utilizing inoculation strategies that counter misinformation where it occurs (e.g. online), 2) teach analytic thinking within educational systems at an early age (Marsh & Yang, 2017; Swami et al., 2014), and 3) restructure or otherwise impose restrictions on the digital architectures that distribute information in order to label or curb misinformation and promote “technocognition” (Lewandowsky, Ecker, & Cook, 2017). These are daunting challenges that warrant considerable resources to support interventional research going forward.