Original Research Reports

Science and Politics: Do People Support the Conduct and Dissemination of Politicized Research?

Stephanie M. Anglin*a, Lee Jussimb

Journal of Social and Political Psychology, 2017, Vol. 5(1), 142–172, https://doi.org/10.5964/jspp.v5i1.427

Received: 2014-09-29. Accepted: 2016-11-02. Published (VoR): 2017-03-22.

Handling Editor: Boris Bizumic, Australian National University, Canberra, Australia

*Corresponding author at: Department of Social and Decision Sciences, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA, 15213, USA. E-mail: sanglin@andrew.cmu.edu

This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Three studies investigated how ethical people believe it is to suppress politicized research findings and how strongly they support research on politicized topics. In general, participants reported that it is unethical to suppress research findings and that they support the conduct of politicized research, regardless of whether the findings or topics supported or opposed their views. Even so, liberals and conservatives reported that it is less unethical to withhold the publication of research findings that challenge vs. support their views and expressed stronger support for research aligned with their ideology. Politically active participants were especially likely to demonstrate partisan support for science. Together, these findings suggest that although people explicitly endorse the conduct and dissemination of politicized research, their politics still influence their support for research consistent versus inconsistent with their views.

Keywords: motivated reasoning, political psychology, science attitudes, moral judgment, ethical decision-making

People often seek out and evaluate evidence in ways that are partial to their pre-existing views (Kunda, 1990; MacCoun, 1998; Nickerson, 1998). They selectively expose themselves to and easily accept information supporting their views and ignore or dismiss information opposing their beliefs (Kleck & Wheaton, 1967; Koriat, Lichtenstein, & Fischhoff, 1980). People sometimes even believe that science cannot provide answers to questions when the evidence opposes vs. supports their beliefs (Munro, 2010). If people dismiss findings that oppose their views and discount the ability of science to provide answers to questions that challenge their beliefs, people may believe that ideologically threatening findings should not be disseminated and research on topics opposing their beliefs should not be studied at all. The present research investigated whether people believe it is ethical for researchers to refrain from publishing politicized findings, whether they believe withholding the publication of research findings is more ethical when the results oppose vs. support their political views, and whether people believe research topics consistent with their political beliefs are more worthy of investigation than topics inconsistent with their views.

In general, people are skeptical of social science research, believing that it is unscientific (Lilienfeld, 2012) and less meaningful to society than other disciplines (Janda, England, Lovejoy, & Drury, 1998), and that people should trust their common sense and intuitions more than the results of social science studies (Lilienfeld, 2012). Although the majority of people hold positive views of science (Funk & Rainie, 2015), they report little trust in the accuracy and reliability of scientific findings (YouGov, 2013). People believe researchers’ ideological views distort the information reported in their studies (YouGov, 2013), attributing results to the researchers’ ideology (Anglin, 2016; MacCoun & Paletz, 2009). Certainly, researchers’ political views sometimes do distort their research (Duarte et al., 2015; Eagly, 1995; Jussim, Crawford, Anglin, & Stevens, 2015; Jussim, Crawford, Anglin, Stevens, & Duarte, 2016; Jussim, Crawford, Stevens, & Anglin, 2016; Jussim, Crawford, Stevens, Anglin, & Duarte, 2016; Redding, 2001). However, people are more likely to view research as biased when the results challenge vs. support their views (Anglin, 2016).

Indeed, people often accept evidence confirming their beliefs at face value while critically evaluating information disconfirming their views. They rate research reports as more convincing, methodologically sound, and worthy of publication when the results support vs. oppose their political views (e.g., Ditto & Lopez, 1992; Edwards & Smith, 1996; Klaczynski, 2000; Klaczynski & Gordon, 1996; Lord, Ross, & Lepper, 1979; MacCoun & Paletz, 2009; Miller, McHoskey, Bane, & Dowd, 1993; Munro & Ditto, 1997; Pomerantz, Chaiken, & Tordesillas 1995; Zuwerink Jacks & Devine, 1996; Zuwerink Jacks & Devine, 2000). When presented with counter-attitudinal evidence, people analyze the information longer, generate more counterarguments, and list more flaws with the reasoning of the argument or the methodology of the research than they do in response to confirmatory evidence (Ditto & Lopez, 1992; Ditto, Scepansky, Munro, Apanovitch, & Lockhart, 1998; Lord et al., 1979; Munro & Ditto, 1997; Taber & Lodge, 2006). Moreover, after reading about research challenging their views, people not only dismiss the results but are more likely to discount the ability of science to provide answers to the research question altogether (Munro, 2010).

Although research on confirmation bias has consistently shown that people more readily accept ideologically congruent vs. incongruent evidence, it is unknown whether people believe researchers should only examine research topics consistent with their views and should refrain from publishing findings opposing their beliefs. Do people think only research topics congruent with their political ideology are important to study, or do they support the investigation of all topics, even when the research question challenges their ideological stance? If people are more likely to trust results from ideologically congruent than ideologically incongruent studies (Anglin, 2016; Ditto & Lopez, 1992; Ditto et al., 1998; MacCoun & Paletz, 2009; Munro, 2010; Munro & Ditto, 1997), they may favor investing resources in ideologically congruent research over ideologically incongruent research. From a Bayesian perspective (Fischhoff & Beyth-Marom, 1983; Koehler, 1993; MacCoun, 1998; Tversky & Kahneman, 1974), supporting the conduct of research consistent with one’s prior views over research inconsistent with one’s prior views may be defensible; though in an ideal world, people would support non-directional research, in which researchers attempt to falsify hypotheses (Popper, 1934) and test competing hypotheses, rather than a single directional hypothesis (Jussim, Crawford, Stevens, Anglin, & Duarte, 2016; Tetlock, 1994). However, even if doubting belief-incongruent findings can be justified, endorsing the suppression of such findings after they have already been obtained is more extreme, less justifiable, and threatening to the integrity of the scientific enterprise. Do people believe it is ethical to refrain from publishing results that oppose their views but believe it is unethical to withhold publication when the findings support their views? Or do people think it is unethical for researchers to refrain from publishing findings, regardless of the political implications of the results? The present research tested these questions by assessing liberals’ and conservatives’ (1) judgments of the ethicality of suppressing left and right-wing research findings and (2) support for the conduct of left and right-wing research.

By examining whether people uphold the advancement of knowledge or their political values when pitted against each other, this research has potential to advance theory and research in several domains, including motivated reasoning, political psychology, moral decision-making, public opinion, and science and politics. Furthermore, researchers have argued that resistance toward certain research findings and uncritical acceptance of others adversely affects decisions and practices in every sector of society, leading to ineffective laws, policies, educational programs, and medical treatments (Lilienfeld, 2012; Teo, 2012). Believing that particular topics should not be studied and that certain findings should not be published poses an even greater obstacle to the progress of science. Such beliefs may influence the topics that get studied and the results that get disseminated, limiting the scope of knowledge, biasing education and training, reducing funding and support for particular research topics, and further compromising the development of effective applications in the public domain. Moreover, failing to support diversity of perspectives and ideas may foster negative attitudes toward science among particular individuals and groups, discouraging them from pursuing scientific study and reducing trust in science.

Motivated Reasoning, Science, and Politics

Research from various literatures (e.g., motivated cognition, moral psychology, political psychology) has repeatedly shown that liberals’ and conservatives’ judgments are influenced by their value motivations to similar degrees (e.g., Crawford, 2012; Ditto et al., 2016; Kahan, Peters, Dawson, & Slovic, 2013; Lord et al., 1979; Taber & Lodge, 2006; Uhlmann, Pizarro, Tannenbaum, & Ditto, 2009). Across party lines, affect often guides reasoning in moral domains, such that people automatically judge information based on whether it is congruent with their morals and values and then seek to justify their affective response with objective arguments and facts (Haidt, 2001; Munro & Ditto, 1997). Based on prior studies demonstrating similar levels of motivated reasoning bias among liberals and conservatives, we expected that conservatives would endorse the ethicality of suppressing left over right-wing research to a similar extent as liberals endorse the ethicality of suppressing right over left-wing research, and conservatives would support right-wing over left-wing research to a similar extent as liberals support left over right-wing research.

Although studies in the motivated reasoning literature repeatedly demonstrate symmetric bias in liberals’ and conservatives’ evaluation of evidence (Ditto et al., 2016), liberals and conservatives have different attitudes toward and experiences with science that may affect whether they both exhibit partisan support for the conduct and dissemination of politicized research. For example, research has shown that conservatives distrust science more than liberals (Gauchat, 2012; YouGov, 2013), possibly because scientists are overwhelmingly liberal (Gross & Simmons, 2007; Klein & Stern, 2009), and in some fields, biased against conservative ideas (Inbar & Lammers, 2012; Redding, 2013) in favor of liberal narratives (Duarte et al., 2015). Indeed, conservatives are more likely than liberals to worry that results from scientific studies are influenced by researchers’ political views (YouGov, 2013). Conservatives attribute liberal findings to the researcher’s liberalism, whereas liberals and conservatives do not always attribute conservative findings to the researcher’s conservatism (MacCoun & Paletz, 2009). Because conservatives distrust science more than liberals, conservatives may believe it is more ethical to suppress left than right-wing research and favor the conduct of right over left-wing research, whereas liberals believe it is equally unethical to suppress left and right-wing research and support the conduct of left and right-wing research equally.

Alternatively, bias may be observed among liberals but not conservatives. Because scientists are overwhelmingly liberal, they disproportionately study topics with liberal themes (Jussim et al., 2015; Redding, 2001). Not only is science biased toward the left, but increasing evidence supports the existence of a hostile environment toward conservatives in academia. Many non-liberal researchers have reported experiences of discrimination against their ideas from their liberal colleagues (Stevens et al., 2017), and researchers in some fields have admitted that they would discriminate against conservatives and conservative ideas when making publication, grant, hiring, and symposia decisions (Inbar & Lammers, 2012). Because liberal laypeople may also believe that science should continue to operate as a liberal enterprise, liberals may believe it is more ethical to suppress right than left-wing findings and support the conduct of left over right-wing research, whereas conservatives believe it is equally unethical to suppress left and right-wing research and support the conduct of left and right-wing research equally.

At the same time, results may partially support both the symmetric bias hypotheses and either the right or left-wing bias hypotheses. For example, both liberals and conservatives may show partisan endorsement of research suppression and partisan support for science, but conservatives may exhibit more partisan bias than liberals because of their lower support for science. Alternatively, both liberals’ and conservatives’ judgments may be biased by their political views, but liberals may show more partisan bias because they are motivated to maintain science as a liberal enterprise.

Finally, it was also possible that the present research would find no evidence of ideological bias. Liberals and conservatives value fairness equally (Graham, Haidt, & Nosek, 2009), and although people are motivated to defend their beliefs and values, judgment is not completely impaired by motives and goals. In fact, people are also motivated to be accurate (Hart et al., 2009), and many people recognize the complexities surrounding sociopolitical issues and the pros and cons of various political policies (Howard-Pitney, Borgida, & Omoto, 1986; Tetlock, 1986). Moreover, participants in this research were explicitly asked whether they believe it is ethical for researchers to refrain from publishing politicized findings and how strongly they support research on various topics. When asked directly, participants may report that it is unethical to refrain from publishing results and may express support for all research topics, regardless of their true attitudes, as a result of social desirability biases (Paulhus, 1991).

Hypotheses

Primary Alternative Hypotheses

Based on previous research demonstrating similar levels of motivated reasoning among liberals and conservatives, we expected to find evidence of symmetric bias in this research. We predicted that conservatives would more strongly endorse the ethicality of suppressing research favoring left than right-wing political goals to the same extent that liberals more strongly endorse the ethicality of suppressing research favoring right than left-wing goals (Hypothesis 1a). Likewise, we predicted that conservatives would support right over left-wing research to about the same extent that liberals support left over right-wing research (Hypothesis 1b).

In addition to testing for symmetric bias, we examined five plausible alternative hypotheses stemming from the other research and theory reviewed above. Because conservatives distrust science more than liberals (e.g., Gauchat, 2012), the right-wing bias hypothesis predicted that conservatives would more strongly endorse the ethicality of suppressing research favoring left than right-wing political goals, whereas liberals would be equally unwilling to endorse the ethicality of suppressing research favoring either left or right-wing goals (Hypothesis 2a). A second right-wing bias hypothesis predicted that conservatives would support right over left-wing research, whereas liberals would support both types of research equally (Hypothesis 2b).

Because liberals may desire for science to continue to operate as a liberal enterprise (Inbar & Lammers, 2012; Redding, 2001), the left-wing bias hypothesis predicted that liberals would more strongly endorse the ethicality of suppressing research favoring right vs. left-wing goals, whereas conservatives would be equally unwilling to endorse the ethicality of suppressing research favoring either left or right-wing goals (Hypothesis 3a). A second left-wing bias hypothesis predicted that liberals would support left over right-wing research, whereas conservatives would support both types of research equally (Hypothesis 3b).

Alternatively, both liberals and conservatives may show partisan endorsement of research suppression and partisan support for science, but one group may exhibit more partisan bias than the other. The asymmetric right-wing bias hypothesis predicted that conservatives would more strongly endorse the ethicality of suppressing research favoring left vs. right-wing goals than liberals endorse the ethicality of suppressing research favoring right vs. left-wing goals (Hypothesis 4a). A second asymmetric right-wing bias hypothesis predicted that conservatives would support right over left-wing research more than liberals support left over right-wing research (Hypothesis 4b). The asymmetric left-wing bias hypothesis predicted that liberals would more strongly endorse the ethicality of suppressing right vs. left-wing research than conservatives endorse the ethicality of suppressing left vs. right-wing research (Hypothesis 5a). A second asymmetric left-wing bias hypothesis predicted that liberals would support left over right-wing research more than conservatives support right over left-wing research (Hypothesis 5b).

Lastly, the no ideological bias hypotheses predicted that liberals and conservatives would oppose suppressing research favoring left-wing goals to the same extent they oppose suppressing research favoring right-wing goals (Hypothesis 6a), and would support left-wing research to the same extent they support right-wing research (Hypothesis 6b).

Potential Moderators

Political activism

Some research suggests that individuals with greater knowledge and expertise on a topic are more susceptible to motivated reasoning biases (e.g., Kahan, 2013; Liu, 2016). Therefore, participants higher in political activism may show greater partisan endorsement of research suppression and greater partisan support for research than those lower in political activism. To test this prediction, we examined the possible moderating effect of political activism on ideology in predicting responses to the left and right-wing research suppression and support for science measures.

Right-wing authoritarianism

Right-wing authoritarianism (RWA) is a construct correlated with but distinct from conservatism (Crowson, Thoma, & Hestevold, 2005). Research has shown that right-wing authoritarians readily submit to authority figures and tend to favor right-wing politics, particularly on sociocultural issues (Altemeyer, 1988, 2006; Cornelis & Van Hiel, 2006; Duckitt, 2006). The research suppression questions in this study describe people in positions of power suppressing science to advance political goals on social issues. Therefore, right-wing authoritarians may endorse the ethicality of suppressing research findings, especially results advancing left-wing goals. In addition, because right-wing authoritarians favor right-wing politics, we expected them to support right over left-wing research. The present research tested these predictions. To explore possible differences in responses between conservatives high and low in RWA, we also examined, as exploratory analyses, whether RWA interacted with ideology in predicting responses to the left and right-wing measures.

Method

In this research, we developed measures assessing how ethical people believe it is for researchers to refrain from publishing research findings favoring left and right-wing political goals (left and right-wing research suppression) and items assessing support for research with left and right-wing political themes (support for left and support for right-wing research). We also developed items assessing how ethical people believe it is for researchers to withhold the publication of politicized research in general (i.e., without specifying a particular political issue; nonpartisan research suppression) to examine whether people believe it is ethical to refrain from publishing politicized findings, regardless of which political group the results favor. See the Appendix to view the items on each measure.

In addition to the items on these five central measures of interest, participants also answered questions assessing other forms of political bias in the conduct and dissemination of research (e.g., flawed research design, bias and discrimination in hiring, media representation, and government funding and application). Results for these additional measures are available in the Supplementary Materials for interested readers.

We conducted three studies to assess the psychometric soundness of our measures, test the competing hypotheses, and examine the replicability of our findings. Because similar methods were employed across the three studies, we present them together here.

Disclosure Statement

We report all data exclusions, manipulations (there were none), and measures, with the exception of an open-ended question assessing participants’ occupation, which was not analyzed.

Participants

Participants were recruited from Amazon’s Mechanical Turk. All users fluent in English, citizens of the United States, and above the age of 18 were eligible to participate. In Study 1, a total of 123 participants (54 women, 65 men, 3 transgender, 1 unreported), ranging in age from 18 to 70 (M = 34.19, SD = 13.36), completed the study in exchange for $0.25. In Study 2, a total of 200 participants (98 women, 99 men, 2 transgender, 1 unreported), ranging in age from 18 to 74 (M = 33.94, SD = 12.76), completed the study for $0.40. In Study 3, a total of 401 participants (203 men, 196 women, 2 transgender), ranging in age from 18 to 80 (M = 35.02, SD = 14.15), completed the study for $0.40. See Table 1 for frequencies of right-leaning, moderate, and left-leaning participants and the Supplementary Materials (Table S1) for a comparison of the political orientation of the sample to the distribution from the General Social Survey.

Table 1

Frequency of Right-Leaning, Moderate, and Left-Leaning Participants

Ideology Study 1 (N = 123) Study 2 (N = 200) Study 3 (N = 401)
Right-leaning (Ideology = 1 to 3) 28 30 73
Moderate (Ideology = 3.5 to 4.5) 47 69 104
Left-leaning (Ideology = 5 to 7) 48 101 224

Note. The ideology variable was created by averaging participants’ self-reported political orientation (1 = Very conservative, 7 = Very liberal) and political affiliation (1 = Strong Republican, 7 = Strong Democrat) scores. Those with ideology scores below 3.5 were identified as right-leaning, between 3.5 and 4.5 as moderate, and above 4.5 as left-leaning.
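As an illustration of this coding scheme, below is a minimal sketch in Python (pandas); the data frame and column names are hypothetical, and the handling of scores falling exactly on the 3.5 and 4.5 boundaries is an assumption.

```python
import pandas as pd

def code_ideology(df: pd.DataFrame) -> pd.DataFrame:
    """Average political orientation and affiliation (both 1-7) into an ideology
    score, then bin participants into the three groups used in Table 1."""
    df = df.copy()
    df["ideology"] = df[["political_orientation", "political_affiliation"]].mean(axis=1)
    df["ideology_group"] = pd.cut(
        df["ideology"],
        bins=[1, 3.5, 4.5, 7],      # below 3.5, 3.5 to 4.5, above 4.5 (per the note)
        labels=["right-leaning", "moderate", "left-leaning"],
        include_lowest=True,        # exact boundary handling is an assumption
    )
    return df
```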

Power

The symmetric, left-wing, and right-wing bias hypotheses were tested using mixed-model ANOVAs, comparing liberals’ and conservatives’ responses to the left and right-wing measures (i.e., a within-between subject interaction). A total sample of N = 34 (liberals and conservatives) is necessary to detect medium size effects (f = .25) at 80% power, and N = 200 (liberals and conservatives) to detect small effects (f = .10) at 80% power. Given the unequal number of liberals and conservatives in the subsamples, Study 1 (n = 28 conservatives, n = 48 liberals) and Study 2 (n = 30 conservatives, n = 101 liberals) were underpowered to detect small effects. Study 3, however, was more adequately powered to detect small effects (with n = 73 conservatives and n = 297 liberals). The asymmetric bias hypotheses were tested using planned contrasts comparing the magnitude of the difference in liberals’ and conservatives’ responses to the left and right-wing measures. With unequal groups (i.e., a 3:1 ratio of liberals to conservatives), a total sample size of 170 is needed to detect medium size effects at 80% power, and 1050 is needed to detect small effects. Therefore, all three studies were underpowered to detect small asymmetric bias effects.
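For readers who wish to reproduce these figures, the sketch below shows one way to compute power for a within-between interaction using the noncentral F distribution, in the style of G*Power. The settings (two groups, two repeated measurements, a default correlation of .5 among repeated measures, α = .05) are assumptions, since the section does not report which software or options were used.

```python
from scipy.stats import f as f_dist, ncf

def interaction_power(f_effect, n_total, n_groups=2, n_meas=2, corr=0.5, alpha=0.05):
    """Approximate power for the group x measurement interaction in a mixed
    (within-between) ANOVA, using the G*Power-style noncentrality parameter
    lambda = f^2 * N * m / (1 - rho). All defaults here are assumptions."""
    lam = f_effect ** 2 * n_total * n_meas / (1 - corr)
    df1 = (n_groups - 1) * (n_meas - 1)
    df2 = (n_total - n_groups) * (n_meas - 1)
    f_crit = f_dist.ppf(1 - alpha, df1, df2)
    return 1 - ncf.cdf(f_crit, df1, df2, lam)

print(interaction_power(0.25, 34))    # medium effect, N = 34 -> roughly .80
print(interaction_power(0.10, 200))   # small effect, N = 200 -> roughly .80
```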

Materials

See the Appendix to view the research suppression and support for science measures. Participants rated all items on 7-point scales. For the research suppression measures, low scores represented believing that suppressing politicized findings is unethical and high scores represented believing research suppression is ethical. On the support for science measures, low scores represented no support for the research topic and high scores indicated strong support.

Left-Wing Research Suppression

We developed three items assessing endorsement of the ethicality of suppressing research findings favoring left-wing political goals (e.g., “A group of researchers hypothesize that racial differences in income can be attributed to factors other than racism. How ethical is it for them to refuse to publish results from their study because the findings suggest that racism explains racial differences in income?”). The reliability of this measure was high (α’s > .80 across all three studies).

Right-Wing Research Suppression

Three items were also developed to assess endorsement of the ethicality of suppressing research findings favoring right-wing political goals (e.g., “A group of researchers hypothesize that gender differences in income can be attributed to sexism. How ethical is it for them to refuse to publish the results from their study because the findings suggest that gender differences in income do not result from sexism?”). The reliability of this measure was high (α’s > .77 across all three studies).

Nonpartisan Research Suppression

An additional three items assessed general endorsement of the ethicality of suppressing politicized research findings (e.g., “How ethical is it for scientists to refuse to publish results from their research that are inconsistent with their political views?”). These items hung together well (α’s > .83 across all three studies).

Support for Left and Right-Wing Research

Five items were developed to measure support for left-wing research (e.g., “How important is it for a scientific journal to publish a research article on the need for alternative energy sources?”; α’s > .83 across all three studies), and five items were developed to measure support for right-wing research (e.g., “How important is it for a scientific journal to publish a research article demonstrating that racial profiling strengthens homeland security?”; α’s > .82 across all three studies).
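The reliabilities reported for these measures are Cronbach’s alphas. A minimal sketch of that computation, applicable to any of the item sets above (the data frame of item columns is hypothetical):

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of item columns (rows = participants)."""
    items = items.dropna()
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# e.g., cronbach_alpha(df[["support_left_1", "support_left_2", "support_left_3",
#                          "support_left_4", "support_left_5"]])
```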

Ideology Measure

Participants reported their political orientation and affiliation on 7-point scales (1 = Very conservative/Strong Republican, 7 = Very liberal/Strong Democrat). These items were averaged to create a single measure of ideology (α’s > .81 across all three studies).

Political Activism

Participants also rated their level of political activism on a scale from 1 (Not at all politically active) to 7 (Very politically active).

Procedure

Study 1

This study was advertised as a survey on science and politics. Participants first completed the nonpartisan research suppression questions, presented in randomized order. Next, participants completed the left and right-wing research suppression items. These items were randomly presented rather than grouped by subscale. After the research suppression items, participants responded to the support for left and support for right-wing research questions. These items were also randomly presented rather than grouped by measure. Last, participants provided their demographic information and indicated the extent to which they found the questions in the study difficult to understand and believed their responses reflected their honest attitudes and opinions.

Study 2

The following procedural changes were made in Study 2. First, we changed the order of the questionnaires to reduce social desirability concerns. The nonpartisan research suppression questions pit science against politics most blatantly. Answering the nonpartisan questions first may have increased participants’ reluctance to endorse the subsequent left and right-wing research suppression items. Therefore, we presented the nonpartisan items after the left and right-wing research suppression questions to eliminate any tendency for responses to the nonpartisan questions to influence responses to the left and right-wing suppression items.

Second, we grouped the items from each measure together rather than randomizing the order of presenting left and right-wing items in order to increase ease of understanding, reduce satisficing, and improve response quality (Krosnick, 1999). The order of presenting the left and right-wing research suppression measures was counterbalanced, as was the order of presenting the support for left and support for right-wing research questions.

Third, we inserted an attention check midway through the study to permit us to determine whether there were differences in responses among participants who seemed to be paying more versus less attention to the questions. Modeling Oppenheimer, Meyvis, and Davidenko (2009), participants were instructed to answer ‘4’ to a series of three questions (“In what year were you born?”, “What is the current year?”, and “How tall are you?”), even though this response would not make sense in the context of the questions. Participants were explicitly told that we were checking to see if they were paying attention. Participants who answered 4 to these questions passed the attention check; those who provided actual responses did not.
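A minimal sketch of how the resulting pass/fail flag could be computed (the column names are hypothetical):

```python
import pandas as pd

def passed_attention_check(df: pd.DataFrame) -> pd.Series:
    """True for participants who answered '4' to all three instructed-response items."""
    check_cols = ["check_birth_year", "check_current_year", "check_height"]  # hypothetical
    return (df[check_cols] == 4).all(axis=1)

# df["passed_check"] = passed_attention_check(df)
# df["passed_check"].mean()  # proportion passing (74% in Study 2, 85% in Study 3)
```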

Finally, in Study 2, we added two questions assessing general support for scientific research (“To what extent do you support scientific research?” and “How often do you trust the results of scientific studies?”). Participants rated these items on 7-point scales, with higher numbers indicating greater general support for science (α = .74).

Study 3

Study 3 methods were identical to Study 2, except that we included a measure of RWA. Participants completed the RWA scale (Altemeyer, 2006), which contains 22 items assessing the degree to which individuals submit to authority figures, adhere to conventions, and seek to punish those who do not adhere to conventions (e.g., “The ‘old-fashioned ways’ and the ‘old-fashioned values’ still show the best way to live”). Participants rated items on a 9-point scale ranging from -4 (very strongly disagree) to +4 (very strongly agree); higher numbers indicated stronger authoritarianism (α = .96).

Results

Factor Structure

Preliminary Analyses

Skewness and kurtosis estimates were nonsignificant for all indicators, except for one indicator in Study 3 with a kurtosis value > 2 (though < 3). Because this value was close to the cutoff, and because Kline (2011) does not recommend transforming variables with kurtosis values < 10, no transformations were applied. There were no missing values.
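As an illustration of this screening step, the sketch below computes each indicator’s skewness and excess kurtosis together with rough z statistics based on the usual large-sample standard errors; whether the authors tested significance exactly this way is an assumption.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def screen_indicator(x):
    """Skewness and excess kurtosis with rough z-tests using the large-sample
    standard errors sqrt(6/N) and sqrt(24/N)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sk, ku = skew(x), kurtosis(x)            # kurtosis() returns excess kurtosis
    return {
        "skew": sk, "z_skew": sk / np.sqrt(6 / n),
        "kurtosis": ku, "z_kurtosis": ku / np.sqrt(24 / n),
    }
```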

Measurement Model

The measurement model was assessed with confirmatory factor analysis using SPSS AMOS. Five factors were specified on the predicted model (left-wing research suppression, right-wing research suppression, non-partisan research suppression, support for left-wing research, and support for right-wing research), which were allowed to correlate in the model. See the Appendix to view the indicators specified on each factor.
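The model was estimated in SPSS AMOS; purely as an illustration, a comparable five-factor CFA could be specified in Python with the semopy package roughly as sketched below. The indicator names are hypothetical placeholders for the Appendix items, and the exact syntax should be checked against the semopy documentation.

```python
import pandas as pd
import semopy

model_desc = """
left_suppression        =~ ls1 + ls2 + ls3
right_suppression       =~ rs1 + rs2 + rs3
nonpartisan_suppression =~ np1 + np2 + np3
support_left            =~ sl1 + sl2 + sl3 + sl4 + sl5
support_right           =~ sr1 + sr2 + sr3 + sr4 + sr5
"""
# factor covariances can be written explicitly with '~~' lines if they are not
# estimated by default in the installed version

data = pd.read_csv("item_responses.csv")   # hypothetical participant-by-item file
model = semopy.Model(model_desc)
model.fit(data)
print(semopy.calc_stats(model))            # chi-square, RMSEA, CFI, TLI, and related indices
```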

Several fit indices (model χ2, RMSEA, CFI, TLI, and CMIN/df) were examined to evaluate the goodness of fit of the measurement model. A nonsignificant χ2 indicates good fit. RMSEA values ≤ .05 indicate good fit, and values between .06 and .08 indicate moderate fit (Hu & Bentler, 1999). CFI and TLI values range from 0 to 1, with values close to 1 demonstrating very good fit (Hu & Bentler, 1999). CMIN/df values < 2 represent good fit, and values between 2 and 3 represent moderate fit. Individual parameters were considered significant at the α = .05 level.

Across all three studies, estimation of the full predicted model demonstrated good overall model fit. Although the model χ2 was significant across all three studies (see Table 2), the values for the other fit indices indicated moderately good to very good model fit (across the three studies, RMSEA = .05-.07, CFI = .94-.96, TLI = .93-.95, CMIN/df = 1.51-2.52; see Table 2). See Table 3 for parameter estimates and the Supplementary Materials (Table S2) for factor loadings for each study.

Table 2

Fit Indices

Model χ2 df RMSEA CFI TLI CMIN/df
Study 1
Full predicted five-factor model 214.71*** 142 .065 .94 .93 1.51
Full three-factor alternative model 558.18*** 149 .150 .63 .64 3.75
Partial predicted three-factor suppression model 36.79* 24 .066 .98 .98 1.53
Partial alternative one-factor suppression model 55.32*** 27 .093 .96 .95 2.05
Partial predicted four-factor model 149.31*** 98 .066 .95 .94 1.52
Partial alternative two-factor model 482.19*** 103 .174 .63 .57 4.68
Study 2
Full predicted five-factor model 232.07*** 142 .056 .96 .95 1.63
Full three-factor alternative model 959.61*** 149 .165 .64 .58 6.44
Partial predicted three-factor suppression model 44.78** 24 .066 .99 .98 1.87
Partial alternative one-factor suppression model 70.50*** 27 .090 .97 .96 2.61
Partial predicted four-factor model 159.83*** 98 .056 .96 .95 1.63
Partial alternative two-factor model 877.27*** 103 .194 .52 .45 8.52
Study 3
Full predicted five-factor model 357.38*** 142 .062 .94 .93 2.52
Full three-factor alternative model 1826.51*** 149 .168 .57 .50 12.26
Partial predicted three-factor suppression model 69.26*** 24 .069 .98 .97 2.89
Partial alternative one-factor suppression model 123.23*** 27 .094 .95 .94 4.56
Partial predicted four-factor model 272.20*** 98 .067 .94 .93 2.78
Partial alternative two-factor model 1216.83*** 103 .164 .61 .55 11.81
Table 3

Parameter Estimates for Full Predicted Five-Factor Model

Factor relationships Unstandardized estimate SE Standardized estimate
Study 1
Left-wing suppression ↔ Right-wing suppression 1.62*** 0.25 .97
Left-wing suppression ↔ Nonpartisan suppression 1.46*** 0.24 .90
Left-wing suppression ↔ Support for left-wing research -0.52** 0.17 -.38
Left-wing suppression ↔ Support for right-wing research -0.60** 0.16 -.48
Right-wing suppression ↔ Nonpartisan suppression 1.56*** 0.26 .88
Right-wing suppression ↔ Support for left-wing research -0.50** 0.18 -.34
Right-wing suppression ↔ Support for right-wing research -0.57* 0.17 -.41
Nonpartisan suppression ↔ Support for left-wing research -0.38* 0.17 -.26
Nonpartisan suppression ↔ Support for right-wing research -0.42** 0.16 -.32
Support for left-wing research ↔ Support for right-wing research 0.81*** 0.19 .72
Study 2
Left-wing suppression ↔ Right-wing suppression 1.59*** 0.21 .92
Left-wing suppression ↔ Nonpartisan suppression 1.60*** 0.21 .91
Left-wing suppression ↔ Support for left-wing research -0.36** 0.13 -.24
Left-wing suppression ↔ Support for right-wing research -0.25* 0.11 -.19
Right-wing suppression ↔ Nonpartisan suppression 1.69*** 0.21 .96
Right-wing suppression ↔ Support for left-wing research -0.60*** 0.14 -.39
Right-wing suppression ↔ Support for right-wing research -0.27* 0.12 -.20
Nonpartisan suppression ↔ Support for left-wing research -0.54*** 0.14 -.35
Nonpartisan suppression ↔ Support for right-wing research -0.20 0.11 -.15
Support for left-wing research ↔ Support for right-wing research 0.61*** 0.13 .53
Study 3
Left-wing suppression ↔ Right-wing suppression 1.15*** 0.11 .86
Left-wing suppression ↔ Nonpartisan suppression 1.15*** 0.11 .86
Left-wing suppression ↔ Support for left-wing research -0.32*** 0.08 -.25
Left-wing suppression ↔ Support for right-wing research -0.39*** 0.09 -.27
Right-wing suppression ↔ Nonpartisan suppression 1.19*** 0.11 .93
Right-wing suppression ↔ Support for left-wing research -0.43*** 0.08 -.36
Right-wing suppression ↔ Support for right-wing research -0.29*** 0.08 -.21
Nonpartisan suppression ↔ Support for left-wing research -0.41*** 0.08 -.34
Nonpartisan suppression ↔ Support for right-wing research -0.28*** 0.08 -.21
Support for left-wing research ↔ Support for right-wing research 0.83*** 0.10 .64

*p < .05. **p <.01. ***p < .001.

Measurement Model Comparisons

Several alternative models were also examined. First, a three-factor model was examined by specifying all research suppression items on a single factor and the support for left-wing research and support for right-wing research items on separate factors. Across the three studies, this three-factor model fit the data poorly (χ2’s > 558.18, RMSEA = .15-.17, CFI = .57-.64, TLI = .50-.64, CMIN/df = 3.75-12.26; see Table 2). Chi-square difference tests indicated superior fit of the predicted model across the studies (see Table 4). To provide a more direct test of whether the left, right, and nonpartisan research suppression items were best specified on a single factor or separate factors, a three-factor model including only the suppression factors was compared to a single factor model with all research suppression items specified to load on a single factor (the support for left and right-wing research items were not included in this model). The partial predicted three-factor suppression model had superior fit to the partial alternative single factor suppression model in all three studies (see Tables 2 and 4).
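The nested-model comparisons in Table 4 are standard chi-square difference tests on the model χ2 and df values in Table 2; as a quick check, a minimal sketch using the Study 1 values reported above:

```python
from scipy.stats import chi2

def chi_square_difference(chi2_restricted, df_restricted, chi2_full, df_full):
    """Chi-square difference test for nested models."""
    d_chi2 = chi2_restricted - chi2_full
    d_df = df_restricted - df_full
    return d_chi2, d_df, chi2.sf(d_chi2, d_df)

# Study 1: three-factor alternative (558.18, df = 149) vs. five-factor model (214.71, df = 142)
print(chi_square_difference(558.18, 149, 214.71, 142))   # -> (343.47, 7, p < .001)
```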

Table 4

Nested Model Comparisons

Study χ2diff dfdiff
3-factor model - 5-factor model
Study 1 343.47*** 7
Study 2 727.54*** 7
Study 3 1469.13*** 7
Partial 1-factor suppression model - Partial predicted 3-factor suppression model
Study 1 18.53*** 3
Study 2 25.72*** 3
Study 3 53.97*** 3
Partial alternative 2-factor model - Partial predicted 4-factor model
Study 1 332.88*** 5
Study 2 717.44*** 5
Study 3 944.63*** 5

***p < .001.

We aimed to develop items assessing how ethical people believe it is to suppress politicized research and their support for research on politicized topics. Although we defined research suppression and support for research as different constructs, it is possible that the right-wing research suppression items hung together with the support for left-wing research items and the left-wing research suppression items hung together with the support for right-wing research items. To test whether the data supported the consideration of the left and right-wing research suppression and support for research items as separate constructs, we compared a four-factor model specifying the left and right-wing research suppression and support for research items on separate factors to a two-factor model in which we specified the right-wing research suppression and support for left-wing research items on one factor and the left-wing research suppression items and support for right-wing research items on another factor (the nonpartisan research suppression items were not included in this model). The partial predicted four-factor model had superior fit to the partial alternative two-factor model in all three studies (see Tables 2 and 4).

Because the predicted five-factor model had good overall model fit across all three studies and better model fit than the alternative models, the five subscales were retained to test the alternative hypotheses.

Descriptive Statistics

Table 5 presents means and SDs for each variable, and Tables S3, S4, and S5 (Supplementary Materials) report correlations among all variables included in each study. Overall, participants reported that it is unethical to suppress (nonpartisan, left-wing, and right-wing) research findings and moderate support for left and right-wing research (see Table 5).

Table 5

Means and Standard Deviations for Measures

Measure Study 1 (M, SD) Study 2 (M, SD) Study 3 (M, SD)
Ideology 4.36 1.45 4.68 1.50 4.75 1.55
Right-leaning 2.32 0.63 2.13 0.68 2.23 0.73
Moderate 4.10 0.31 3.99 0.31 4.09 0.29
Left-leaning 5.80 0.69 5.90 0.71 5.88 0.73
Nonpartisan research suppression 2.31 1.43 2.41 1.43 1.98 1.20
Left-wing research suppression 2.22 1.38 2.59 1.40 2.17 1.20
Right-wing research suppression 2.33 1.39 2.69 1.44 2.26 1.16
Support for left-wing research 4.94 1.31 4.85 1.29 5.09 1.27
Support for right-wing research 4.85 1.26 4.57 1.23 4.81 1.31
Political activism 3.69 1.72 3.53 1.63 3.62 1.75
Education 4.10 1.46 4.18 1.54 4.05 1.49
Age 34.19 13.36 33.94 12.76 35.02 14.15
Religiosity 3.76 2.10 3.47 2.12 3.30 2.20
Perceived question difficulty 2.97 1.76 2.81 1.71 2.21 1.40
Perceived honesty 6.02 1.02 5.80 1.35 6.18 1.13
General support for science 5.21 1.13 5.49 1.01
RWA -1.82 1.66

Attention Check

An attention check was included in Studies 2 and 3 to assess whether responses to the study measures differed between participants who appeared to be paying more versus less attention. In Study 2, 74% of the sample passed the attention check; in Study 3, 85% of the sample passed the attention check. When excluding participants who failed the attention check, one analysis dropped to nonsignificance (i.e., the analysis testing for differences in liberals’ and conservatives’ endorsement of left and right-wing research suppression in Study 2). However, the overall sample size was smaller in Study 2 than in Study 3, and a greater percentage of the sample failed the attention check in Study 2 than in Study 3. Therefore, excluding participants who failed the attention check more strongly reduced the power in Study 2 than in Study 3. In addition, participants who failed the attention check were more likely to identify as conservative than were those who passed the attention check in Study 2, t(198) = 2.03, p = .04, and thus excluding participants who failed the attention check may have also weakened the results of this study by reducing the number of conservatives in the sample (from 30 to 19). Participants who failed the attention check did not differ in ideology from those who passed the attention check in Study 3; in fact, the results appeared to be stronger when excluding participants who failed the attention check in this study. Because the analyses testing the alternative hypotheses produced the same overall pattern of results, regardless of whether participants who failed the attention check were included or not, all participants in each sample were included in the analyses.

Main Analyses

Ideology and Left-Wing vs. Right-Wing Research Suppression

To test the hypotheses concerning differences between liberals’ and conservatives’ responses to the left and right-wing research suppression measures, we performed a series of mixed-model ANOVAs, with research suppression (left-wing vs. right-wing research suppression) as a within-subjects factor and ideology (left-leaning vs. right-leaning vs. moderate participants) as a between-subjects factor.
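As an illustration only (the authors’ analyses were presumably run in SPSS), an equivalent mixed ANOVA with simple-effects follow-ups could be run in Python with the pingouin package, assuming a long-format data frame with hypothetical column and level names:

```python
import pingouin as pg

# long_df: one row per participant per suppression measure (hypothetical columns)
aov = pg.mixed_anova(
    data=long_df,
    dv="suppression",           # endorsement of research suppression (1-7)
    within="measure",           # "left" vs. "right" wing suppression
    between="ideology_group",   # left-leaning vs. moderate vs. right-leaning
    subject="participant_id",
)
print(aov)

# simple effects: paired t-tests within each ideology group
for group, sub in long_df.groupby("ideology_group"):
    wide = sub.pivot(index="participant_id", columns="measure", values="suppression")
    print(group, pg.ttest(wide["left"], wide["right"], paired=True))
```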

Study 1

In Study 1, there was a main effect for ideology, F(2, 120) = 3.31, p = .04, η = .07. Moderates (M = 2.66, SD = 1.44) reported that it is less unethical to withhold the publication of left and right-wing research findings than did left-leaning participants (M = 2.01, SD = 1.14); however, moderates’ and left-leaning participants’ endorsement of research suppression did not significantly differ from right-leaning participants’ overall endorsement (M = 2.10, SD = 1.32). The main effect for research suppression was nonsignificant, F(1, 120) = 1.87, p = .17, η = .03, but there was a research suppression x ideology interaction, F(2, 120) = 4.43, p = .01, η = .22. Consistent with the first left-wing bias hypothesis (Hypothesis 3a), whereas there was no difference in right-leaning participants’ endorsement of left-wing (M = 2.08, SD = 1.42) and right-wing research suppression (M = 2.12, SD = 1.33), t(27) = 0.25, p = .81, d = 0.05, left-leaning participants reported that it is less unethical to suppress right-wing (M = 2.18, SD = 1.33) than left-wing findings (M = 1.83, SD = 1.08), t(47) = 2.92, p = .005, d = 0.44. There was no significant difference in moderates’ endorsement of left (M = 2.71, SD = 1.49) and right-wing research suppression (M = 2.61, SD = 1.46), t(46) = -1.06, p = .29, d = 0.16 (see Table 6 and Figure 1).

Figure 1

Endorsement of withholding the publication of research findings as a function of research suppression (left-wing research suppression vs. right-wing research suppression) and ideology (left-leaning vs. right-leaning vs. moderate participants). Higher values represent stronger endorsement of withholding the publication of research findings. Error bars denote one standard error above and below the mean.

Table 6

Mean Scores on Measures for Left-Leaning, Moderate, and Right-Leaning Participants

Measure Statistic Study 1 (Left-leaning, Moderate, Right-leaning) Study 2 (Left-leaning, Moderate, Right-leaning) Study 3 (Left-leaning, Moderate, Right-leaning) Aggregate (Left-leaning, Moderate, Right-leaning)
Left-wing/right-wing research suppression^a
Left-wing M 1.83a 2.71a 2.08a 2.39a 2.78a 2.80a 2.06a 2.35a 2.24 2.12a 2.56a 2.34a
SD (1.08) (1.49) (1.42) (1.30) (1.42) (1.63) (1.11) (1.26) (1.33) (1.17) (1.37) (1.44)
Right-wing M 2.18b 2.61a 2.12a 2.60b 2.90a 2.49a 2.27b 2.45a 1.98 2.35b 2.62a 2.12b
SD (1.33) (1.46) (1.33) (1.35) (1.57) (1.43) (1.14) (1.24) (1.10) (1.23) (1.40) (1.24)
Left vs. right d 0.44 0.16 0.05 0.24 0.14 0.31 0.25 0.12 0.23 0.27 0.07 0.21
Nonpartisan research suppression^b
Nonpartisan M 2.05a 2.60a 2.27a 2.22a 2.59a 2.66a 1.92a 2.16a 1.90a 2.02a 2.39b 2.15ab
SD (1.24) (1.44) (1.68) (1.22) (1.58) (1.66) (1.16) (1.28) (1.18) (1.19) (1.42) (1.44)
Support for left-wing/right-wing research^a
Left-wing M 5.35a 4.91a 4.31a 5.22a 4.64a 4.09a 5.43a 4.93a 4.28a 5.36a 4.83a 4.24a
SD (1.17) (1.27) (1.38) (1.22) (1.20) (1.35) (1.09) (1.15) (1.51) (1.14) (1.19) (1.44)
Right-wing M 4.78b 4.87a 4.94b 4.58b 4.47a 4.73b 4.74b 4.77a 5.05b 4.70b 4.70b 4.95b
SD (1.37) (1.22) (1.17) (1.24) (1.12) (1.44) (1.27) (1.26) (1.48) (1.27) (1.21) (1.40)
Left vs. right d 0.44 0.05 0.56 0.49 0.18 0.38 0.63 0.17 0.60 0.56 0.14 0.53
General support for science^b
General support M 5.57a 4.96b 4.60b 5.81a 5.34b 4.73c 5.74a 5.19b 4.69c
SD (0.93) (1.13) (1.36) (0.68) (0.98) (1.38) (0.77) (1.06) (1.37)

Note. Scores on each measure ranged from 1 to 7. Those with ideology scores below 3.5 were identified as right-leaning, between 3.5 and 4.5 as moderate, and above 4.5 as left-leaning. Higher scores on the nonpartisan, left-wing, and right-wing research suppression measures indicate stronger endorsement of the ethicality of suppressing research findings to advance political goals. Higher scores on support for left-wing research, support for right-wing research, and general support for science represent greater support for research. Standard deviations are presented in parentheses. The aggregate scores represent mean scores on the research suppression and support for research measures across the three studies. Aggregate scores for general support for science represent mean scores across Studies 2 and 3. Effect size estimates (Cohen’s d) represent the strength of the difference between scores on the left and right-wing measures for left-leaning, moderate, and right-leaning participants.

^a Subscripts indicate which pairwise mean comparisons (between the left and right-wing measures) statistically differed (i.e., vertical comparisons, conducted separately for left-leaning, moderate, and right-leaning participants). ^b Subscripts indicate which means statistically differed among left-leaning, moderate, and right-leaning participants (i.e., horizontal comparisons).

Study 2

In Study 2, the main effects for ideology, F(2, 197) = 1.32, p = .27, η = .11, and research suppression, F(1, 197) = .01, p = .94, η = .00, were nonsignificant, but the interaction was significant, F(2, 197) = 3.86, p = .02, η = .06. In partial support of the first symmetric bias hypothesis (Hypothesis 1a), left-leaning participants reported that it is less unethical to suppress right-wing (M = 2.60, SD = 1.35) vs. left-wing findings (M = 2.39, SD = 1.30), t(100) = 2.38, p = .02, d = 0.24, whereas right-leaning participants tended to report that it is less unethical to suppress left-wing (M = 2.80, SD = 1.63) vs. right-wing findings (M = 2.49, SD = 1.43), t(29) = -1.66, p = .11, d = 0.31. There was no significant difference in moderates’ endorsement of left-wing (M = 2.78, SD = 1.42) and right-wing research suppression (M = 2.90, SD = 1.57), t(68) = 1.14, p = .26, d = 0.14.

To test the asymmetric bias hypotheses, we performed the linear contrast: [(Left-leaning participants’ right-wing research suppression – Left-leaning participants’ left-wing research suppression) – (Right-leaning participants’ left-wing research suppression – Right-leaning participants’ right-wing research suppression)]. This contrast was nonsignificant, F(1, 129) = 0.57, ns., indicating that left and right-leaning participants did not significantly differ in their endorsement of suppressing research findings to advance their political goals. Thus, these results provide support for the first symmetric bias hypothesis: left-leaning participants’ endorsement of right-wing over left-wing research suppression (2.60 - 2.39 = 0.21) was similar to right-leaning participants’ endorsement of left-wing over right-wing research suppression (2.80 - 2.49 = 0.31).
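Because each participant contributes a single left-right difference, this planned contrast amounts to comparing “own-side” difference scores between left- and right-leaning participants (moderates excluded), which matches the reported error degrees of freedom. A minimal sketch under that assumption, with hypothetical wide-format column names:

```python
import numpy as np
from scipy.stats import ttest_ind

# own-side bias: endorsement of suppressing the other side minus one's own side
df["partisan_bias"] = np.where(
    df["ideology_group"] == "left-leaning",
    df["right_wing_suppression"] - df["left_wing_suppression"],
    df["left_wing_suppression"] - df["right_wing_suppression"],
)

left = df.loc[df["ideology_group"] == "left-leaning", "partisan_bias"]
right = df.loc[df["ideology_group"] == "right-leaning", "partisan_bias"]
t, p = ttest_ind(left, right)   # t**2 corresponds to the F(1, n_left + n_right - 2) contrast
```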

Study 3

In Study 3, the ANOVA revealed no main effects for ideology, F(2, 398) = 2.06, p = .13, η = .10, or research suppression, F(1, 398) = 0.09, p = .76, η = .01, but a significant interaction between the two, F(2, 398) = 7.48, p = .001, η = .08. In partial support of the first symmetric bias hypothesis (Hypothesis 1a), left-leaning participants reported that it is less unethical to suppress right-wing (M = 2.27, SD = 1.14) vs. left-wing findings (M = 2.06, SD = 1.11), t(223) = 3.69, p < .001, d = 0.25, whereas right-leaning participants tended to report that it is less unethical to suppress left-wing (M = 2.24, SD = 1.33) vs. right-wing findings (M = 1.98, SD = 1.10), t(72) = -1.92, p = .059, d = 0.23. There was no significant difference in moderates’ endorsement of left-wing (M = 2.35, SD = 1.26) and right-wing research suppression (M = 2.45, SD = 1.24), t(103) = 1.24, p = .22, d = 0.12.

To test the asymmetric bias hypotheses, we performed the linear contrast: [(Left-leaning participants’ right-wing research suppression – Left-leaning participants’ left-wing research suppression) – (Right-leaning participants’ left-wing research suppression – Right-leaning participants’ right-wing research suppression)]. This contrast was nonsignificant, F(1, 295) = 0.33, ns., because left-leaning (2.27 - 2.06 = .21) and right-leaning (2.24 - 1.98 = 0.26) participants expressed nearly identical endorsement of the ethicality of suppressing research findings to advance their politics.

Aggregate

Given the varying power of the three studies, it is important to consider the overall pattern of findings across them when drawing conclusions. For the research suppression measures, Study 1 supported the left-wing bias hypothesis and Studies 2 and 3 supported the symmetric bias hypothesis. To examine the overall effects across all three studies, the data sets for each study were merged, and effect sizes for the difference between scores on the left and right-wing measures were calculated for left-leaning, moderate, and right-leaning participants (see Table 6). The aggregate data set contained 724 participants in total (131 right-leaning, 220 moderate, and 373 left-leaning participants).

Across the three studies, left and right-leaning participants expressed similar levels of partisan bias on the research suppression measures (see Table 6 and Figure 2), though these effects were relatively small (d = 0.21 for right-leaning and d = 0.27 for left-leaning participants). Although moderates reported that it is less unethical to withhold the publication of politicized research findings than did left and right-leaning participants, they expressed similar endorsement of left and right-wing research suppression (d = 0.07).

Figure 2

Aggregate scores on the research suppression and support for science measures across the three studies as a function of measure type (left vs. right-wing) and ideology (left-leaning vs. right-leaning vs. moderate participants). Error bars denote one standard error above and below the mean.

Political Activism

A general linear model (GLM) analysis was performed to test the possible moderating effect of political activism on partisan endorsement of research suppression; ideology and political activism were included as between-subjects factors and research suppression (right-wing vs. left-wing) was included as a within-subjects factor. Across the three studies, political activism did not interact with ideology in predicting differences in responses to the left and right-wing research suppression measures, F(2, 712) = 0.82, p = .44, η = .02.

Right-Wing Authoritarianism

A second GLM was performed to examine overall differences in responses to the left and right-wing research suppression measures as a function of RWA, along with a possible interactive effect between RWA and ideology in predicting responses. In this analysis, ideology and RWA were included as between-subject factors and research suppression (right-wing vs. left-wing) was included as a within-subjects factor. This analysis only included Study 3 participants because RWA was only measured in Study 3.

As noted above, ideology interacted with research suppression, F(2, 395) = 4.31, p = .01, η = .06. RWA also interacted with research suppression, F(1, 395) = 4.12, p = .04, η = .04. To explicate the RWA x research suppression interaction, a difference score variable (suppression difference) was computed by subtracting left-wing research suppression from right-wing research suppression; positive scores on this variable indicated believing it is less unethical to suppress right-wing than left-wing research, negative scores indicated believing it is less unethical to suppress left-wing than right-wing research, and 0 represented equal endorsement of left and right-wing research suppression. RWA was weakly negatively correlated with suppression difference, r = -.17, p = .001, indicating that higher RWA was associated with believing it is less unethical to suppress left-wing than right-wing research.
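A minimal sketch of the difference-score coding and correlation described here, with hypothetical column names:

```python
from scipy.stats import pearsonr

# positive = less unethical to suppress right-wing research; negative = less unethical
# to suppress left-wing research; 0 = equal endorsement of both
df["suppression_difference"] = df["right_wing_suppression"] - df["left_wing_suppression"]

r, p = pearsonr(df["rwa"], df["suppression_difference"])   # reported: r = -.17, p = .001
```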

There was also a significant three-way interaction among ideology, RWA, and research suppression, F(2, 395) = 5.10, p = .006, η = .06. Simple effects revealed a significant RWA x research suppression interaction among right-leaning, F(1, 71) = 6.83, p = .01, η = .29, but not moderate, F(1, 102) = 1.16, p = .28, η = .10, or left-leaning participants, F(1, 222) = 0.50, p = .48, η = .05. Among right-leaning participants, RWA was negatively correlated with suppression difference, r = -.30, p = .01, indicating that higher RWA was associated with believing it is less unethical to suppress left vs. right-wing research among right-leaning participants.

Although RWA was associated with stronger endorsement of left vs. right-wing research suppression, RWA was positively correlated with both left, r = .42, p < .001, and right-wing suppression, r = .30, p < .001; individuals higher in right-wing authoritarianism reported that it is less unethical to suppress both left and right-wing research findings than did individuals lower in RWA.

Ideology and Nonpartisan Research Suppression

For each study, a one-way ANOVA was performed as an exploratory analysis to test for differences in liberals’, conservatives’, and moderates’ endorsement of nonpartisan research suppression (i.e., endorsing withholding the publication of research findings for political reasons, regardless of the political stance at stake). This analysis was nonsignificant in Study 1, F(2, 120) = 1.76, p = .18, η = .17, Study 2, F(2, 197) = 1.95, p = .15, η = .14, and Study 3, F(2, 398) = 1.69, p = .19, η = .09 (for means, see Table 6). However, when the three data sets were analyzed together, the ANOVA was significant, F(2, 721) = 5.63, p = .004, η = .12. Moderates (M = 2.39, SD = 1.42) more strongly endorsed nonpartisan research suppression than did left-leaning participants (M = 2.02, SD = 1.19), whereas right-leaning participants’ endorsement (M = 2.15, SD = 1.44) did not significantly differ from that of either left-leaning or moderate participants. In addition, nonpartisan research suppression was positively associated with RWA, r = .42, p < .001, indicating that higher RWA was associated with believing it is less unethical to suppress politicized research findings, regardless of the political stance at stake.

Ideology and Support for Left-Wing vs. Support for Right-Wing Research

Mixed-model ANOVAs were also conducted to examine liberals’ and conservatives’ responses to the support for research measures. In these analyses, support for research (support for left vs. support for right-wing research) was included as a within-subjects factor and ideology (left-leaning vs. right-leaning vs. moderate participants) as a between-subjects factor.
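The sketch below shows one way such a 3 (ideology: between) x 2 (support measure: within) mixed-model ANOVA could be set up in Python with the pingouin package. It is a sketch under stated assumptions, not the authors' analysis code; the subject identifiers, column names, and ratings are hypothetical placeholders.

```python
# 3 (ideology, between) x 2 (support measure, within) mixed-model ANOVA sketch.
# All column names and values are hypothetical placeholders, not study data.
import pandas as pd
import pingouin as pg

wide = pd.DataFrame({
    "subject":       [1, 2, 3, 4, 5, 6],
    "ideology":      ["left", "left", "moderate", "moderate", "right", "right"],
    "support_left":  [5.4, 5.2, 4.9, 4.8, 4.2, 4.4],
    "support_right": [4.7, 4.9, 4.9, 4.8, 5.0, 5.1],
})

# Reshape to long format: one row per participant per support measure.
long_df = wide.melt(id_vars=["subject", "ideology"],
                    value_vars=["support_left", "support_right"],
                    var_name="measure", value_name="support")

aov = pg.mixed_anova(data=long_df, dv="support", within="measure",
                     subject="subject", between="ideology")
print(aov[["Source", "DF1", "DF2", "F", "p-unc", "np2"]])
```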

Study 1

In Study 1, the main effects for ideology, F(2, 120) = 1.31, p = .27, η = .13, and support for research, F(1, 120) = 0.00, p = .97, η = .00, were nonsignificant. However, there was a significant ideology x support for research interaction, F(2, 120) = 10.16, p < .001, η = .18. In support of the second symmetric bias hypothesis (Hypothesis 1b), left-leaning participants supported left-wing research (M = 5.35, SD = 1.17) over right-wing research (M = 4.78, SD = 1.37), t(47) = 3.01, p = .004, d = 0.44, and right-leaning participants supported right-wing research (M = 4.94, SD = 1.17) over left-wing research (M = 4.31, SD = 1.38), t(27) = -2.89, p = .007, d = 0.56 (see Table 6 and Figure 3). Moderates did not differ in their support for left (M = 4.91, SD = 1.27) and right-wing research (M = 4.87, SD = 1.22), t(46) = 0.31, p = .76, d = 0.05.

Figure 3

Support for research as a function of support for research measure (support for left-wing research vs. support for right-wing research) and ideology (left-leaning vs. right-leaning vs. moderate participants). Higher values represent stronger support for research. Error bars denote one standard error above and below the mean.

Although the ANOVA revealed partisan support for research among both left and right-leaning participants, one group may have exhibited more partisan support for research than the other. The asymmetric bias hypotheses were tested using the linear contrast [(Left-leaning participants’ support for left-wing research – Left-leaning participants’ support for right-wing research) – (Right-leaning participants’ support for right-wing research – Right-leaning participants’ support for left-wing research)]. This contrast was nonsignificant, F(1, 75) = 0.10, ns. Left-leaning (5.35 - 4.78 = 0.57) and right-leaning (4.94 - 4.31 = 0.63) participants did not significantly differ in the extent to which they favored research supporting their politics. Overall, therefore, Study 1 demonstrated symmetric partisan support for research.
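Written out with the Study 1 means reported above, the contrast reduces to a single difference of differences (the symbol Δ is introduced here purely for illustration):

```latex
% Asymmetric-bias contrast evaluated with the Study 1 means
% (requires amsmath for \underbrace)
\[
\Delta = \underbrace{(5.35 - 4.78)}_{\text{left-leaning}}
       - \underbrace{(4.94 - 4.31)}_{\text{right-leaning}}
       = 0.57 - 0.63 = -0.06 .
\]
```

The resulting value is close to zero, consistent with the nonsignificant contrast reported above.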

Study 2

In Study 2, there was a main effect for ideology, F(2, 197) = 3.56, p = .03, η = .16. Left-leaning participants tended to report greater overall support for left and right-wing research (M = 4.90, SD = 1.04) than did moderates (M = 4.55, SD = 1.06) and right-leaning participants (M = 4.41, SD = 1.12); however, none of the pairwise comparisons reached statistical significance. The main effect for support for research was nonsignificant, F(1, 197) = 0.27, p = .60, η = .02, but there was an ideology x support for research interaction, F(2, 197) = 12.35, p < .001, η = .18. Consistent with the second symmetric bias hypothesis (Hypothesis 1b), right-leaning participants reported greater support for right-wing (M = 4.73, SD = 1.44) vs. left-wing research (M = 4.09, SD = 1.35), t(29) = -2.12, p = .04, d = 0.38, whereas left-leaning participants reported greater support for left-wing (M = 5.22, SD = 1.22) vs. right-wing research (M = 4.58, SD = 1.24), t(100) = 4.88, p < .001, d = 0.49. There was also a tendency for moderates to report greater support for left-wing (M = 4.64, SD = 1.20) vs. right-wing research (M = 4.47, SD = 1.12), but this difference did not reach statistical significance, t(68) = 1.49, p = .14, d = 0.18.

The asymmetric bias hypotheses were tested using the linear contrast: [(Left-leaning participants’ support for left-wing research – Left-leaning participants’ support for right-wing research) – (Right-leaning participants’ support for right-wing research – Right-leaning participants’ support for left-wing research)]. This contrast was nonsignificant, F(1, 129) = 0.00, ns. Left-leaning (5.22 - 4.58 = 0.64) and right-leaning (4.73 - 4.09 = 0.64) participants expressed identical preferences for research favoring their politics, supporting the second symmetric bias hypothesis.

Study 3

In Study 3, the ANOVA revealed a main effect for ideology, F(2, 398) = 4.46, p = .01, η = .13. Left-leaning participants reported stronger overall support for both left and right-wing research (M = 5.09, SD = 1.05) than did right-leaning participants (M = 4.66, SD = 1.35); moderates’ support for politicized research (M = 4.85, SD = 1.11) did not significantly differ from either group. The main effect for support for research was nonsignificant, F(1, 398) = 0.16, p = .69, η = .01, but there was an ideology x support for research interaction, F(2, 398) = 49.29, p < .001, η = .21. In support of the second symmetric bias hypothesis (Hypothesis 1b), right-leaning participants reported greater support for right (M = 5.05, SD = 1.48) vs. left-wing research (M = 4.28, SD = 1.51), t(72) = -5.10, p < .001, d = 0.60, whereas left-leaning participants reported greater support for left (M = 5.43, SD = 1.09) vs. right-wing research (M = 4.74, SD = 1.27), t(223) = 9.37, p < .001, d = 0.63. Moderates also tended to report slightly greater support for left (M = 4.93, SD = 1.15) vs. right-wing research (M = 4.77, SD = 1.26), t(103) = 1.68, p = .10, d = 0.17.

The asymmetric bias hypotheses were tested using the linear contrast: [(Left-leaning participants’ support for left-wing research – Left-leaning participants’ support for right-wing research) – (Right-leaning participants’ support for right-wing research – Right-leaning participants’ support for left-wing research)]. This contrast was nonsignificant, F(1, 295) = 0.58, ns.; left-leaning (5.43 - 4.74 = 0.69) and right-leaning (5.05 - 4.28 = 0.77) participants expressed similar levels of support for research aligned with their politics.

Aggregate

All three studies supported the symmetric bias hypothesis for the support for research measures (see Table 6 and Figure 2). The differences in support for left and right-wing research were moderate in strength for both left-leaning (d = 0.56) and right-leaning (d = 0.53) participants. Moderates tended to support left-wing research over right-wing research, but the strength of the difference was small (d = 0.14).

Political activism

To test whether participants higher in political activism reported greater partisan support for science than those lower in political activism, a GLM analysis was performed. Ideology (left-leaning vs. moderate vs. right-leaning) and political activism were included as between-subjects factors and support for research (support for left-wing vs. support for right-wing research) was included as a within-subjects factor. This analysis yielded a three-way interaction among ideology, support for research, and political activism, F(2, 712) = 7.31, p = .001, η = .07.

To explicate the interaction, a difference score variable, support difference, was computed by subtracting support for right-wing research from support for left-wing research; positive scores on this variable indicated greater support for left-wing than right-wing research, negative scores indicated greater support for right-wing than left-wing research, and 0 represented equal support for left and right-wing research. Correlations were then examined between political activism and support difference for left-leaning, moderate, and right-leaning participants. Political activism was negatively associated with support difference for right-leaning participants, r(131) = -.19, p = .03, positively associated with support difference for left-leaning participants, r(368) = .14, p = .007, and unrelated to support difference for moderates, r(219) = -.05, p = .44. Consistent with research suggesting that individuals with greater knowledge and expertise are more susceptible to motivated reasoning, greater political activism was associated with stronger support for right-wing than left-wing research among right-leaning participants and with stronger support for left-wing than right-wing research among left-leaning participants.
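The per-group correlations described above could be computed along the lines of the sketch below. The function and its column names are hypothetical and assume one row per participant; this is an illustration, not the authors' analysis code.

```python
# Correlations between political activism and the support difference score,
# computed separately within each ideology group (hypothetical column names).
import pandas as pd
from scipy.stats import pearsonr

def activism_bias_correlations(df: pd.DataFrame) -> pd.DataFrame:
    # Expects one row per participant with columns: ideology_group,
    # activism, support_left, support_right.
    d = df.assign(support_diff=df["support_left"] - df["support_right"])
    rows = []
    for group, sub in d.groupby("ideology_group"):
        r, p = pearsonr(sub["activism"], sub["support_diff"])
        rows.append({"group": group, "df": len(sub) - 2, "r": r, "p": p})
    return pd.DataFrame(rows)
```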

RWA

A GLM was performed to examine overall differences in responses to the support for left and right-wing research measures as a function of RWA, along with the possible interactive effect between RWA and ideology in predicting responses. In this analysis, ideology and RWA were included as between-subjects factors and support for research (right-wing vs. left-wing) was included as a within-subjects factor. This analysis only included Study 3 participants because RWA was only measured in Study 3.

As noted above, ideology interacted with support for research, F(2, 395) = 8.07, p < .001, η = .08. RWA also interacted with support for research, F(1, 395) = 42.81, p < .001, η = .14. To explicate the interaction, the correlation between RWA and support difference (support for left-wing research – support for right-wing research) was computed. RWA was strongly negatively correlated with support difference, r = -.51, p < .001; higher RWA was associated with stronger support for right over left-wing research. (The simple correlations between RWA and support for left-wing research, r = -.49, p < .001, and RWA and support for right-wing research, r = .00, indicate that RWA was associated with lower support for left-wing research but was unrelated to support for right-wing research; see Tables S3 and S4, Supplementary Materials). The three-way interaction among ideology, RWA, and support for research was nonsignificant, F(2, 395) = 0.69, p = .50, η = .02.

Ideology and General Support for Science

In both studies that assessed general support for science, ideology was positively correlated with general support for science, r’s > .40, p’s < .001, indicating that left-leaning participants were more supportive of scientific research in general than were right-leaning participants. RWA was negatively related to general support for science, r = -.49, p < .001, indicating that higher RWA was associated with lower general support for science.

Correlations Among Measures

Surprisingly, across all three studies, nonpartisan, left-wing, and right-wing research suppression were all strongly positively correlated with each other, r’s ≥ .69, p’s < .001 (see Tables S3-S5, Supplementary Materials). These findings suggest that there are powerful individual differences in willingness to endorse research suppression, regardless of the political implications of the findings. In fact, considering the pattern of low means on the research suppression variables (see Table 5), it appears that most participants viewed withholding the publication of research findings as unethical. Similarly, in all three studies, support for right-wing research was positively correlated with support for left-wing research, r’s > .44, p’s < .001, suggesting that, to a large extent, participants' support for left and right-wing research co-occurred regardless of the political implications of the research. Across the three studies, the research suppression measures were negatively correlated with the support for research measures (see Tables S3-S5, Supplementary Materials), indicating that participants who opposed suppressing politicized findings more strongly supported (both left and right-wing) research.
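These pairwise relationships amount to a correlation matrix over the five primary measures. A minimal sketch of how such a matrix could be produced is given below, again with hypothetical column names and assuming a per-participant DataFrame like those in the earlier sketches; it is not the authors' analysis code.

```python
# Correlation matrix among the three suppression measures and the two
# support-for-research measures (hypothetical column names).
import pandas as pd

def measure_correlations(df: pd.DataFrame) -> pd.DataFrame:
    # df: one row per participant containing the five primary measures.
    cols = ["nonpartisan_suppression", "lw_suppression", "rw_suppression",
            "support_left", "support_right"]
    return df[cols].corr(method="pearson").round(2)
```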

The strong correlations among the primary measures suggest that unmeasured variables account for these relationships. Education does not appear to explain them (see Tables S3-S5, Supplementary Materials), but in Study 3, RWA was moderately correlated with all three research suppression measures, more strongly than was general support for science (see Table S5, Supplementary Materials). For further discussion of the relationships among the study measures and other demographic variables measured in this research, see the Supplementary Materials.

General Discussion

Endorsement of the Advancement of Scientific Knowledge, Irrespective of the Outcome

The present research investigated how strongly people endorse withholding the publication of research findings and support the conduct of research consistent and inconsistent with their political views. Across three studies, participants generally believed that it is unethical to suppress politicized research findings. Beliefs about the (un)ethicality of suppressing left-wing research were more strongly correlated with beliefs about the (un)ethicality of suppressing right-wing research than with political ideology. That is, participants’ self-reported beliefs about how ethical it is to suppress politicized research findings influenced their responses to a greater extent than did political biases. Similarly, participants’ support for left-wing research was strongly correlated with their support for right-wing research; those who reported stronger support for research on left-wing topics also reported stronger support for research on right-wing topics. In other words, participants tended to support the conduct and dissemination of politicized research, irrespective of the outcome. In addition, participants who more strongly opposed the suppression of politicized findings more strongly supported the conduct of politicized research.

Symmetric Political Bias

Although the results suggest that people support the conduct and dissemination of politicized research, irrespective of the outcome, they also provide support for the symmetric bias hypotheses. Overall, liberals and conservatives reported that it is less unethical to suppress research findings that oppose vs. support their views and expressed stronger support for research on topics consistent with their beliefs (see Table 6). Tetlock (1986) has argued that in tradeoff situations, the relative importance of an individual’s conflicting values determines which value he or she will sacrifice. Our results suggest that, explicitly, people generally uphold the advancement of scientific knowledge and objectivity over political motives and goals, though their values do influence the extent to which they oppose suppressing research findings for political reasons.

Drawing together multiple areas of research, our findings are consistent with evidence that liberals and conservatives value fairness equally (Graham et al., 2009), but that their moral judgments are influenced by their political motivations, such that both groups display similar levels of ideological bias in their reasoning (Crawford, 2012; Kahan et al., 2013; Lord et al., 1979; Taber & Lodge, 2006; Uhlmann et al., 2009). Furthermore, previous research has shown that individuals’ values bias their evaluation of research, such that people find research supporting their beliefs more convincing than research challenging their views (see Kunda, 1990, for a review). Our findings suggest that people’s values not only bias their evaluation of research but also influence the research they believe is important to conduct and disseminate. Politically active participants showed greater partisan support for science than less politically active participants, suggesting that politically active individuals may weigh the importance of advancing political goals relative to advancing science differently than do their less politically active counterparts.

Moderates’ Endorsement of Politicized Research Suppression

Although the main predictions in this research pertained to liberals and conservatives, an interesting pattern of results was observed among moderates. Moderates reported that it is equally unethical to suppress left-wing and right-wing research, but they believed it is less unethical to suppress politicized research findings overall than did liberals and conservatives. Participants classified as moderate likely included individuals with a range of political views, as research suggests that moderates vary in their views on social and economic issues and their degree of partisanship and engagement (Hawkins & Nosek, 2012; Klar, 2014a, 2014b). Moderates in this research were significantly less politically active (M = 3.09, SD = 1.63) than left (M = 3.81, SD = 1.67) and right-leaning (M = 3.92, SD = 1.77) participants, F(2, 715) = 15.47, p < .001. Moderate participants may have been especially likely to believe that politicized findings are distorted by researchers’ views, and that disseminating such findings will only increase political polarization and conflict, rather than advance scientific knowledge.

Conservatism and Support for Science

Although liberals and conservatives demonstrated similar levels of partisan endorsement of research suppression and partisan support for research, conservatives in this research reported lower overall support for science than did liberals, consistent with public opinion polls (e.g., Gauchat, 2012). Future research is necessary to determine whether conservatives are less supportive than liberals of scientific research because there is something about a conservative ideology that conflicts with the philosophy of science, because conservatives see scientists as liberals with liberal agendas, or because some areas of science have a history of favoring liberal questions (Redding, 2001) and discriminating against conservative theories and ideas (Inbar & Lammers, 2012; Stevens et al., 2017).

The results for right-wing authoritarianism were similar to the results for ideology in that RWA, like conservatism, was associated with stronger endorsement of left vs. right-wing research suppression, stronger support for right vs. left-wing research (though RWA was unrelated to support for right-wing research itself), and lower support for science in general. However, partisan endorsement of research suppression was observed more strongly among conservatives higher vs. lower in RWA. Moreover, unlike ideology, higher RWA was associated with more strongly endorsing the ethicality of suppressing left-wing, right-wing, and nonpartisan research findings. Because participants higher in RWA did not strongly support politicized or non-politicized research, these individuals may not have viewed research suppression as being as unethical as other participants did.

Limitations and Directions for Future Research

Although this research provides new insights into the nature of laypeople’s support for science, certain limitations qualify those insights. In our samples, there was an unequal distribution of liberals and conservatives. The distribution was skewed toward the political left (see Table 1 and Table S1, Supplementary Materials), as are most Mechanical Turk samples compared to national probability samples (Berinsky et al., 2012). Although the liberals and conservatives in our samples reported similar levels of extremity in their ideological views (see Table 5), there were two to three times the number of liberals as conservatives in each study, and thus there were more liberals with extreme views than conservatives with extreme views. Mechanical Turk samples are more politically and ethnically diverse than typical college student samples (Buhrmester, Kwang, & Gosling, 2011) and other convenience samples (Berinsky, Huber, & Lenz, 2012), and produce reliable results in political science (Berinsky et al., 2012) and judgment and decision-making research (Paolacci, Chandler, & Ipeirotis, 2010). However, it is unclear whether liberals and conservatives would show similar levels of bias in samples containing equal representation of partisans with extreme views. In addition, because MTurk samples likely have greater exposure to scientific research than the general public, and our study was advertised as a survey on science and politics, participants in the present research may hold different attitudes toward science and politics than the general population. In order to draw broadly generalizable conclusions regarding liberals’ and conservatives’ support for politicized research, future research should employ more representative samples. Even so, the present findings are consistent with research on representative samples showing that conservatives distrust science more than liberals (Gauchat, 2012).

Our samples were also limited to the context of U.S. politics. Although there is some consistency in liberal and conservative beliefs across cultures (Graham et al., 2009), it is unclear whether and how the results would differ across countries. To examine the generalizability of the findings, our questions could be adapted to reflect appropriate partisan issues in different cultures and subcultures. Across cultures, people with more extreme ideological views may show greater partisan endorsement of research suppression and partisan support for research. Indeed, the motivated reasoning literature suggests that people with stronger beliefs (Miller et al., 1993; Zuwerink Jacks & Devine, 1996; Zuwerink Jacks & Devine, 2000) and those with more knowledge and expertise on a topic (Kahan, 2013; Liu, 2016) may be especially susceptible to motivated reasoning biases. Although the present research found that political activism was associated with greater partisan support for science but not partisan endorsement of research suppression, it is possible that ideological extremity would be a stronger predictor of partisan endorsement of research suppression. To test this question, future research should compare highly partisan and less partisan groups.

Additionally, it is important to note that, to develop the items, we inductively generated lists of topics that researchers may study and on which liberals and conservatives commonly hold different stances. The lists were not comprehensive lists of all the issues on which liberals and conservatives differ, and because of the within-subjects nature of the study design, a different set of issues was used for the left-wing items than for the right-wing items. Most questions on the left-wing and right-wing research suppression and support for research measures pertained to social issues (though some items did capture issues with economic aspects, such as welfare, healthcare, and taxes). Because political psychologists sometimes distinguish between economic and social liberalism and conservatism (e.g., Crawford, Jussim, Cain, & Cohen, 2013; Duckitt, 2006), future research may benefit from extending the present research to investigate endorsement of research suppression to promote economic vs. social political goals. Moreover, replicating the present findings using a between-subjects design (i.e., one in which participants receive only one of two sets of identically constructed left and right-wing items) is critical to drawing strong conclusions regarding the symmetrical degree of bias across party lines.

Another limitation to this research was the direct nature in which the questions were asked. Participants may report that it is unethical to withhold the publication of research findings and that they support research regardless of the political implications, not because they truly hold these beliefs, but because of social desirability biases. Participants may have responded in the way they believed was appropriate, even though they actually more strongly supported withholding the publication of ideologically threatening research findings. Indeed, participants knew they were participating in a research study and might have inferred that the researchers conducting the study believe it is unethical to withhold the dissemination of research findings. Mechanical Turk users may have been especially inclined to provide socially desirable responses as they tend to pay more attention to study tasks than do in-lab participants (Berinsky et al., 2012). Alternatively, people may in fact believe that withholding research findings for politics is unethical, but their explicit beliefs may not always correspond to their behavior. Certainly, it is easier to assert that withholding research findings is unethical when asked directly than it is to remain objective when actually evaluating research decisions. In previous studies where participants evaluated research reports, participants found research supporting their political views more convincing and methodologically sound than research opposing their views (e.g., Lord et al., 1979; Munro & Ditto, 1997). Thus, although people may generally believe that research suppression is unethical, more subtle attempts to assess political bias indicate that people’s politics sometimes do compromise their judgments.

Because the goal of this research was to examine whether people explicitly endorse withholding the publication of politicized research findings and support the conduct of research on politicized topics, the direct nature of the questions was intentional. It enabled an assessment of whether people explicitly endorse the conduct and dissemination of politicized research, a question that, to date, has not been investigated directly. These measures revealed that participants generally endorse the conduct and dissemination of politicized research, regardless of whether the topic or results favored their political group, but that they endorse the conduct of research on ideologically congruent topics more than ideologically incongruent topics and oppose withholding the dissemination of ideologically congruent findings more than ideologically incongruent findings. Thus, juxtaposing the present results with prior research suggests that people’s political biases influence their support for science subtly and unintentionally rather than explicitly and deliberatively.

Conclusion

The present research provides several unique contributions to understanding perceptions of science, ethical decision-making, motivated reasoning, and political psychology. Whereas previous research has shown that people distrust social science research (Lilienfeld, 2012; YouGov, 2013) and discount science’s ability to provide answers to a question when exposed to research findings on that topic that challenge their views (Munro, 2010), the present research sought to determine whether people believe politicized research should be conducted and disseminated at all, and whether their support for politicized research varies as a function of whether the topics support vs. oppose their political beliefs. In general, participants reported that it is unethical for scientists to withhold the publication of politicized findings and that politicized research is important to conduct, irrespective of whether the findings or topics favored left or right-wing politics. Nonetheless, laypeople’s own political values, motives, and goals influenced their judgments, even on our questionnaires, which were quite explicit in asking people to consider whether particular research findings should be suppressed and whether particular topics are worthy of investigation. Taken together, our findings suggest that laypeople's values influence how they perceive research programs, findings, and decisions, even though they seem to know that they should not.

Notes

i) We use the term bias to refer to judgments regarding the ethicality of suppressing left and right-wing research and the importance of conducting left and right-wing research that favor one’s political group.

ii) 14 additional participants completed part of Study 1 but dropped out before reporting their political ideology and thus were excluded from the analyses.

iii) 29 additional participants completed part of Study 2 but dropped out before reporting their ideology and thus were excluded from the analyses.

iv) 36 additional participants completed part of Study 3 but dropped out before reporting their ideology and thus were excluded from the analyses.

v) However, the absence of even small non-significant trends in the data supporting either asymmetric bias hypothesis suggests that the lack of power to detect small effects is not a major concern.

vi) The same pattern of results emerged when using the single political orientation item as a measure of ideology as when using this composite measure of ideology.

vii) Gender and age differences were examined as exploratory analyses and are reported in the Supplementary Materials.

viii) Because the main hypotheses pertained to differences between the left and right-wing research suppression measures, the nonpartisan research suppression measure was not included in these analyses. However, the mixed-model ANOVA including the nonpartisan measure is available in the Supplementary Materials for interested readers.

ix) Participants with ideology scores below 3.5 were identified as right-leaning, between 3.5 and 4.5 as moderate, and above 4.5 as left-leaning.

Funding

This research was partially funded by the Aresty Research Center at Rutgers University.

Competing Interests

The authors have declared that no competing interests exist.

Acknowledgments

The authors would like to thank Urvashi Shukla for her assistance with this research and the reviewers for their invaluable feedback on this work.

References

  • Altemeyer, B. (1988). Enemies of freedom: Understanding right-wing authoritarianism. San Francisco, CA, USA: Jossey-Bass.

  • Altemeyer, B. (2006). The authoritarians. Winnipeg, MB, Canada: University of Manitoba.

  • Anglin, S. M. (2016). The psychology of science: Motivated processing of scientific evidence, awareness, and consequences (Unpublished doctoral dissertation). Rutgers University, New Brunswick, NJ, USA.

  • Berinsky, A. J., Huber, G. A., & Lenz, G. S. (2012). Evaluating online labor markets for experimental research: Amazon.com’s Mechanical Turk. Political Analysis, 20, 351-368. https://doi.org/10.1093/pan/mpr057

  • Buhrmester, M., Kwang, T., & Gosling, S. D. (2011). Amazon’s Mechanical Turk: A new source of inexpensive, yet high-quality, data? Perspectives on Psychological Science, 6, 3-5. https://doi.org/10.1177/1745691610393980

  • Cornelis, I., & Van Hiel, A. (2006). The impact of cognitive styles on authoritarianism based conservatism and racism. Basic and Applied Social Psychology, 28, 37-50. https://doi.org/10.1207/s15324834basp2801_4

  • Crawford, J. T. (2012). The ideologically objectionable premise model: Predicting biased political judgments on the left and right. Journal of Experimental Social Psychology, 48, 138-151. https://doi.org/10.1016/j.jesp.2011.10.004

  • Crawford, J. T., Jussim, L., Cain, T. R., & Cohen, F. (2013). Right-wing authoritarianism and social dominance orientation differentially predict biased evaluations of media reports. Journal of Applied Social Psychology, 43, 163-174. https://doi.org/10.1111/j.1559-1816.2012.00990.x

  • Crowson, H. M., Thoma, S. J., & Hestevold, N. (2005). Is political conservatism synonymous with authoritarianism? The Journal of Social Psychology, 145, 571-592. https://doi.org/10.3200/SOCP.145.5.571-592

  • Ditto, P., Clark, C., Liu, B., Wojcik, S., Chen, E., Grady, R., & Zinger, J. (2016, January). At least bias is bipartisan: A meta-analytic comparison of selective interpretation bias in liberals and conservatives. Paper presented at the Society for Personality and Social Psychology Convention, San Diego, CA, USA.

  • Ditto, P. H., Scepansky, J. A., Munro, G. D., Apanovitch, A. M., & Lockhart, L. K. (1998). Motivated sensitivity to preference-inconsistent information. Journal of Personality and Social Psychology, 75, 53-69. https://doi.org/10.1037/0022-3514.75.1.53

  • Ditto, P. H., & Lopez, D. F. (1992). Motivated skepticism: Use of differential decision criteria for preferred and nonpreferred conclusions. Journal of Personality and Social Psychology, 63, 568-584. https://doi.org/10.1037/0022-3514.63.4.568

  • Duarte, J. L., Crawford, J. T., Stern, C., Haidt, J., Jussim, L., & Tetlock, P. E. (2015). Political diversity will improve social psychological science. Behavioral and Brain Sciences, 38, Article e130.

  • Duckitt, J. (2006). Differential effects of right wing authoritarianism and social dominance orientation on outgroup attitudes and their mediation by threat from competitiveness to outgroups. Personality and Social Psychology Bulletin, 32, 684-696. https://doi.org/10.1177/0146167205284282

  • Eagly, A. H. (1995). The science and politics of comparing women and men. The American Psychologist, 50, 145-158. https://doi.org/10.1037/0003-066X.50.3.145

  • Edwards, K., & Smith, E. E. (1996). A disconfirmation bias in the evaluation of arguments. Journal of Personality and Social Psychology, 71, 5-24. https://doi.org/10.1037/0022-3514.71.1.5

  • Fischhoff, B., & Beyth-Marom, R. (1983). Hypothesis evaluation from a Bayesian perspective. Psychological Review, 90, 239-260. https://doi.org/10.1037/0033-295X.90.3.239

  • Funk, C., & Rainie, L. (2015). Public and scientists’ views on science and society. Pew Research Center. Retrieved from http://www.pewinternet.org/2015/01/29/public-and-scientists-views-on-science-and-society

  • Gauchat, G. (2012). Politicization of science in the public sphere: A study of public trust in the United States, 1974 to 2010. American Sociological Review, 77, 167-187. https://doi.org/10.1177/0003122412438225

  • Graham, J., Haidt, J., & Nosek, B. A. (2009). Liberals and conservatives rely on different sets of moral foundations. Journal of Personality and Social Psychology, 96, 1029-1046. https://doi.org/10.1037/a0015141

  • Gross, N., & Simmons, S. (2007). The social and political views of American professors. Unpublished manuscript, Department of Sociology, Harvard University, Cambridge, MA, USA.

  • Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review, 108, 814-834. https://doi.org/10.1037/0033-295X.108.4.814

  • Hart, W., Albarracín, D., Eagly, A., Brechan, I., Lindberg, M. J., & Merrill, L. (2009). Feeling validated versus being correct: A meta-analysis of selective exposure to information. Psychological Bulletin, 135, 555-588. https://doi.org/10.1037/a0015701

  • Hawkins, C. B., & Nosek, B. A. (2012). Motivated independence? Implicit party identity predicts political judgments among self-proclaimed independents. Personality and Social Psychology Bulletin, 38, 1437-1452. https://doi.org/10.1177/0146167212452313

  • Howard-Pitney, B., Borgida, E., & Omoto, A. M. (1986). Personal involvement: An examination of processing differences. Social Cognition, 4, 39-57. https://doi.org/10.1521/soco.1986.4.1.39

  • Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1-55. https://doi.org/10.1080/10705519909540118

  • Inbar, Y., & Lammers, J. (2012). Political diversity in social and personality psychology. Perspectives on Psychological Science, 7, 496-503. https://doi.org/10.1177/1745691612448792

  • Janda, L. H., England, K., Lovejoy, D., & Drury, K. (1998). Attitudes toward psychology relative to other disciplines. Professional Psychology, Research and Practice, 29, 140-143. https://doi.org/10.1037/0735-7028.29.2.140

  • Jussim, L., Crawford, J. T., Anglin, S. M., & Stevens, S. T. (2015). Ideological bias in social psychological research. In J. Forgas, K. Fiedler, & W. D. Crano (Eds.), Social psychology and politics (pp. 91-109). New York, NY, USA: Taylor & Francis.

  • Jussim, L., Crawford, J. T., Anglin, S. M., Stevens, S. T., & Duarte, J. L. (2016). Interpretations and methods: Towards a more effectively self-correcting social psychology. Journal of Experimental Social Psychology, 66, 116-133. https://doi.org/10.1016/j.jesp.2015.10.003

  • Jussim, L., Crawford, J. T., Stevens, S. T., & Anglin, S. M. (2016). The politics of social psychological science: Distortions in the social psychology of intergroup relations. In P. Valdesolo & J. Graham (Eds.), Social psychology of political polarization (pp. 165-196). New York, NY, USA: Routledge.

  • Jussim, L., Crawford, J. T., Stevens, S. T., Anglin, S. M., & Duarte, J. L. (2016). Can high moral purposes undermine scientific integrity? In J. Forgas, L. Jussim, & P. van Lange (Eds.), The social psychology of morality. New York, NY, USA: Routledge.

  • Kahan, D. M. (2013). Ideology, motivated reasoning, and cognitive reflection. Judgment and Decision Making, 8, 407-424.

  • Kahan, D. M., Peters, E., Dawson, E. C., & Slovic, P. (2013). Motivated numeracy and enlightened self-government. Unpublished manuscript. https://doi.org/10.2139/ssrn.2319992

  • Klaczynski, P. A. (2000). Motivated scientific reasoning biases, epistemological beliefs, and theory polarization: A two-process approach to adolescent cognition. Child Development, 71, 1347-1366. https://doi.org/10.1111/1467-8624.00232

  • Klaczynski, P. A., & Gordon, D. H. (1996). Self-serving influences on adolescents’ evaluations of belief-relevant evidence. Journal of Experimental Child Psychology, 62, 317-339. https://doi.org/10.1006/jecp.1996.0033

  • Klar, S. (2014a). Identity and engagement among political independents in America. Political Psychology, 35, 577-591. https://doi.org/10.1111/pops.12036

  • Klar, S. (2014b). A multidimensional study of ideological preferences and priorities among the American public. Public Opinion Quarterly, 78(S1), 344-359. https://doi.org/10.1093/poq/nfu010

  • Kleck, R. E., & Wheaton, J. (1967). Dogmatism and response to opinion-consistent and opinion-inconsistent information. Journal of Personality and Social Psychology, 5, 249-252. https://doi.org/10.1037/h0024197

  • Klein, D. B., & Stern, C. (2009). Groupthink in academia: Majoritarian departmental politics and the professional pyramid. Independent Review, 13, 585-600.

  • Kline, R. B. (2011). Principles and practice of structural equation modeling (3rd ed.). New York, NY, USA: The Guilford Press.

  • Koehler, J. J. (1993). The influence of prior beliefs on scientific judgments of evidence quality. Organizational Behavior and Human Decision Processes, 56, 28-55. https://doi.org/10.1006/obhd.1993.1044

  • Koriat, A., Lichtenstein, S., & Fischhoff, B. (1980). Reasons for confidence. Journal of Experimental Psychology: Human Learning and Memory, 6, 107-118.

  • Krosnick, J. A. (1999). Survey research. Annual Review of Psychology, 50, 537-567. https://doi.org/10.1146/annurev.psych.50.1.537

  • Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108, 480-498. https://doi.org/10.1037/0033-2909.108.3.480

  • Lilienfeld, S. O. (2012). Public skepticism of psychology: Why many people perceive the study of human behavior as unscientific. The American Psychologist, 67, 111-129. https://doi.org/10.1037/a0023963

  • Liu, B. S. (2016, January). What you think you know: Knowledge, attitudes, and biased evaluation of science. Poster presented at the Society for Personality and Social Psychology Convention, San Diego, CA, USA.

  • Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37, 2098-2109. https://doi.org/10.1037/0022-3514.37.11.2098

  • MacCoun, R. J. (1998). Biases in the interpretation and use of research results. Annual Review of Psychology, 49, 259-287. https://doi.org/10.1146/annurev.psych.49.1.259

  • MacCoun, R. J., & Paletz, S. (2009). Citizens’ perceptions of ideological bias in research on public policy controversies. Political Psychology, 30, 43-65. https://doi.org/10.1111/j.1467-9221.2008.00680.x

  • Miller, A. G., McHoskey, J. W., Bane, C. M., & Dowd, T. G. (1993). The attitude polarization phenomenon: Role of response measure, attitude extremity, and behavioral consequences of reported attitude change. Journal of Personality and Social Psychology, 64, 561-574. https://doi.org/10.1037/0022-3514.64.4.561

  • Munro, G. D. (2010). The scientific impotence excuse: Discounting belief-threatening scientific abstracts. Journal of Applied Social Psychology, 40, 579-600. https://doi.org/10.1111/j.1559-1816.2010.00588.x

  • Munro, G. D., & Ditto, P. H. (1997). Biased assimilation, attitude polarization, and affect in reactions to stereotype-relevant scientific information. Personality and Social Psychology Bulletin, 23, 636-653. https://doi.org/10.1177/0146167297236007

  • Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2, 175-220. https://doi.org/10.1037/1089-2680.2.2.175

  • Oppenheimer, D. M., Meyvis, T., & Davidenko, N. (2009). Instructional manipulation checks: Detecting satisficing to increase statistical power. Journal of Experimental Social Psychology, 45, 867-872. https://doi.org/10.1016/j.jesp.2009.03.009

  • Paolacci, G., Chandler, J., & Ipeirotis, P. G. (2010). Running experiments on Amazon Mechanical Turk. Judgment and Decision Making, 5, 411-419.

  • Paulhus, D. L. (1991). Measurement and control of response biases. In J. P. Robinson, P. R. Shaver, & L. S. Wrightsman (Eds.), Measures of personality and social psychological attitudes. San Diego, CA, USA: Academic Press.

  • Pomerantz, E. M., Chaiken, S., & Tordesillas, R. S. (1995). Attitude strength and resistance processes. Journal of Personality and Social Psychology, 69, 408-419. https://doi.org/10.1037/0022-3514.69.3.408

  • Popper, K. (1934). The logic of scientific discovery. New York, NY, USA: Routledge.

  • Redding, R. E. (2001). Sociopolitical diversity in psychology. The American Psychologist, 56, 205-215. https://doi.org/10.1037/0003-066X.56.3.205

  • Redding, R. E. (2013). Politicized science. Society, 50, 439-446. https://doi.org/10.1007/s12115-013-9686-5

  • Stevens, S. T., Jussim, L., Anglin, S. M., Lai, C., Contrada, R., Welch, C. A., . . . Campbell, W. K. (2017). Political exclusion and discrimination in social psychology: Lived experiences and solutions. Manuscript submitted for publication.

  • Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50, 755-769. https://doi.org/10.1111/j.1540-5907.2006.00214.x

  • Teo, T. (2012). Psychology is still a problematic science and the public knows it. The American Psychologist, 67, 807-808. https://doi.org/10.1037/a0030084

  • Tetlock, P. E. (1986). A value pluralism model of ideological reasoning. Journal of Personality and Social Psychology, 50, 819-827. https://doi.org/10.1037/0022-3514.50.4.819

  • Tetlock, P. E. (1994). Political psychology or politicized psychology: Is the road to scientific hell paved with good moral intentions? Political Psychology, 15, 509-529. https://doi.org/10.2307/3791569

  • Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131. https://doi.org/10.1126/science.185.4157.1124

  • Uhlmann, E. L., Pizarro, D. A., Tannenbaum, D., & Ditto, P. H. (2009). The motivated use of moral principles. Judgment and Decision Making, 4, 476-491.

  • YouGov. (2013). Americans have little faith in scientists, science journalists: Poll. Retrieved from http://big.assets.huffingtonpost.com/tabs_HP_science_20131209.pdf

  • Zuwerink Jacks, J. R., & Devine, P. G. (1996). Attitude importance and resistance to persuasion: It’s not just the thought that counts. Journal of Personality and Social Psychology, 70, 931-944. https://doi.org/10.1037/0022-3514.70.5.931

  • Zuwerink Jacks, J. R., & Devine, P. G. (2000). Attitude importance, forewarning of message content, and resistance to persuasion. Basic and Applied Social Psychology, 22, 19-29. https://doi.org/10.1207/S15324834BASP2201_3

Appendix: Research Suppression and Support for Science Measures

Nonpartisan research suppression

  1. How ethical is it for the editor of a scientific journal to refuse to publish a study because its results are inconsistent with the editor’s political views?

  2. How ethical is it for scientists to refuse to publish results from their research that are inconsistent with their political views?

  3. How ethical are researchers who selectively report only those results that are consistent with their political ideology?

Right-wing research suppression

  1. The editor of a scientific journal personally supports public welfare programs. How ethical is it for the editor to refuse to publish a study because its results demonstrate that public welfare programs decrease incentives to work?

  2. A group of researchers hypothesize that gender differences in income can be attributed to sexism. How ethical is it for them to refuse to publish the results from their study because the findings suggest that gender differences in income do not result from sexism?

  3. How ethical are researchers studying the benefits of teaching evolution in schools who omit evidence they find of the disadvantages from their research reports?

Left-wing research suppression

  1. The editor of a scientific journal personally supports capital punishment. How ethical is it for the editor to refuse to publish a study because the results suggest that capital punishment fails to deter crime?

  2. A group of researchers hypothesize that racial differences in income can be attributed to factors other than racism. How ethical is it for them to refuse to publish results from their study because the findings suggest that racism explains racial differences in income?

  3. How ethical are researchers studying the health benefits of religion who omit evidence of negative effects of religion on health from their research reports?

Support for left-wing research

  1. How important is it for a scientific journal to publish a research article on the need for alternative energy sources?

  2. How important is it for researchers to report evidence they find of the benefits of teaching evolution in schools?

  3. How useful is a research study investigating the pervasiveness of racism in society?

  4. How useful is a research study examining the negative outcomes associated with religiosity?

  5. How valuable are literature reviews on the benefits of stem cell research?

Support for right-wing research

  1. How important is it for a scientific journal to publish a research article demonstrating that racial profiling strengthens homeland security?

  2. How important is it for researchers to report data suggesting that lowering taxes increases incentives to work and save?

  3. How useful is a research study demonstrating that capital punishment successfully deters crime?

  4. How useful is a research study examining the disadvantages of instituting universal health care?

  5. How valuable are literature reviews on the disadvantages of public welfare programs?