Political Bullshit Receptivity and its Correlates: A Cross-Country Validation of the Concept

Philosophers have conceptualized bullshit as persuasive communication that has no regard for truth, knowledge, or evidence. In psychology, research has mostly investigated pseudo-profound bullshit, but no study has examined bullshit in the political context. In the present research, we operationalized political bullshit receptivity as endorsing vague political statements, slogans, and political bullshit programs. We investigated the relationship of these three measures with pseudo-profound bullshit, ideology (political ideology, support for neoliberalism), populism, and voting behavior. Three pre-registered studies conducted in early 2020 in the U.S., Serbia, and the Netherlands (total N = 534) yielded medium to high intercorrelations between political bullshit measures and pseudo-profound bullshit, and acceptable construct validity (hypothesized one-factor solution). A meta-analysis of these three studies showed that all political bullshit measures positively correlated with support for the free market, while only some positively correlated with social conservatism (political statements and programs), economic conservatism (programs), and populism (programs). In the U.S., the Netherlands, and all countries combined, increased receptivity to political bullshit was associated with a higher probability of voting for right-wing candidates/parties (e.g., Trump in the U.S.).


Background
Politicians can say many things. Sometimes they tell the truth, and sometimes they do not. But sometimes they say things that seem completely meaningless. Not in the sense that one would disagree with what they say, but rather that these statements mean nothing and everything. For example, consider politicians who state something about believing in people or country, or who promise to fight for a better future or dignity. What do such statements mean? Would you agree or disagree with them? It is hard to tell, because they are very abstract. Although ubiquitous, this very vague form of communication, which we term "political bullshit", has not been studied before.
Pennycook and colleagues (2015) pioneered the empirical study of bullshit receptivity, focusing on pseudo-profound bullshit: a collection of abstract words that have syntactical structure but no actual intended meaning (e.g., "Wholeness quiets infinite phenomena"). They demonstrated that endorsement of this type of bullshit is related to a lower analytical and a higher intuitive thinking style, higher religiosity, and paranormal beliefs. These findings were later replicated and extended, showing that people susceptible to pseudo-profound bullshit are also more prone to conspiratorial and schizotypal ideation (Hart & Graether, 2018), and to unusual experiences and magical thinking (Bainbridge, Quinlan, Mar, & Smillie, 2019).
Given that pseudo-profound bullshit might be only one type of bullshit, "[a] point on what could be considered a spectrum of bullshit" (Pennycook et al., 2015, p. 550), researchers have attempted to measure other instances of bullshit; for example, Evans and colleagues (2020) measured scientific bullshit (assembling random words from a physics glossary), while Čavojová, Brezina, and Jurkovič (2022) attempted to measure general bullshit (i.e., without abstract buzzwords). Endorsement of both types of bullshit was highly correlated with receptivity to pseudo-profound bullshit (rs = .60 and .83, respectively). Another empirical line of research on bullshit focused on bullshitting as an act (i.e., the producer) rather than on bullshit as a product (i.e., receptivity). For example, Gligorić and Vilotijević (2020a) demonstrated that people perceive pseudo-profound bullshit as more profound when it is attributed to a famous author (e.g., Einstein). Furthermore, Littrell, Risko, and Fugelsang (2021) focused on the frequency with which individuals engage in bullshitting, measuring two aspects of it: persuasive and evasive. While the former's function is to persuade, impress, and achieve the acceptance of others, the latter aims to avoid giving direct answers and to evade any thorough inquiry. However, the dominant approach remains the one focused on the reception of pseudo-profound bullshit. Importantly, previous research has investigated its relationship with political affinity.

Pseudo-Profound Bullshit and Political Affinities
Several studies have examined the relationship between political-ideological stances and receptivity to pseudo-profound bullshit. These showed that, in the U.S., neoliberals (i.e., supporters of free-market politics; Sterling, Jost, & Pennycook, 2016), Republican supporters, and conservatives are more susceptible to pseudo-profound bullshit (Pfattheicher & Schindler, 2016). The findings for free-market supporters and for economic and social conservatism were later replicated (Evans et al., 2020), though again in a U.S. sample. This relationship fits well within the ideological asymmetries between conservatives and liberals in epistemic motivation, whereby conservatives rely more on an intuitive thinking style, heuristics, and simple information processing (e.g., Jost, 2017).
However, the relationship between ideology and pseudo-profound bullshit has not been found consistently: for example, Hart and Graether (2018) found a positive relationship with conservatism only in their first study. Additionally, the relation with neoliberalism was somewhat lower in a Serbian sample (Gligorić & Vilotijević, 2020b). Indeed, it seems that the relationship between ideology and pseudo-profound bullshit is more complex: Nilsson, Erlandsson, and Västfjäll (2019) found a positive relationship with social conservatism within a Swedish sample but failed to replicate the one with economic ideological preference. In fact, they found that economic centrists (and even leftists) were most inclined to fall for pseudo-profound bullshit. One reason for the inconsistent findings might be that political ideology is conceptualized differently in the U.S. and Sweden. Firstly, in the U.S. (which has a two-party system vs. Sweden's multi-party one), the ideological scale is shifted to the right. Secondly, the ideological division in Sweden primarily concerns economic issues and less so social ones (Nilsson et al., 2020).
Although this is a promising line of research, pseudo-profound bullshit and political ideology are relatively distant psychological constructs. Therefore, it is reasonable to investigate this relationship using more political content, which is why we explored another type of bullshit: political bullshit.

Political Bullshit
The idea of bullshit in politics is not new. For example, Orwell (1946) noted how political speech is filled with obscure language, listing devices that foster such rhetoric (e.g., using pretentious diction and meaningless words). More recently, Frankfurt (1986/2005) saw politics (as well as advertising and public relations) as a prime realm where bullshit emerges, whereas Brandenburg (2006) emphasized that bullshit "…has become a structural property, a logical necessity of political communication" (p. 3). Political marketing, communication, and rhetoric (or a combination thereof) can give rise to political bullshit.
Even though research suggests that bullshit emerges in politics, a more important question is what political bullshit is, i.e., what it looks like. Sterling and colleagues (2016) suggested that bullshit in politics would not rely on abstract, pretentious words as pseudo-profound bullshit does, because "a politician might deliberately simplify language to broaden his or her appeal, stating something that is vague and platitudinous, like 'I believe in America!'" (p. 358). To this end, Brandenburg (2006, p. 4) illustrates political bullshit using part of a speech by a UK Labour Party politician, David Miliband: "… Our vision is compelling. Civic pride based on a new age of civic power, not for some of the people, but for all of the people, all of the time." Hammack and Pilecki's (2012) narrative framework is useful for understanding why political bullshit is used by politicians and which elements make people receptive to it. In this framework, political bullshit can be considered a form of narrative used to politically influence people's thinking and actions. According to this perspective, the language that politicians use serves the purpose of achieving social coherence for political narratives. The "stories" used by politicians help members of a society engage with a political program that is often long and complex, covering many different topics. Politicians and parties try to unite members of society by relying on more abstract and vague communication, which at the same time gives people a sense of continuity and personal coherence to deal with that complexity. This can be done, for example, by including in these narratives elements that are meaningful to the audience, such as references to social categories ("America", "all of the people", "Britishness"). We believe that, when this happens, such rhetoric becomes an instance of political bullshit to which people are receptive, because it meets the meaning-in-solidarity principle of narratives: it fulfills a person's need to see oneself as part of a larger collective with similar others within a particular time and place (Hammack & Pilecki, 2012). Political bullshit meets another principle of the narrative framework in that it can also put the mind in action. Political bullshit can motivate people to engage in action (i.e., voting for a political party) as it may appeal to the audience's emotions (e.g., pride, anger) and identification with political parties or issues.
Another stream of research on bullshit in politics concerns theoretical speculation focused on the Brexit campaign (e.g., Hopkin & Rosamond, 2018) and Donald Trump's presidency (Kristiansen & Kaussler, 2018). Similarly, the idea of political bullshit has also been used to discredit political opponents or movements (e.g., the Brexit campaign; Ball, 2017). Nevertheless, empirical research is needed to examine whether individuals at different points of the political spectrum use more political bullshit or are more susceptible to it. Overall, while research has discussed the idea of bullshit in politics, it has not clearly conceptualized or, more importantly, attempted to measure this concept.
We define political bullshit as statements of political content that intend to persuade voters but are so vague and broad that they are essentially meaningless. We agree with Pennycook and colleagues' (2016) notion that politicians' goal is often to "say something without saying anything [and] appear competent and respectful without concerning themselves with the truth" (p. 125). Therefore, political bullshit represents political communication which (1) has no regard for truth (Frankfurt, 1986/2005), (2) promotes one's goals and agenda, i.e., it is proactive in that it persuades people to vote (Brandenburg, 2006; Hammack & Pilecki, 2012; Reisch, 2006), and (3) is essentially unclarifiable (Cohen, 2002). To this end, we view the political communication that Chomsky described as prototypical bullshit communication.
Our definition of political bullshit also resonates well with the finding that bullshitting has two facets: persuasive and evasive (Littrell et al., 2021; see also the difference between proactive and defensive bullshit; Brandenburg, 2006). It is political communication in particular that aims to persuade (attract voters, gain support for an agenda), while politicians often keep communication abstract in order to evade directly answering questions that might lead to a decrease in support (Rogers & Norton, 2011). Conjoining these features, politicians might try to persuade using statements so vague that they cannot repel any voters (i.e., evading meaningful proposals). In this way, political bullshit departs from demagoguery, which includes lying (bullshit is indifferent to the truth) or making impossible promises (bullshit is too vague to promise anything meaningful). Political bullshit is also different from populism, as it does not imply any divisions between "us" and "them", although populists might sometimes engage in bullshitting (Hopkin & Rosamond, 2018).

The Present Research
In the present research, we constructed political statements that we deemed political bullshit, endorsement of which should be a measure of political bullshit receptivity. As argued previously, "it is not the understanding of the recipient of bullshit that makes something bullshit, it is the lack of concern (and perhaps even understanding) of the truth or meaning of statements by the one who utters it" (Pennycook et al., 2016, p. 125). Therefore, we constructed political statements that lack actual meaning and are indifferent to truth. The threefold aim of the present study was to: (1) theoretically outline the concept of political bullshit, (2) operationalize receptivity to political bullshit, and (3) explore the relation of political bullshit receptivity to other traits, ideology, and voting behavior.
We expected positive correlations between the endorsement of political bullshit statements, slogans, and bullshit political programs, since all three measures fulfill our criteria of political bullshit (convergent validity). We expected each of these measures to show a one-factor structure (factorial validity). We also included factual political statements that have the same format as political bullshit statements but should not be regarded as persuasive (see the difference between mundane and pseudo-profound statements; Pennycook et al., 2015). Next, we hypothesized that endorsement of political bullshit would be positively correlated with receptivity to pseudo-profound bullshit, because both represent types of bullshit (cf. Čavojová et al., 2022). Based on information-processing ideological asymmetries (Jost, 2017) and the literature on pseudo-profound bullshit, we hypothesized that receptivity to political bullshit would positively correlate with conservatism (Pfattheicher & Schindler, 2016) and neoliberalism (Sterling et al., 2016). Finally, we expected a low correlation with populism, because these are different concepts (discriminant validity). The prediction of voting behavior was exploratory, as there are no conclusive results in the literature (Nilsson et al., 2019). All these hypotheses were pre-registered at the Open Science Framework (OSF; see Supplementary Materials). To address these research questions, we conducted three pre-registered studies in early 2020 in three countries: the U.S., the Netherlands, and Serbia. We also performed a meta-analysis to average the effects and test for homogeneity, which would indicate whether the patterns vary across countries. Socio-politically, these countries are somewhat similar: they are all liberal democracies, rely on a market economy, and belong to Western culture. However, some large differences also exist: the U.S. is a presidential democracy with a two-party system, while Serbia and the Netherlands are parliamentary democracies with multi-party systems. Additionally, Serbia is not a WEIRD country (compared to the U.S. and the Netherlands) and has a socialist history. Furthermore, the main denomination in Serbia is Orthodox Christianity, and the country also has strong influences from the Orient (due to long Ottoman rule). These cross-country differences allowed us to test the validity of the concept across different political contexts and to obtain more precise estimates.

Overview of the Studies
All three studies had a correlational design, used survey methodology, and were programmed in Qualtrics. Each study was conducted in the native language (English, Serbian, and Dutch) after translation and back-translation by native speakers. Thus, participants answered the same scales, except for voting behavior, as parties/candidates differ across countries. Before the main studies, the political bullshit statements, factual political statements, and slogans were piloted (see Supplementary Materials on OSF, section Pilot Study). Materials for the pilot and main studies (in all languages), datasets, analysis scripts, and outputs are available as an Online Supplement (for sample size determination, see the Bayesian explanation document). None of the studies had missing data, as each question required an answer before allowing participants to move to the next page. Data analyses were performed in JASP (Bayesian correlations) and R (all other analyses; packages: lavaan for confirmatory factor analyses, metaBMA for meta-analyses). The appendices to which we refer in the text can be found in the Supplementary Materials, under Online supplement.

Study 1 - U.S. Sample

Method

Participants
In total, 183 participants from Prolific completed the survey, which took around nine minutes. They were paid £1.09 for their time. To ensure high-quality data, we included two attention checks; if either was answered incorrectly, the survey terminated. As pre-registered, we excluded four participants as multivariate outliers (Mahalanobis distance, α = .01) on the continuous dependent variables (all that participants filled out), while no participants were excluded based on the pre-registered minimum completion time (180 seconds). Therefore, we analyzed the data of 179 participants (58.1% female, 36.9% male, 5.0% "other"; M age = 29.6, SD age = 10.7). One participant indicated education less than high school, 44 had completed high school, 72 were students, 24 had an undergraduate degree, and 38 had a graduate degree.
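The pre-registered exclusion rule (Mahalanobis distance at α = .01) can be illustrated with a short sketch. The authors' analyses were run in R and JASP; the Python version below, with synthetic data and our own function name, flags rows whose squared Mahalanobis distance exceeds the chi-square cutoff with df equal to the number of variables:

```python
import numpy as np
from scipy.stats import chi2

def mahalanobis_outliers(X, alpha=0.01):
    """Flag rows of X whose squared Mahalanobis distance exceeds
    the chi-square cutoff with df = number of variables."""
    X = np.asarray(X, dtype=float)
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mu
    # squared Mahalanobis distance for every row
    d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
    cutoff = chi2.ppf(1 - alpha, df=X.shape[1])
    return d2 > cutoff

rng = np.random.default_rng(0)
X = rng.normal(size=(180, 4))   # synthetic scores on four scales
X[0] = 10.0                     # plant one extreme multivariate outlier
flags = mahalanobis_outliers(X)
print(flags[0], flags.sum())    # the planted outlier is flagged
```

Note that with α = .01 a handful of genuine observations are also expected to exceed the cutoff by chance, which is why such exclusions are typically pre-registered rather than applied ad hoc.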

Materials
After reporting their demographic characteristics, participants completed two single-item indicators of ideological position on social and economic issues, using a seven-point scale (1 = left-wing to 7 = right-wing). Participants next filled out the following scales in the order in which they are presented here (items were randomized within scales). Descriptives and reliabilities of the scales are given in Table 1.
Pseudo-profound bullshit receptivity was measured with ten items that participants rated on profundity (1 = not at all profound to 5 = very profound) (Pennycook et al., 2015). Five truly meaningful statements were included as fillers to conceal the emptiness of the pseudo-profound items.
Bullshit political program endorsement. Another political bullshit measure was the endorsement of three political programs for the presidential elections in the fictitious country of Gonfel. We constructed these programs to be meaningless and empty. Participants rated (1) how much they would support the program and (2) how likely they would be to vote for the candidate (1 = not at all to 5 = very). Thus, six items in total measured bullshit political program endorsement.
Together with the bullshit political programs, three meaningful political programs were included as fillers to conceal the vacuity of the bullshit programs. Therefore, participants read six political programs (12 items), but the meaningful programs were not used in the analyses. Next, participants rated how convincing five slogans (e.g., "I believe in Gonfel!") are (1 = not at all convincing to 5 = very convincing). This was followed by rating how persuasive ten political bullshit statements are (e.g., "To politically lead the people means to always fight for them"), which were presented together with five factual political statements (e.g., "The president and prime minister have important political functions").
Support for populism was measured by indicating agreement with four items (e.g., "Politicians should listen more closely to the problems the people have") on a five-point scale (Elchardus & Spruyt, 2016). Support for neoliberalism was measured using two items ("The free market economic system is a fair system" and "The free market economic system is an efficient system") from the Fair Market Ideology scale (Jost, Blount, Pfeffer, & Hunyady, 2003), on which participants indicated agreement using a five-point scale.
Voting behavior. Participants answered whether they (1) voted in the last U.S. presidential elections (2016). If they indicated "yes", they were asked to (2) indicate the candidate they voted for. Next, participants indicated whether they (3) planned to vote in the next U.S. presidential elections (2020). If they answered "yes", participants were asked to (4) indicate the candidate they would vote for.
Results and Discussion

Factor Structure

Each of the three bullshit political programs had two responses: the likelihood of (1) voting for the candidate and (2) supporting the program. Besides one higher-order factor, a proper CFA model would include residual covariances between the two questions related to the same program, and between the three questions sharing the same format across programs (one about voting and one about support). However, we could not apply the residual covariances because the model becomes saturated. Therefore, we performed an EFA (same estimation method and rotation) on the six items concerning the three programs, which indicated a one-factor solution (52% of variance explained) based on both parallel analysis and eigenvalues > 1. All items had factor loadings higher than .60 (see Appendix A, Supplementary Materials). Overall, the factor structure indicates that all three measures are best conceptualized as one-factor measures, which is why we calculated means as scores.
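The parallel-analysis retention rule used here can be made concrete. In parallel analysis, observed eigenvalues of the item correlation matrix are compared with the mean eigenvalues obtained from random data of the same shape; a factor is retained when its observed eigenvalue exceeds the random benchmark. The Python sketch below is our illustration (the reported analyses used R), with six synthetic items driven by a single latent factor, mimicking the six program ratings:

```python
import numpy as np

def parallel_analysis(X, n_sims=200, seed=0):
    """Return the number of factors retained by parallel analysis,
    plus the observed eigenvalues of the correlation matrix."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    rand = np.zeros(p)
    for _ in range(n_sims):
        # eigenvalues of a same-shaped random-normal dataset
        R = np.corrcoef(rng.normal(size=(n, p)), rowvar=False)
        rand += np.sort(np.linalg.eigvalsh(R))[::-1]
    rand /= n_sims
    return int(np.sum(obs > rand)), obs

# six synthetic items loading on one latent factor (n = 179, as in Study 1)
rng = np.random.default_rng(1)
latent = rng.normal(size=(179, 1))
items = 0.8 * latent + 0.6 * rng.normal(size=(179, 6))
k, eigs = parallel_analysis(items)
print(k)            # number of factors retained
print(eigs[0] > 1)  # the eigenvalue > 1 rule agrees for the first factor
```

With a genuine one-factor structure, both criteria converge on a single factor, which is the pattern reported for all three political bullshit measures.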

Correlations
Inter-correlations, descriptives, and reliabilities of the scales are given in Table 1. As the table shows, there are medium to high correlations between the political and pseudo-profound bullshit measures. Endorsing factual political statements was also correlated with all bullshit measures, suggesting a response bias; that is, some participants were susceptible to any kind of political statement. To partial out this variance and thus investigate correlations with political bullshit receptivity, we calculated partial correlations between the bullshit measures while controlling for factual political statements (see Appendix B1, Supplementary Materials). These indicated that the correlations between the bullshit measures remain even after controlling for response bias. This, together with the factor analyses, indicates that there is a susceptibility to this kind of bullshit beyond response bias. Interestingly, only some political bullshit measures correlated with the ideological measures (social and economic ideology, neoliberalism) and populism. Endorsement of bullshit political programs correlated with all ideological measures and populism. While slogans correlated with support for neoliberalism and populism (but not with the two ideology measures), political bullshit statements did not correlate with any of them. Finally, participants distinguished between political bullshit statements and factual ones, in that political bullshit statements were rated as more persuasive than the factual statements (BF +0 > 100).
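The partialling step can be sketched with the standard first-order partial-correlation formula, which removes the variance two measures share with a third (here, factual-statement endorsement standing in for response bias). This Python example with synthetic scores is our illustration, not the authors' code:

```python
import numpy as np

def partial_corr(x, y, z):
    """First-order partial correlation of x and y, controlling for z."""
    rxy = np.corrcoef(x, y)[0, 1]
    rxz = np.corrcoef(x, z)[0, 1]
    ryz = np.corrcoef(y, z)[0, 1]
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

# synthetic scores: two bullshit measures that share variance beyond a
# common response-bias component z (factual-statement endorsement)
rng = np.random.default_rng(2)
z = rng.normal(size=179)
x = 0.5 * z + rng.normal(size=179)            # e.g., statement endorsement
y = 0.5 * z + 0.5 * x + rng.normal(size=179)  # e.g., slogan endorsement
print(round(partial_corr(x, y, z), 2))  # stays positive after controlling for z
```

If the zero-order correlation between x and y were driven entirely by z, the partial correlation would drop toward zero; the fact that it did not is what licenses the conclusion that political bullshit receptivity is more than response bias.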

Voting Predictions
Lastly, we tested whether the political bullshit measures could predict voting behavior. As Appendix C1 (Supplementary Materials) shows, most of the respondents voted in the 2016 elections (mostly for Hillary Clinton) and planned to vote in the 2020 elections. As for the 2020 elections, Sanders, Biden, and Trump accounted for most of the planned votes. We tested whether political bullshit statements, slogans, and bullshit political programs predicted the candidate voted for in the 2016 elections (binary logistic regression with outcomes "Trump" and "Clinton"; ns = 20 and 75, respectively) and in the 2020 elections (multinomial logistic regression with outcomes "Trump", "Sanders", and "Biden"; ns = 26, 58, and 59, respectively). As pre-registered, we only selected cells that contained more than 10% of the data (two candidates in the 2016 elections and three in 2020). Political bullshit could distinguish between the candidates one voted for: the probability that one voted for Trump (vs. Clinton) increased with higher scores on bullshit program endorsement (B = .737, OR = 2.09, z = 2.275, p = .02). Similarly, when predicting the voting choice in the 2020 elections, an increased score on the bullshit programs was associated with a higher likelihood of being a Trump voter compared to Biden, B = .823, OR = 2.28, z = 2.650, p = .008, and Sanders, B = .575, OR = 1.78, z = 1.878, p = .060 (Figure 1). Importantly, the same pattern of results remains when the factual political statements are controlled for (see the note at the end of Appendix C1 in the Supplementary Materials). In sum, our operationalization of political bullshit showed good validity. The three measures (political bullshit statements, slogans, and programs) were moderately-to-highly correlated with each other and with pseudo-profound bullshit (convergent validity). This relationship held even after controlling for answers on the factual political statements. While the factorial validity (one-factor structure) could be improved, the political bullshit measures had, as predicted, relatively low or no correlations with populism, demonstrating discriminant validity. Two political bullshit measures (slogans and bullshit programs) correlated with neoliberalism, while program bullshit also correlated with both ideological measures (concurrent validity). However, political bullshit statements showed inconclusive correlations with the ideological measures, with some support for the null hypothesis regarding the relation to economic ideology. We also found some support that program bullshit predicts voting behavior: Trump voters were more susceptible to it in both 2016 and 2020.
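A quick arithmetic check on the reported coefficients: in logistic regression the odds ratio is the exponentiated log-odds coefficient, OR = exp(B), so each B above can be converted directly (coefficient values taken from the text; the labels are ours):

```python
import math

# log-odds coefficients reported for the bullshit-program predictor
coefs = {
    "Trump vs. Clinton (2016)": 0.737,  # reported OR = 2.09
    "Trump vs. Biden (2020)":   0.823,  # reported OR = 2.28
    "Trump vs. Sanders (2020)": 0.575,  # reported OR = 1.78
}

for contrast, b in coefs.items():
    # exp(B) recovers the reported odds ratio to two decimals
    print(f"{contrast}: OR = {math.exp(b):.2f}")
```

Substantively, OR = 2.09 means that a one-unit increase in bullshit-program endorsement roughly doubles the odds of having voted for Trump rather than Clinton.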

Study 2 - Serbian Sample

Method

Participants
In total, 254 participants were recruited by distributing the study link in Facebook posts of different news pages in Serbia. Sixty-eight participants were excluded for failing the attention-check questions. One participant was removed as a multivariate outlier (Mahalanobis distance), while no participants were removed for speeding. This left a final sample of 185 participants (55.7% female, 44.3% male; M age = 37.3, SD age = 11.3). Three participants indicated education lower than high school, 47 had completed high school, 26 were students, and 109 had a university degree.

Materials
Participants completed the Serbian-translated materials used with the U.S. sample. The only difference was in the questions about voting behavior. Participants in Serbia also first indicated whether (1) they voted in the last parliamentary elections (2016). If they answered "yes", participants were asked to (2) indicate the party they voted for (see Supplementary Materials). Finally, participants were presented with the following question: "If you were offered the following lists for the 2020 parliamentary elections, which one would you choose regardless of the boycott?" The question was phrased in this way because, at the time of conducting the study, most of the opposition parties had declared a boycott of the 2020 elections in Serbia. Therefore, the question about the next elections could not be phrased in the same way as in the U.S. and the Netherlands. The offered parties are given in the Supplementary Materials.

Results and Discussion
For brevity, we fully report the factor structure of the political bullshit measures for the Serbian and Dutch samples on the OSF (see Supplementary Materials, document "Factor structure of political bullshit measures in Serbian and Dutch sample" and Appendix A, document "Appendices", under Online supplement). In Serbia, we also performed an EFA on the political bullshit statements, as the CFA did not show a good fit. A one-factor solution in Serbia explained 30% of the variance for political bullshit statements, 36% for slogans, and 58% for bullshit programs. In the Netherlands, the CFA on political bullshit statements showed a good fit, while the proportion of variance explained was 34% for slogans and 52% for bullshit programs. Overall, we found a pattern similar to the U.S. sample, indicating that each measure can be conceptualized as a one-factor measure.

Correlations
All correlations between the measures, descriptives, and reliabilities of the scales are given in Table 2. As in the first study, all bullshit measures, both political and pseudo-profound, correlated positively. Endorsement of factual political statements was correlated with political bullshit statements and slogans, but not with bullshit programs and pseudo-profound bullshit. This again suggested some response bias. As in Study 1, positive correlations were found between all bullshit measures when controlling for factual political statements (Appendix B2, Supplementary Materials). In line with the first study, only some of the political bullshit measures correlated with the ideological measures and populism. Endorsement of both bullshit programs and slogans correlated with ideology on social issues and with populism, while slogans also correlated with support for neoliberalism. As in Study 1, political bullshit statements did not correlate with any of these measures. In both studies, a comparable correlation pattern was found: moderate to high intercorrelations between the bullshit measures, some positive correlations between the political bullshit measures and the ideological measures (ideology and neoliberalism support), and none to small correlations with populism. Finally, as in Study 1, political bullshit statements were rated as more persuasive than the factual statements, BF +0 > 100.

Voting Predictions
We also tested whether the political bullshit measures could predict voting behavior. Appendix C2 (Supplementary Materials) shows how participants responded to the three voting questions. While most of the respondents voted in the 2016 elections, there was high variance in the parties voted for ("SNS", "DJB", and "DS" passed the threshold, though note that the cell sizes are small: ns = 16, 21, and 15, respectively). For the 2020 elections, none of the parties passed the required 10% for data analysis. Therefore, we only tested whether the political bullshit measures predicted the party voted for in 2016. A multinomial logistic regression with outcomes "SNS", "DJB", and "DS" showed that political bullshit did not predict which party a participant voted for, χ²(6) = 8.152, p = .227, R² McFadden = .07.
Overall, the analysis of intercorrelations showed similar patterns in this study and Study 1, illustrating good convergent validity, while some differences emerged on the ideological variables, possibly due to country differences. Given that the U.S. and Serbia are socio-politically different, and that the studies employed different sample-collection strategies, small differences in the relationships between the measures were not surprising. We now report the findings in the Netherlands, a country similar to the U.S. in some respects (e.g., a WEIRD country) and to Serbia in others (e.g., a European parliamentary democracy).

Study 3 - Dutch Sample

Method

Participants
We recruited participants in two ways: students filled out the survey in return for research credits (n = 51), and further participants were recruited through Prolific (n = 143). In total, 172 completed the survey without failing the attention checks. One participant was removed as a multivariate outlier (Mahalanobis distance), and one participant was removed for speeding.
This left a final sample of 170 participants (45.3% female, 54.7% male; M age = 27.1, SD age = 10.1). No participants indicated education lower than high school, 23 had completed high school, 83 were students, 37 had an undergraduate degree, and 27 had a postgraduate university degree.

Materials
Participants completed the Dutch-translated materials used in the previous studies. Regarding voting behavior, participants in the Netherlands first indicated whether (1) they voted in the last parliamentary elections (2017), with the options "yes", "no", and "I was not allowed to vote" (e.g., minors). If they indicated "yes", they were asked to (2) indicate the party they voted for. Next, participants were asked whether they intended to vote in the next (2021) parliamentary elections ("yes" or "no"). If they answered "yes", they were asked to indicate the party they would vote for (see Supplementary Materials).
Results and Discussion

Correlations

As in the previous studies, only some of the political bullshit measures correlated with the ideological measures and populism. While political bullshit statements correlated with all ideological variables, endorsement of slogans did not correlate with any of them. On the other hand, bullshit programs correlated with social ideology and neoliberalism support, and were the only political bullshit measure that correlated with populism. Overall, the pattern of correlations (moderate to high intercorrelations between the bullshit measures, some positive correlations between the political bullshit measures and the ideological measures (ideology and neoliberalism support), and low correlations with populism) was obtained in all three studies. Finally, as in the previous studies, political bullshit statements were rated as more persuasive than the factual statements, BF +0 > 100.

Probing Ideological Effects: Exploratory Analyses

Ideology and Political Bullshit
To better understand the relationship between ideology and political bullshit, we performed linear regressions predicting political bullshit (political bullshit statements, slogans, and bullshit programs) from three ideological measures (ideology on social and economic issues, and neoliberalism) entered simultaneously. We thus performed nine linear regressions in total, three in each country (one for each political bullshit measure). In the U.S., only neoliberalism was a significant predictor in all three regressions (βs > .181, ps < .05). In Serbia, social ideology was significant in all three regressions (βs > .158, ps < .05), while neoliberalism predicted political bullshit statements and slogans (βs > .163, ps < .05). In the Netherlands, neoliberalism was a significant predictor in all three regressions (βs > .177, ps < .05), while social ideology predicted bullshit programs (β = .285, p = .002). Overall, the pattern of results suggests that neoliberalism is the most important ideological predictor (significant in eight out of nine regressions), while social ideology was still significant in four analyses. Economic ideology, on the other hand, was not associated with political bullshit. All regression tables are reported in Appendix E, Supplementary Materials.

Predicting Voting for Left, Center, and Right-Wing Parties on the Merged Dataset
Finally, we performed exploratory analyses on the combined datasets (see Footnote 1). To increase the power for predicting voting behavior, we merged the three datasets by grouping the parties/candidates into left, center, and right categories for all three countries. Appendix D1 (Supplementary Materials) shows how the categories were formed across countries for the last elections. Multinomial regression showed that the political bullshit measures could predict the ideological position of the candidate/party for which participants voted, χ²(6) = 28.000, p < .001, R²McFadden = .055. Specifically, as Figure 2 shows, scoring higher on bullshit programs increased the probability of voting for right-wing political options: left-wing voters had lower slopes than center voters (B = .508, z = 2.550, p = .011), who had lower slopes than right-wing voters (B = 2.629, z = 3.316, p < .001).
1) To justify the merging, we tested the configural measurement invariance of the political bullshit measures, i.e., whether a one-factor structure fits in all countries. Political bullshit statements showed acceptable fit for a one-factor model that included a residual covariance between items 1 and 3 (see the factorial structure in the Serbian sample), RMSEA = .076, SRMR = .049, CFI = .926, TLI = .903. Slogans showed good fit for a one-factor model that included residual covariances between items 3 and 4, and items 4 and 5 (see the CFAs in the U.S. and Serbian samples), RMSEA = .066, SRMR = .026, CFI = .987, TLI = .956. Therefore, configural measurement invariance held after accounting for the residual covariances detected in the separate country samples. For bullshit programs, we could not perform these tests because the proper CFA model is saturated. However, given that EFA showed that a one-factor solution is appropriate in all three countries, we believe that merging the datasets is justified.

Bullshit Program Endorsement and Voting in the Last Elections for All Countries
We then tested whether the political bullshit measures predicted the ideological position of the candidate/party for which one would vote at the coming elections (Appendix D2, Supplementary Materials, shows the category formation). Contrary to the last elections, multinomial regression showed that the political bullshit measures could not predict the ideology of the party for which one would vote, χ²(6) = 7.660, p = .264, R²McFadden = .012. However, when political bullshit statements were the only predictor in the model, they could distinguish between left and right voting options, B = .431, z = 2.11, p = .035. The same was found for bullshit program endorsement, B = .327, z = 2.108, p = .035. These findings about the coming elections mirror the pattern found for the last elections.

General Discussion
The present research investigated receptivity to the kind of political communication that we termed "political bullshit". Admittedly, this term might be controversial, as "bullshit" has a different colloquial connotation, and something more neutral, like "political emptiness", could be more suitable. Indeed, as our studies show, endorsing such political messages correlated with actual voting behavior, which indicates that empty political statements are not perceived as bullshit by voters at all. However, using a different term would disregard the similarity of this concept to the already well-established concept of (pseudo-profound) bullshit. We used the term "political bullshit" to label political communication in which politicians use vague but simplified statements to mobilize voters. Political bullshit meets the principles of Hammack and Pilecki's (2012) narrative framework, as it can give people a sense of coherence in complex political programs, as well as put "the mind in action" when mobilizing voters. In three pre-registered studies in three countries, we developed three different measures of political bullshit (statements, slogans, and programs) and found support for our operationalization of the receptivity to this communication. In each country, as well as in the meta-analysis, we obtained moderate-to-high intercorrelations between the political bullshit measures and pseudo-profound bullshit. We also found a relationship between political bullshit endorsement and ideological orientation: right-wing and neoliberalism supporters were more receptive to political bullshit. Finally, political bullshit could predict voting behavior to some extent: endorsement of political bullshit (especially bullshit programs) was associated with a higher probability of voting for right-wing candidates/parties.
In line with the theoretical framework of political narratives, we also found evidence that political bullshit can put the mind in action (Hammack & Pilecki, 2012). Political bullshit was associated with voting choice in the U.S. In the Netherlands, it was associated only with the party one voted for in the last elections, not with the choice for the coming elections. Interestingly, political bullshit was not associated with voting behavior in Serbia, which was probably due to the small sample size. Analysis of the merged dataset supported the notion that a higher political bullshit score increased the probability of voting for right-wing parties. These results are in line with the finding that a favorable view of Republican candidates correlates with pseudo-profound bullshit receptivity (Pfattheicher & Schindler, 2016).
Although in line with previous research, the reason for ideological asymmetries in susceptibility to political bullshit remains unclear. One possibility lies in the above-mentioned ideological differences in cognition: right-wing individuals tend to engage in simpler information processing (Jost, 2017). Given that individuals on the right have a more intuitive thinking style and rely more on heuristic processing (Jost & Krochik, 2014), they might fail to detect the vagueness of political bullshit. Indeed, as Sterling and colleagues (2016) found, the relationship between neoliberalism and pseudo-profound bullshit disappears when heuristic and biased processing and faith in intuition are included in the model. However, there are also reasons why conservatives and right-wing voters would be less receptive to vague political rhetoric: they have a larger need for certainty and closure (Jost, 2017), which is the opposite of what political bullshit offers. Therefore, the exact mechanism remains to be investigated by future research.

Limitations and Future Directions
Our political bullshit measures were relatively short (between five and ten items), which might have increased measurement error. The addition of new items is warranted, and the CFA indices also showed there is room for improvement. Moreover, the number of studies entering the meta-analysis was small. We assume that, because of the small number of studies, only two correlations showed evidence for heterogeneity. It is reasonable to assume that the correlation between political bullshit and the ideological measures might vary across countries due to different conceptualizations of ideology. Yet, a strength of the meta-analytic approach is that we could show there was no evidence for heterogeneity for any of the correlations between the bullshit measures. This suggests that the intercorrelations were homogeneous across countries and implies the robustness of political bullshit as a concept.
The reliability of the populism measure (Elchardus & Spruyt, 2016) was relatively low (varying between .51 and .69). Therefore, the low correlations with the political bullshit measures that we observed might underestimate the true ones. However, the observed correlations were found consistently across studies, suggesting a genuine difference between the two constructs. Additionally, there are theoretical reasons for distinguishing between political bullshit, which is based on abstract and vague statements, and populism, which concerns the concrete idea that politicians should be more similar and closer to ordinary people. In any case, we believe that political bullshit and populism are distinct constructs, and future research should investigate their relationship in more detail (e.g., the size of the relationship between receptivity to political bullshit and populism, or how politicians, as producers, use political bullshit and populism in real political programs).
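The deflating effect of low reliability noted above can be quantified with Spearman's classic correction for attenuation. The sketch below uses hypothetical numbers (an observed r of .15 and reliabilities of .80 and .55), not values from the paper, purely to show the size of the bias.

```python
def disattenuate(r_obs, rel_x, rel_y):
    """Spearman's correction for attenuation: estimates the true-score
    correlation from the observed correlation and the two scales'
    reliabilities (e.g., Cronbach's alphas)."""
    return r_obs / (rel_x * rel_y) ** 0.5

# hypothetical: observed r = .15 between bullshit programs and populism,
# with reliabilities .80 (programs) and .55 (populism)
print(round(disattenuate(0.15, 0.80, 0.55), 3))  # → 0.226
```

Even with these illustrative figures, a reliability near .55 inflates the estimated true correlation by roughly half, which is why the observed correlations with populism may be underestimates.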
We had a small number of observations in the voting categories, which decreased the power to detect the relationship between political bullshit and voting behavior (e.g., the 2016 elections in Serbia) and did not allow for analyses of the 2020 elections in Serbia. However, the analyses in the U.S., the Netherlands, and on the parties combined into left/center/right categories for all countries support the idea that right-wing voters might be more susceptible to political bullshit. Another reason this might be the case is the different linguistic repertoires that left-wing and right-wing individuals employ (Sterling et al., 2020). For example, one of the bullshit programs mentioned the idea to "restore the soul of our country", which seems to contain conservative rhetoric. It is therefore possible that our political bullshit items contained words more familiar to right-leaning individuals, in which case the relationship could be confounded with item content. However, while this limitation might explain the relationship with social ideology, it cannot explain why neoliberalism consistently predicted political bullshit even after accounting for one's social (and economic) ideology (see Footnote 2). Future research should investigate this question in more depth, preferably with items cleared of ideological content.

2) Consider, for example, the neoliberal dogma that the free market is a solution to all problems (not only economic ones), even though the implementation of such policies often ends in disastrous consequences (e.g., in Russia after the dissolution of the Soviet Union; Harvey, 2007).
Another limitation of our research is the relatively high correlations between the measures of political bullshit and the factual statements, which suggests a general tendency to endorse any type of political statement. However, this high correlation is most likely due to common-method variance, given that these statements were all presented together. As the exploratory factor analyses of the political bullshit and factual statements, and the differences in means (participants endorsed factual statements to a lesser extent), demonstrate, participants distinguished between the two. Future research should take into account the tendency to agree with all statements, but also attempt to reduce common-method bias.
Lastly, a natural question arises: who uses political bullshit? Our research does not provide an answer; it only indicates who might be more receptive to it. One way to answer this question could be to take an approach similar to the one Pennycook and colleagues (2015) took with Deepak Chopra and pseudo-profound bullshit. If items actually uttered by politicians (cf. Deepak Chopra) have the same psychological reality as political (pseudo-profound) bullshit, this would be a strong argument that it is indeed political bullshit. In any case, we do believe that all politicians, to some extent, use political bullshit, as exemplified in the introduction with Joe Biden and Donald Trump.

Figure 1
Bullshit Program Endorsement and Voting in the 2020 Elections

Table 1
Means, Standard Deviations, Reliabilities of the Scales, and Correlations Between Measures

Table 3
Means, Standard Deviations, Reliabilities of the Scales, and Correlations Between Measures

Table 5
Meta-Analytic Effects of Correlations Between Political Bullshit Measures and Ideological Measures, and Populism