Does social distancing really work? Are vaccinations safe? Does it help to wear face masks? The COVID-19 crisis has raised many questions, many of which boil down to one key question: who do we ultimately trust as the most credible authorities to guide our beliefs and behaviour concerning the virus? While some people have dived into Google Scholar themselves to 'do their own research', most people (eventually) accept the efficacy and safety of COVID-19 vaccines because their doctors, scientists, governments, or medical advisory boards told them that they should.
In a recent study1, we experimentally investigated source credibility effects at play in the context of science and spirituality. We found evidence for what we call the 'Einstein effect': people tend to confer more credibility on incomprehensible claims when they are attributed to a scientist than when the very same claims are attributed to a spiritual guru.
While the pandemic has clearly highlighted the relevance of source credibility effects, the idea for our study actually predates COVID-19. In part, it originated from work by Schjoedt and colleagues2, who showed that Christian participants reported more strongly 'feeling God's presence' during prayer when it was supposedly performed by a charismatic Christian than by a non-Christian (in fact the prayers were identical and only the source was manipulated). A simplistic but perhaps tempting take-away from this study is that religious individuals simply believe whatever a person they trust says. In other words, religious believers are wide-eyed and gullible.3 This fascinating study got us wondering: is this source credibility effect specific to religious believers? Are all of us - religious and non-religious alike - susceptible to taking information at face value from people we 'trust' based on their social or professional group? Whereas a 'Christian known for his healing powers' might appeal as a credible authority to Christians, a 'leading scientist in the field of particle physics' might have a similar appeal for science-minded individuals, or even universally. And while very few of us are able to fully comprehend why E = mc², the fact that Albert Einstein said so means that even fewer of us doubt its truth. Indeed, even within the scientific context, where skepticism is among the highest virtues (as exemplified by the Royal Society's motto 'Nullius in verba' - 'Take nobody's word for it'), there are plenty of examples of source credibility effects resulting in the uncritical acceptance of (dubious) claims.
Consider for instance the (in)famous 'Sokal hoax'. In 1996, renowned physics professor Alan Sokal published an article in which he merged quantum physics and postmodernism – interspersed with genuine quotations from physicists and philosophers – to arrive at the conclusion that, basically, quantum gravity is a social construct4. The point, as he admitted three weeks after the article had been published, was to demonstrate how easily you can get away with 'intellectual hot air' if you flatter your audience's ideological position, even to the point of being published in a reputable academic journal5. This hoax could be considered the perfect recipe for effective bullshitting: the article was (1) written by a respected scientific authority from a prestigious university (New York University), and (2) it contained a collection of ambiguous, obscure, incoherent but seemingly erudite statements that (3) supported the target audience's worldview - in this case the postmodern notion that objective reality does not exist. Of course, Sokal intentionally published nonsense to make a point about the power of intellectual flattery, not to show that an authority can get away with bullshit. At the same time, it seems unlikely that the paper would have been accepted had it been submitted by an undergraduate student; it probably didn't hurt to have 'Alan Sokal, Professor of Physics at New York University' listed below the article title.
The Sokal hoax is not a unique incident: multiple fake papers from sting operations have been published over the years, often to expose flaws in the academic publishing process.6 If even expert peers fail to identify nonsense right in front of their eyes, how can the general public possibly be expected to separate the scientific wheat from the bullshit chaff? How susceptible are non-experts to accepting nonsense from people who appear on talk shows and podcasts and call themselves scientists?
We wanted to design a study that would allow us to investigate whether people would be more inclined to trust scientists than spiritual gurus, irrespective of their attitudes about the content of the presented information. We were also interested in how their religious worldviews would come into play. The solution we came up with was quite simple: present participants with gobbledegook attributed to either a spiritual guru or a scientist. Across 24 countries, we found that people (N = 10,195) deemed statements from a scientist more credible than the same statements from a guru. This Einstein effect differed for religious versus non-religious participants: individuals scoring low on religiosity considered the statement from the guru less credible than the statement from the scientist, while this difference was less pronounced for highly religious individuals.
But what does this mean? Is trust in scientists a bad thing? Are we all gullible and 'stupid'? Well, no. As we speculate in our article when interpreting the results of our study, it might actually be rational to use source cues for information you cannot readily understand yourself. That is, when you cannot evaluate a claim yourself, the most rational thing to do may be to either suspend judgment or calibrate judgments to pre-existing beliefs about the credibility of the source of the claim. In other words, if you consider scientists generally competent and sincere, it might make sense to give a positive judgment of the difficult-to-evaluate claim from an unknown scientist. After all, credible experts have often acquired their credentials by discovering phenomena that may appear intuitively dubious, such as the workings of vaccines ('injecting a virus prevents disease') or the causes of climate change ('humans are changing the weather')7. And despite the Royal Society's motto, scientists themselves must also trust other scientists, as they can hardly reanalyse the results of every single paper they cite.
We may thus view scientific source credibility as a double-edged sword. On the one hand, we need to stand on the shoulders of giants to make scientific progress. On the other hand, things can go awry if we uncritically accept information purely on the basis of scientific authority, as the replication crisis in the social sciences perhaps attests8. Nevertheless, especially in times of crisis like the COVID-19 pandemic or the climate crisis, the Einstein effect appears to be a useful heuristic: in general, people trust scientists more than spiritual gurus. At least to scientists, that is reassuring.
References and notes
1. Hoogeveen et al. The Einstein effect provides global evidence for scientific source credibility effects and the influence of religiosity. Nature Human Behaviour. doi:10.1038/s41562-021-01273-8 (Advance online publication).
2. Schjoedt, U., Stødkilde-Jørgensen, H., Geertz, A. W., Lund, T. E. & Roepstorff, A. The power of charisma: perceived charisma inhibits the frontal executive network of believers in intercessory prayer. Social Cognitive and Affective Neuroscience 6, 119–127. doi:10.1093/scan/nsq023 (2011).
3. Note that this is a tempting but speculative interpretation; it is not the conclusion drawn by the original authors. Yet it does fit into a broader line of research on dual-process accounts of religious belief, suggesting that religious believers are more likely to rely on intuitive rather than analytic thinking.
4. Sokal, A. D. Transgressing the boundaries: Toward a transformative hermeneutics of quantum gravity. Social Text 46, 217–252. doi:10.2307/466856 (1996).
5. Sokal, A. D. A physicist experiments with cultural studies. Lingua Franca 6, 62–64 (1996).
6. See https://en.wikipedia.org/wiki/List_of_scholarly_publishing_stings for some notable examples.
7. Levy, N. Due deference to denialism: Explaining ordinary people's rejection of established scientific findings. Synthese 196, 313–327. doi:10.1007/s11229-017-1477-x (2019).
8. Lindsay, D. S. Replication in psychological science. Psychological Science 26, 1827–1832 (2015).