People have a surprising ability to disagree on the nature of their shared reality, even after the science has been settled. The myths about mRNA vaccines that pervade the current effort to vaccinate against SARS-CoV-2 are a timely example of this phenomenon. Another example is climate change, where efforts to mitigate the climate crisis are undermined by the unwillingness of some politicians and voters to fully believe in human-made climate change, despite a decades-old consensus among scientists. But why do some people accept the scientific evidence on climate change while others don’t? Is it because some people are smarter than others? Or because some are under the spell of the fossil fuel industry and others aren’t? While we know quite a bit about the formation of climate change opinion in the United States (where the fossil fuel lobby and partisan identities play an important role), climate attitudes in the rest of the world remain understudied.
I started to think more about this topic during my Master’s degree in International Relations, and I was surprised to see that my discipline was quite good at explaining differences in moral norms across countries but had nothing to say about differences in empirical beliefs. Climate change seemed to be a good opportunity to study systematic variation in public beliefs across countries, because climate denial has been much more salient in some nations than in others. In 2016, I eventually started to gather data on climate change beliefs around the world. But as it turned out, most surveys were conducted in the Global North, and few pollsters cared to ask people in Latin America, Asia, or Africa about their opinion on global warming. Furthermore, almost all international surveys were online surveys and hence strongly biased in countries where many people lack access to the internet or telephones. The only exception was Gallup, a company that made a huge effort to collect representative data in more than 143 countries. However, their individual-level data on climate change opinion was proprietary, so I ended up copying country-level averages from Gallup’s website into an Excel sheet and running some basic statistics on them.
But crunching cross-sectional data only gets you so far. Without individual-level information, it was difficult to differentiate the effects of individual factors (like gender or education) from those of societal conditions. Moreover, since global climate change opinion has not been collected over a long period, it seemed impossible to approximate the causal effects of changing societal circumstances. I remember sharing some preliminary results of my analyses at a conference when one discussant told me: “Oh, what a nice graph showing how democracy correlates with climate change belief. It seems you are almost finished – when are you going to publish?” But in that moment, all I could think was how boring these results were. I didn’t find them worthy of publishing at all.
But sometimes it pays off to be patient. In 2019, I had the chance to visit the Yale Program on Climate Change Communication. Shortly after landing in New Haven, I had two encounters that would change the trajectory of my research. First, I met Anthony Leiserowitz, the founder and Director of the Yale Program on Climate Change Communication. We were discussing the importance of studying climate change opinion beyond Europe and the US when he revealed to me that Yale had licensed the Gallup World Poll, which suddenly and unexpectedly gave me access to more than 400,000 individual data points containing information on climate change attitudes from all over the world. The second lucky coincidence was meeting Holger Virro, another visiting Ph.D. student at Yale. Holger and I first met at the same Airbnb, only to discover that we had also rented rooms in the same apartment building in New Haven. Moreover, Holger was also working at Yale’s Environmental Studies department – but as a Geoinformatics scientist who uses machine learning methods to predict water quality. One evening over a beer, he introduced me to the merits of the method he was using – random forests. Being a political scientist by training, I had rarely worked with machine learning algorithms before. But what Holger told me sounded really promising.
I spent the next couple of months at Yale digging deeper both into the new method and into the newly acquired dataset. Combining these vast amounts of complex data with random forests suddenly made it possible to analyze in unprecedented detail how societal circumstances predict individual attitudes, and to trace the non-linear patterns in which country-level conditions relate to individual opinion. Exploring this new method was a lot of fun – many of the ancillary techniques I ended up using had only recently been developed, and there was a lot of uncharted territory in terms of data analysis and visualization. The only drawback was that some algorithms turned out to be very computationally intensive. Before I got access to a high-performance cluster, I ran the code on my personal computer overnight. And because the fan of my heavily burdened laptop got so noisy, I had to stuff the laptop into my closet every night.
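To give a flavor of the approach (this is a minimal sketch with synthetic placeholder data, not the actual analysis pipeline or the Gallup data): a random forest is fit on individual- and country-level predictors of a binary climate-belief indicator, and a partial dependence curve then traces how the predicted probability of belief varies – possibly non-linearly – with a country-level condition. All variable names here are hypothetical.

```python
# Hedged sketch: random forest + partial dependence for climate-belief data.
# Data are synthetic stand-ins; the real analysis used proprietary survey data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import partial_dependence

rng = np.random.default_rng(0)
n = 2000

# Individual-level features (hypothetical): age and years of education
age = rng.integers(18, 80, n).astype(float)
education = rng.integers(0, 20, n).astype(float)

# Country-level feature (hypothetical): GDP per capita, shared by all
# respondents sampled from the same country (50 synthetic countries)
gdp = rng.choice(np.linspace(1_000, 60_000, 50), n)

# Synthetic binary outcome with a deliberately non-linear GDP effect
logit = 0.05 * education + 2.0 * np.sin(gdp / 20_000) - 1.0
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([age, education, gdp])
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Average partial dependence of predicted belief on the country-level
# feature (column index 2 = GDP); the curve can reveal non-linear shapes
# that a linear regression coefficient would flatten into a single slope.
pd_result = partial_dependence(model, X, features=[2], kind="average")
curve = pd_result["average"][0]  # predicted probabilities along the GDP grid
print(curve.min(), curve.max())
```

Because tree ensembles impose no functional form, plotting such a curve (e.g. with `sklearn.inspection.PartialDependenceDisplay`) is one way to let the data reveal thresholds or plateaus in how country-level conditions relate to individual opinion.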
In summary, I would say that it is sometimes worthwhile to let an analysis sit for a while until the right data and the right methods show up. As for global climate change opinion, a whole set of new and interesting datasets has been published recently, such as the climate change modules in the Afrobarometer or the Latinobarometer. I am very excited to see more research on these data (hopefully) being published soon.