"Science is just an opinion." This is a statement often heard from people who dislike the outcomes of scientific research or research-based policymaking. Their discontent leads them to question the value of scientific research and, quite often, to discredit individual scientists as well. In their view, these scientists are politically and ideologically motivated, almost certainly funded by corrupt governments or organizations. This happens, for instance, to scientists studying climate change, but also to researchers in health science or cultural studies. Regardless of the specific accusations or the field of study, these critics no longer regard science as a neutral and objective producer of knowledge. Instead, they see science as a human undertaking that comes with all the flaws and challenges of any other human activity. Unfortunately, there is some truth to this perspective.
Science is indeed the product of human effort, and scientists are not an exceptional breed devoid of emotions, ideals, or material needs. Most, if not all, are ideologically motivated in some way to study the things they do—for example, because they want to make the world a better place, grow the economy, make people healthier, or develop beautiful theories. Ultimately, they also need funding for their work, and to secure it, they may have to adjust their research questions and projects to align with the demands of funding agencies or the interests of private sponsors. In most cases, these factors are likely to shift researchers' attention slightly, but they rarely affect the scientific quality of the results. In rare cases, however, findings may indeed be biased toward the source of funding (e.g., pharmaceutical research is often funded by companies looking for proof of the efficacy and safety of their products). This doesn’t mean that science is merely an opinion—far from it—but it does show that science, and scientists, are not beyond all doubt and that their work deserves scrutiny, by other researchers and perhaps also by the general public.
Today’s perspective, which understands science as the work of people of flesh and blood, is quite recent. For much of the 19th and 20th centuries, science was widely regarded as the only source of truth and a never-ending engine of technological and societal progress. The scientific method was thought to deliver facts, and those facts could only be countered by other scientists with new theories and empirical evidence. As a result, science could prove itself wrong and evolve, but only through new and, arguably, better science. Until proven otherwise, scientists were assumed to be objective observers of reality and hence to produce unbiased knowledge. Even philosophers of science, who studied the nature of science and the scientific method, rarely questioned the motivations of scientists or the biases they carried with them.
Ironically, today’s more critical perspective on science is largely a leftist invention. Starting with Thomas Kuhn, sociologists of science began to scratch the surface of the scientific system. These often progressive and left-leaning scholars have shown how science is subject to errors, biases, and political and financial interests. The modest community of "science and technology studies" (STS) has been at the forefront of this new take on science. STS scholars were among the first to critically study the work of scientists and to unpack the myths surrounding science and the exclusive role of formal experts in societal debates. Their targets were typically controversial technologies and the science underpinning them, such as nuclear energy and biotechnology. Their mission, in a sense, was to "deconstruct" the science behind these technologies and show how it was co-opted by political and industrial interests. More generally, they showed how science is the product of social construction (i.e., man-made ideas, rather than reality revealing itself) and how it is influenced by the biases of individual researchers, research groups, and the inherent politics of science (e.g., the pressure to get published and cited). In short, they made it clear that scientific knowledge is not infallible or above criticism. Moreover, they argued that scientific knowledge is not necessarily better or more valuable than other forms of knowledge, and that scientists do not necessarily possess more or better expertise than others, such as people with first-hand experience in specific matters. For example, when it comes to the effects of noise pollution near airports, one should not only listen to experts on noise pollution but also to those living in the area and their experiences with the noise. This is the principle of symmetry: no one’s expertise is a priori more valuable than someone else’s.
This principle is meant to open up public debates so that others besides scientists or formal experts are also heard. Often, symmetry in expertise is part of a broader struggle to break down traditional hierarchies, fix social inequalities, and empower the underdog. Think, for instance, of giving voice to women in medical debates where their first-hand experience and interests were once subordinate to those of male doctors and scientists, or of farmers who experienced the toxic effects of pesticides first-hand while those effects were denied by scientists paid by pesticide producers.
However well-intentioned and justified this principle may be, today it can easily be viewed in a different light. After all, the principle of symmetry sounds a bit like "science is just an opinion." Indeed, with the rise of post-truth politics and talk of a post-truth society, STS scholars have started to question whether they are to blame for all of this. More specifically, they ask whether their ideas and arguments have been hijacked by intellectual terrorists who use them for their own selfish interests. The answer may be yes.
While most of the work of STS scholars rarely escapes the confines of peer-reviewed journals and academic conferences, their ideas about symmetry surfaced during the so-called "Science Wars" of the 1990s. During this period, natural scientists and STS researchers (as well as postmodernist philosophers) fiercely debated the nature and value of science and the sense and nonsense of the principle of symmetry. This debate took place not only in journals and at conferences but also in newspapers and politics. After some years, both sides agreed to disagree, and public attention for the Science Wars waned, but it may have left a mark on the reputation of science nonetheless. According to Bruno Latour, one of the most prominent thinkers in STS and the Science Wars, this period may very well have taught others how to criticize scientific findings and deconstruct well-established facts:
"Dangerous extremists are using the very same argument of social construction to destroy hard-won evidence that could save our lives. Was I wrong to participate in the invention of this field known as science studies? Is it enough to say that we didn’t really mean what we said?"
Latour was probably right in the sense that extremists use STS ideas and arguments. Yet that does not mean that Latour and his colleagues are to blame for the post-truth society. All things considered, the principle of symmetry is valuable and justified, and it should not be equated too easily with the saying that "science is just an opinion." Symmetry is, after all, not about the symmetry of opinions but about the symmetry of different forms of knowledge production and (genuine) expertise. The real question, then, is what qualifies as true and honest expertise, and what is flawed, insincere, or "bullshit"? Moreover, who can and should be the arbiter of genuine expertise and, ultimately, of truth?
It is impossible to answer these questions in absolute terms, but STS scholars have tried to come up with different answers. Some have argued that there is no way to distinguish between good and bad science, or between genuine and false expertise. This argument makes the issue of expertise inherently political: one simply supports the knowledge claims one agrees with, and the scientists and experts who make them. Latour was among these thinkers, as he vowed to use the tools of STS to defend the findings of climate science. Conversely, all claims one does not agree with should be deconstructed until there’s nothing left of them. Essentially, this is what happens in a post-truth society.
Others argue that we can still distinguish between good and bad forms of expertise and knowledge production by scrutinizing every claim and exposing the mistakes, biases, and politics involved. When there are two opposing claims (e.g., climate change is caused by human activity, or it is not), one should be able to decide which claim is the more robust. In most cases, particularly within empirical science, this robustness is assessed in terms of the method and the data—for example, the strength of the mathematical modeling or the reliability of the data. However, this approach only works when both opposing claims share an underlying notion of truth, or a ‘paradigm’ as Kuhn called it, that accommodates such procedural scrutiny; that is, both sides accept that an arbiter judging on these grounds is capable of good judgement. In the absence of such common ground, especially between different sciences and methods, scrutiny mainly shifts toward revealing shortcomings, biases, and political influences, aiming to deconstruct the opposing perspective.
Such an analysis would probably show that politics may be involved in “mainstream” climate science, but that this is far more often the case among climate deniers (e.g., because they are sponsored by industries or seek to make a name for themselves on the academic fringe). The problem with this approach is that it still turns every dispute about the truth into a match or a court case that anyone can win. Moreover, it allows interested actors to maintain the appearance of an ongoing scientific controversy, even when the dispute was settled long ago.
Unfortunately, this is what happens in a post-truth society. Even the most basic facts we hold true are questioned by fanatic disbelievers, or “critical citizens,” as they call themselves. Is the earth really a sphere, did men really set foot on the moon, was Churchill a bigger war criminal than Hitler? Those critical of well-established knowledge continue to produce and reproduce contrary evidence and, somehow, manage to sow a bit of doubt in all our minds. Yes, the earth is spherical, but the moon landings could be fake, and Churchill was responsible for the mass bombing of German cities. Decades ago, such doubts would have stood far less of a chance; people were much more likely to accept the facts presented to them. No wonder less obvious truths are also under heavy scrutiny, like the effectiveness of COVID vaccines, man-made climate change, or the benefits of sunscreen.
This is one of the great paradoxes of contemporary science. Challenging well-established truths and basic facts was once the domain of groundbreaking scientific minds like Copernicus, Darwin, and Einstein, whose stories are often retold as cliché tales of brave scientists confronting non-scientific truths. Today, however, the situation appears almost reversed—it's as if your neighbors, and in a way all of us, think they have a Copernicus in them, critically thinking for themselves and questioning the consensus against all odds. Have we become too critical or too skeptical?
This open-ended approach to symmetry, as a match between opposing knowledge claims, is thus also problematic. Not so much in principle, since everyone deserves to take part in societal debate and be heard, but certainly in practice. How can we, for instance, ever get to work on mitigating climate change when the science behind climate action is continuously challenged? How can we expect people to use sunscreen when sunscreen deniers are repeatedly taken seriously?
Perhaps it is up to the same scholars who invented the deconstruction of scientific knowledge to act as the arbiters of expertise. This is indeed what several STS scholars propose: to use their understanding of (proper) science and expertise to scrutinize knowledge claims and settle disputes when necessary. More specifically, to "reject the misuse of expertise by certain elite experts and give credit to the work of low-status, experience-based experts."
Regardless of whose task this should be, an understanding of expertise is necessary to prevent a real post-truth society from emerging. This is probably something all of us should develop, as a specific kind of critical thinking, much more fundamental than the typical kind of media literacy we often hear about in debates on fake news and disinformation. While the latter is an ill-disguised attempt to convince people to take seriously only the mainstream media and formal experts, a focus on genuine expertise invites people to look beyond the labels placed on knowledge claims. Instead of expecting people to believe everything written in government reports or established newspapers, we can teach them how to conduct proper research on their own. That should include critical thinking, an understanding of who is saying what, how it is substantiated, and which interests and politics are at play. This newly acquired expertise would allow people to listen to a wide range of ideas and truths and make up their own minds, instead of blindly following the mainstream media or falling too easily for populist narratives.