Like many scientists, I have been dismayed of late to observe just how “post-fact” or “alternative-truth” the world I live in has become, and I have been reflecting a lot on what it means to be a scientist in such a world. Have my job and expertise become completely irrelevant? As regular readers know, when I say I’ve been reflecting, you should expect a long post (or series of posts). I have broken my thoughts up into a three-part series. In the first, I take a historical perspective and argue that this post-fact world is really the result of trends that have been going on in society for decades. In the second, I turn to our social scientist colleagues who study how people think and choose behaviors, and highlight some of the most salient things they have learned that explain why we are in a post-fact world. In the third, I attempt (and attempt is still a pretty strong word; struggle might be better) to draw conclusions about what a scientist should do in a post-fact world.
This is of course an ambitious undertaking. Even an arrogant undertaking. Many caveats apply. I am neither a historian of science nor a psychologist nor a risk-behavior specialist. Almost none of my ideas are original, but despite that I am not going to carefully footnote each source (this being an informal blog). And I am indubitably biased in having a primarily American perspective (although I try to include global examples and believe the general trends are global). In short, this is just another crackpot musing on the internet. Warning: my thoughts run a little contrary to the direction I have heard many scientists heading. But the primary value of this blog to me is the chance to throw ideas out there and hear other people’s thoughts on them. So here goes …
A post-fact world: Part I – on how we got here
It is fashionable to say that the post-fact world is a recent creation (no less an intellectual than Francis Fukuyama says it happened in 2016, although he acknowledges the seeds have been there for a decade or two). I personally think it goes back much further than that. I was born in 1966, and although you could fuzz things forward or backward by a decade, that is pretty close to the absolute high point in the esteem in which society held science. So I think it is accurate to say that we have been moving toward a post-fact world for all of my 51 years, and that some of the seeds were planted immediately after World War II, taking us back 70 years.
The rise of science’s reputation
I don’t know of formal polls or other ways to measure this, but my sense is that through the late 1800s and the first half of the 20th century, the most respected group was the “captains of industry”, i.e. business people. This is not to say there weren’t blots on the reputation of business (the Teddy Roosevelt-era rise of conservation was driven by a realization that business unchecked could despoil enormously large tracts of land or fisheries in a non-sustainable fashion). More generally, in those times there was also trust in the establishment. Government leaders were by and large held in esteem, as were the police, the clergy, and so on. Science wasn’t held in low esteem, but the prototypical image of a scientist was Einstein – a social misfit working on theories that required enormous intelligence but weren’t necessarily useful to the average person. Inventor/entrepreneurs like Thomas Edison were seen as doing more to improve everyday life than scientists were.
In the 1950s and 1960s, though, the reputation of science and scientists was clearly on the rise. It is not too much of an exaggeration to say that science played a pivotal role in the Allies winning World War II. The inventions of radar, the atomic bomb, and antibiotics (World War II was the first war in which more soldiers died of injuries than of disease) were huge. The secrets of biology were seemingly becoming tractable with the discovery of DNA as the genetic material and of the basic roles of proteins. The space race captured national attention (the first unmanned lunar orbits and landings occurred in the 1960s, with the first manned landing in 1969), and it led to all kinds of science and innovation (modern computers included). Medical improvements to the duration and quality of life were coming at breakneck speed. In 1971 Richard Nixon declared the first War on Cancer. Closer to the themes of this blog, Platt published his 1964 paper on strong inference, suggesting that science could progress decisively and rapidly simply by adopting the right method. Business and science were best friends. Bell Labs (a private company) invented the transistor and launched the electronics revolution, but it also made basic science discoveries like the cosmic microwave background radiation – a key piece of evidence for the big bang. The moon landing and the transistor together launched the computer revolution. Every self-respecting company invested heavily in research and development, with legendary labs where the line between pure and applied science blurred (Bell Labs is the most famous, but Xerox PARC invented the laser printer and the graphical user interface with a mouse, and IBM’s labs won two Nobel prizes as well as inventing the barcode and the relational database). Pop culture elevated scientists. In James Bond movies, scientists (and engineers) were pivotal for both the good guys (every movie required a visit to Q) and the bad guys (who mostly were using some modern technology to try to rule the world). Television commercials regularly featured scientists and engineers in lab coats endorsing products (even mundane products like soap). DuPont’s advertising slogan was “Better Things for Better Living … Through Chemistry”. The world was headed to better places, and science was leading the way.
This period and the rising reputation of science were highly consequential for the funding of science. Through World War II, essentially all government investment in science was for military applications – non-military applications were thought to be clearly the domain of private enterprise. As World War II was winding down, Vannevar Bush, then the head of the US Office of Scientific Research and Development (despite its generic name, this was an arm of the military), was asked to draft a plan for the future of government-funded science. Bush made a forceful case for “blue sky” research, essentially concluding that today’s basic research would quickly become tomorrow’s technology improving our lives, and the National Science Foundation was created by Congress in 1950 under his advocacy. This is not a moment to pass by too quickly. Two fundamental shifts were incorporated. First, all sciences, not just the physical sciences, were valued. Second, basic research, not just applied research for the military, was valued and deemed a worthy role for government. I am less aware of the details in other countries, but this shift from a government role in just applied military technology to basic research in all sciences played out roughly in parallel in most countries. For most of the next 60 years, there was bipartisan agreement that funding basic science research was an important role for government and a core good for society. The modern research university emerged in this same period, partly as a reflection of federal funding for research, but also of state-level investment in research and education. President Truman founded a Science Advisory Committee, with its chair serving as the President’s Science Adviser, in 1951. With one brief exception, this institution has existed to the present day, and even in that one brief exception (Nixon being cranky/lazy), Congress overrode him and recreated the office by congressional mandate.
The decline of science’s reputation
As so often happens, just as science’s reputation in society was reaching its apex, the seeds of decline were already being sown. And Nixon’s letting the President’s Science Advisory Committee lapse (even though, as best I can tell, it was because he was distracted by a little thing called Watergate) might well, with 20/20 hindsight, be seen as a harbinger of things to come. The decline of science’s reputation is extraordinarily complex, and I’m sure any account of it is oversimplified. But I’m going to organize mine around two core ideas: 1) business and science coming into conflict, and 2) social trends (including postmodernism, the 1960s counterculture movement, and the internet).
Peter Kareiva says businesses are the keystone species in the human ecosystem. Or if you prefer, the adage “follow the money” has held true time and again. The rise of science was surely because government fell in love with science, but it was equally, and maybe more so, because business also fell in love with science (as it led to discoveries that led to profits). Medical discoveries have made many businesses rich. Computers created a massive new industry and product domain (and have been a major reason for rising worker productivity over almost 50 years). The atom bomb was not directly profitable (thank goodness), but nuclear power made a lot of money for big business. Putting humans on the moon had more money-making spin-offs than you can count, ranging from the fundamental, like miniature computers, to the absurd but commercially successful, like Tang (a powdered drink that supposedly tasted like orange juice but was mostly just a source of vitamin C, and which took off commercially when it was used by astronauts). But this love affair was tested by the discovery through the 1950s and 1960s that cigarettes caused cancer. Then Rachel Carson published Silent Spring in 1962, highlighting the fact that chemistry could destroy the health of people and the planet. Thousands of other chemicals were found to cause cancer. Satellites began to capture vivid pictures of deforestation, eutrophication, and the like. It became clear that science could be the enemy of business as well as its friend. And, confirming the pivotal role of business in society (for good and bad), science was no longer an unalloyed good.
The classic example, beautifully dissected by Naomi Oreskes in her book (and later movie) Merchants of Doubt, is the growing scientific evidence that smoking causes cancer – something with scientific roots in the first half of the 20th century, but rapidly coming to the fore in the 1950s and 1960s. The tobacco companies colluded to hide research and developed a very careful strategy of resistance. The key element is that they realized they didn’t have to disprove the emerging research, just create an aura of doubt and distraction. Human foibles would do the rest. This strategy has since been deployed against every major scientific result that threatens the profit motive, albeit rarely at the scale and level of effrontery of the tobacco industry. As Oreskes nicely documents, it is not just the same techniques but actually some of the same people who are attacking the science of climate change now. Many of the key players were Cold War physicists (some of whom had played roles in the Manhattan Project to develop the atomic bomb). These physicists were profoundly ideologically motivated in a pro-capitalist direction by the Cold War confrontation with Soviet communism, and they were willing to lend their scientific credentials, despite having almost no relevant scientific experience (physicists talking about biology), to resist the “attack” on tobacco corporations. These same ideologically motivated individuals perceived a frightening anti-capitalist bent in the emerging 1970s environmental movement and were happy to be recruited against environmentalism as well (again with limited relevant knowledge).
I think you can see the determinative role of business in how science is perceived in a few places. First, think about cancer. Business has always been behind Nixon’s War on Cancer and the ensuing billions spent at NIH, and now behind Joe Biden’s cancer moonshot. That is because these approaches lead to medical treatments, often very expensive medical treatments, that can be sold. Meanwhile Silent Spring, one of the first popularizations of the growing research on the cancer-causing roles of chemicals like pesticides, was received by business with a monumental thud. To this day, it is easy to get society to mobilize and be excited about a new chemotherapy drug that costs $20,000, but it is very hard to get society and regulators mobilized to take completely unnecessary carcinogens out of our living environments (almost certain to save more lives for fewer dollars). It is hard to view that difference through any lens but the profit motives of companies*.
Or consider the very different responses to the ozone hole problem and to global warming, and the keystone role (both positive and negative) of corporations. This is nicely summarized by Maxwell and Briscoe. The first science on possible problems with the use of chlorofluorocarbons (CFCs) in refrigeration and other industrial applications appeared in the 1970s. In 1982 the US National Academy of Sciences published a damning report. By 1987 an international agreement fixing the problem (the Montreal Protocol) had been signed. That is less than two decades from first science to fix. This contrasts noticeably with climate change, with the first science going back to Arrhenius in the 1890s and resolution only beginning to appear in the 2010s, with many battles still to fight. A central key to this rapid transition was the role of the DuPont corporation, the major producer of Freon, the dominant CFC in a multi-billion-dollar industry. Through an interesting mix of ethics and profits, and very likely not wanting to repeat the smoking-causes-cancer debacle of the tobacco companies, DuPont saw the writing on the wall around CFCs and worked on developing alternatives (the fact that a key patent expired in 1979 didn’t hurt either). And DuPont, while not exactly championing the move away from CFCs, did not fight it the way some other US producers and all the European producers did. And it profited handsomely from being ready to step in with the alternatives. My point here is not to praise or blame a single company, yet the role of a single corporation is inextricably interwoven with whether the CFC problem got fixed quickly or slowly. And there is little doubt about the role of corporations in slowing down the fix to climate change. It won’t shock anybody to learn that just a handful of oil companies combine to spend over $100,000,000 per year to fight action on climate change, and that transportation and utility companies are not far behind.
So hopefully I have made the case that business has a very strong influence on how science turns into policy: science can quickly influence policy when it benefits business, but businesses can effectively keep science from turning into policy when it runs against their interests. As a simplistic rule, businesses like physics, chemistry, and biomedical research, and don’t like public health and ecology/environmental research. This distinction matters a lot.
Now I want to turn to the second major trend that I think has knocked science off its 1960s pedestal: a series of social-intellectual movements. The first of these is the rise of postmodernism (and its attack on science), something usually said to have started in the post-World War II era (the 1950s). Wikipedia defines postmodernism as “an attitude of skepticism, irony or distrust toward grand narratives, ideologies and various tenets of Enlightenment rationality, including notions of human nature, social progress, objective reality and morality, absolute truth, and reason. Instead, it asserts that claims to knowledge and truth are products of unique social, historical or political discourses and interpretations, and are therefore contextual and constructed to varying degrees. Accordingly, postmodern thought is broadly characterized by tendencies to epistemological and moral relativism, pluralism, irreverence and self-referentiality.” If that doesn’t summarize the post-fact age we live in, I don’t know what does!
If I were to summarize it overly simplistically, postmodernism says “there are no authorities, anybody can think anything they want, and they’re all biased anyway”. The postmodern version of philosophy of science is captured by Feyerabend, who essentially said there is no objective truth, truth is relative, and scientists are just acting out games to enhance their power. More broadly, polls show a measurable decline since the 1950s in trust in any number of institutions. Journalism, government, and the clergy have all suffered. The notion of universal values manifesting in institutions pursuing the common good has taken a real hit. The counterculture movement of the 1960s and the Watergate exposure of government corruption in the 1970s are part of this. The way the internet enables you to find “your own people” and treats all sources of information as equal, and the resulting “living in a bubble” phenomenon, are part of this too. As a scientist and a person I have very mixed feelings about all this. Questioning authority, the free flow of information, and independent thinking are clearly good, but it is possible to go too far, and the current US political climate is showing this. In this postmodern trend, I don’t think science has been the primary target (aside from the attacks of some academics like Feyerabend). Science has just been a bystander casualty. But when all institutions (and individuals) are suspect, and everybody’s opinion is equally valid and equally biased, it is very hard for science to play the role it once did in policy.
And of course the fact that businesses dislike some forms of science interacts powerfully with the overall postmodern trend. When companies feel disadvantaged by scientific results, it’s much easier to use the favored tactic of sowing uncertainty and doubt in a post-fact, every-person-their-own-opinion world than it was in the “trust anybody in a white coat” world. This is playing out very clearly in, e.g., climate change denial.
And not to let academia off the hook: I think it is fair to say that universities have changed from places that seek the truth and teach students how to seek the truth into places that seek research dollars and student tuition dollars. This has led to a worrying kowtowing to students, because challenging students’ ideas might turn away their dollars. Others would, with justification, pinpoint trends in social media and traditional media – if people only exchange ideas with people who agree with them and don’t challenge them, it is a formula for overconfidence.
So in summary: science reached a high point in societal esteem around the 1960s and has been gradually sliding off the pedestal ever since. Part of that is a switch from science being purely a money maker for business to sometimes being a trouble maker; business responded strongly with a tactic centered on creating doubt. Another part is a broader societal trend toward postmodern denial of authority and expertise, replaced by absolute subjectivism and hence the equality of all opinions. One can invoke other factors too, like the changing nature of universities, the internet, and the proliferation of TV channels. But however you dissect it, the societal esteem for science has suffered both from broad societal trends and from a specific change in its relationship to business. And the key point is that this has been coming for decades, indeed more than half a century. It’s not surprising scientists are frustrated and angry at the new role we’ve been given (or more precisely, at the great role we have lost). This frustration and longing for a return to past glory is eminently understandable. But the trends are so broad-brush and long-term that I think it would be naive to assume they can be reversed. Instead, we scientists are going to have to steer somewhere new and define a new relationship to society and policy.
So what do you think? Is the changing relationship of science with business a key part of the story? Are postmodernist trends, enabled by media trends, a key part of the story? Is this a 50-year trend? And if so, is it reversible?
* Although I hasten to add that I am not opposed to the high-tech, big-business battle against cancer either.