Must Science Murder Its Darlings?
"The great tragedy of Science," wrote Thomas Henry Huxley, is "the slaying of a beautiful hypothesis by an ugly fact."
Of course, part of what makes science so powerful is its very willingness to let its darlings fall by the wayside. New facts come in, new ideas emerge, and once-valued notions make their way from science texts to history books.
But it's rarely a single fact that does a beautiful theory in. More often it's the gradual accumulation of findings that don't sit quite right, the slow reconceptualization of what we thought we knew and the piecemeal emergence of alternative ways of making sense of some phenomenon. Most theories go out with a whimper, not with a bang.
So it's no trivial matter to know when a scientific idea is ripe for retirement — the very task that John Brockman set before dozens of thinkers for the 2014 "Annual Question" at Edge.org, an online "salon" of sorts that has posed a yearly question since 1998. Past questions have included "How is the Internet changing the way you think?" in 2010, and "What should we be worried about?" just last year.
This year's question, "What scientific idea is ready for retirement?", generated over 170 responses from scientists, artists, scholars and public figures, including Alan Alda, Patricia Churchland, Paul Davies, Richard Dawkins, Daniel C. Dennett, Jared Diamond and Rebecca Goldstein (not to mention two of 13.7's own: me and Marcelo).
Responses spanned the sciences, with many questioning aspects of science itself. Writer Brian Christian, for example, took on current practices in scientific publishing, while others challenged a host of scientific ideals and assumptions: that scientific discovery should be valued over the lives of scientists (Kathryn Clancy), that scientists should stick to science (Buddhini Samarasinghe) and — inversely — that non-scientists can't do science (Kate Mills), that the hallmark of scientific claims is their falsifiability (Sean Carroll) and that there is "a" (singular) scientific method (Melanie Swan).
My own response questioned the reductionist idea that the mind is just the brain, arguing that explanations for behavior and mental states shouldn't be couched in strictly neuroscientific terms. In a related response, Joel Gold and Ian Gold urged that we do away with the idea that mental illness is nothing but brain illness. Instead, we should recognize that "understanding and treating brain disorders sometimes has to move outside the skull."
Our responses were just two of many to focus on psychology and the cognitive sciences. Quite a few essays, for example, took on formulations of nature versus nurture when it comes to human capacities and behaviors, an idea that — according to Timo Hannay — "truly deserves a bullet in the back of its head."
Steven Pinker questioned every term in the "genes + environment = behavior" equation. Alison Gopnik argued that when you take a close look, questions about what's innate can quickly become incoherent:
Of course, for a long time, people have pointed out that nature and nurture must interact for a particular trait to develop. But several recent scientific developments challenge the idea of innate traits in a deeper way. It isn't just that it's a little of both, some mix of nurture and nature, but that the distinction itself is fundamentally misconceived.
But other responses were directed more strongly towards one particular pole of the nature-nurture dichotomy, with Daniel L. Everett urging us to renounce appeals to "instinct" and "innate" knowledge on the one hand and Kiley Hamlin highlighting the dangers of assuming a blank slate when it comes to moral beliefs and behaviors, on the other.
In two independent essays broaching another aspect of cognition, Sarah-Jayne Blakemore and Stephen M. Kosslyn urged readers to abandon scientifically unfounded claims about the left brain versus the right brain, a topic I recently covered here at 13.7.
Other notable contributions about the mind by psychologists included a piece by June Gruber arguing that "negative" emotions aren't always bad, nor "positive" emotions always good, and one by Tom Griffiths (full disclosure: he's my husband) arguing that bias isn't always a bad thing when it comes to human or artificial intelligence. Then there was a co-authored response by psychologist Laurie Santos and philosopher Tamar Gendler on the gulf between knowing about cognitive biases and avoiding them. Susan Fiske and Jamil Zaki wrote on human warmth and altruism.
One of the most interesting themes to emerge from this year's collection of responses was a meta-scientific point about the very nature of scientific progress and its relationship to human cognition. Many respondents argued that science shouldn't simply murder its past darlings — that, in fact, scientific ideas are often useful even when we know them to be false, in part because they foster a certain kind of understanding, or provide a baseline against which we can understand a complex system:
The best scientific explanation of a phenomenon depends on where real human beings find comprehensible patterns in the universe, and not how the universe is constituted. (Joel Gold & Ian Gold)
In economics there are certainly many theories, hypotheses and models that are badly flawed descriptions of the behavior of economic agents, so one might think that I would have many nominations for ideas that should be given funerals. But I don't. That is because most of these theories, while demonstrably poor descriptions of reality, are extremely useful as theoretical baselines. As such, it would be a mistake to declare these theories dead... Let's keep these and many other wrong theories and hypotheses alive, but remember they are just hypotheses, not facts. (Richard H. Thaler)
Beware of arrogance! Retire nothing! A great and rich scientific tradition should hang onto everything it has. Truth is not the only measure. There are ways of being wrong that help others to be right. (Ian McEwan)
So while it can be a useful exercise to catalog scientific ideas with significant failings, it might be too soon to toll the funeral bells. Ideas that actually impede scientific progress may well deserve an early burial, but the vast majority of scientific ideas — guilty only of divergence from some ugly facts — can have their uses. As Ian McEwan concludes:
Science should look to literature and maintain a vibrant living history as a monument to ingenuity and persistence. We won't retire Shakespeare. Nor should we Bacon. (Ian McEwan)
You can keep up with more of what Tania Lombrozo is thinking on Twitter: @TaniaLombrozo