Setting The Record Straight For Alan Turing

Alan M. Turing and colleagues work on the Ferranti Mark I Computer in the United Kingdom in 1951.
Science & Society Picture Library via Getty Images

Imagine, for a moment, that Albert Einstein's greatest contributions were kept secret at the highest levels of government. Imagine, for a moment, that while still relatively young, Einstein was prosecuted, shamed and driven to suicide for the inclinations of his affections. Imagine, for a moment, that in the wake of the secrecy, the shame and the suicide, you never knew Albert Einstein's name.

Seems crazy, doesn't it? In many ways, however, that narrative is the story of Alan Turing. Thankfully, it's a story that is finally getting aired in popular culture through the new film The Imitation Game.

Until relatively recently, most folks wouldn't come across Turing's name unless they had a certain kind of computational orientation. "Turing" doesn't ring the same bells as Einstein, Newton, Darwin or even Heisenberg, Watson and Crick. But, without doubt, Alan Turing belongs on that list of science giants.

It's not just that Turing's work was worthy of a Nobel Prize. He went far beyond that. Turing possessed an epoch-making genius of the highest order — and his impact on human civilization is in line with the heights that kind of genius yields. That's why Turing's omission from everyone's list of super-scientists is so galling. But worse still are the circumstances behind that omission, driven by a confluence of two remarkable factors — an accident of history and pure narrow-minded fear.

The accident was World War II. To be more explicit, it was the fact that Turing played a decisive role in winning that war through his hyper-mega-top-secret work in cryptography. Turing's work deciphering German codes was kept utterly invisible to the rest of the world after the conflict ended. Thus, the man who helped shave two years off one of the bloodiest wars in history never became a household word (like "Oppenheimer" or "Patton" or "Eisenhower").

But the real tragedy of Turing rests with the "narrow-minded fear" part of the story. Alan Turing was gay at a time when this was a punishable crime in England. Arrested and shamed for his relationship with another man, Turing was forced into "chemical castration" in 1953. A year later, at the age of 41, Turing committed suicide.

We lost a lot in losing Alan Turing to homophobia. To be clear about just how much, let's take a few moments to understand what he managed to accomplish when he wasn't saving Western civilization from fascism.

In 1935, at the ripe age of 22, Turing devised the abstract mathematical background to define a computing machine. Now called a "Turing machine," it would sequentially respond to input and generate output in a step-by-step (i.e., algorithmic) fashion. Turing machines are the essence of every device with a chip in it that you have ever encountered. That's why Turing stands, essentially, at the head of the line when it comes to the creation of the digital age. He is the father of all computers.
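
To make that abstract idea a little more concrete, here is a minimal sketch of a Turing machine simulator in Python. It is an illustration of my own, not anything drawn from Turing's paper: a tape of symbols, a read/write head, and a table of rules saying, for each state and symbol, what to write, which way to move and which state to enter next. The example rules simply flip 0s and 1s until the machine reaches a blank and halts.

```python
# Minimal Turing machine sketch (illustrative only).
# rules maps (state, symbol) -> (symbol_to_write, move "L"/"R", next_state).

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    tape = list(tape)          # the tape, as a growable list of symbols
    head = 0                   # position of the read/write head
    for _ in range(max_steps):
        if state == "halt":
            break
        if head >= len(tape):  # grow the tape to the right as needed
            tape.append(blank)
        symbol = tape[head]
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
        if head < 0:           # grow the tape to the left as needed
            tape.insert(0, blank)
            head = 0
    return "".join(tape)

# Example machine: flip every bit, then halt at the first blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(rules, "10110"))   # prints "01001_"
```

Every device with a chip in it is, at bottom, an elaborate and very fast version of that step-by-step loop.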

Turing's interest in "thinking machines" continued after his early studies. Part of the triumph of his work during World War II was developing electromechanical devices to crack the supposedly un-crackable German Enigma coding machines. After the war he led Britain's effort to create a true "electronic" computer, and in his later theoretical work he took the first steps toward what is now called neural-network computing.

If initiating the digital revolution were all there was to Alan Turing, that would be enough to warrant his name being universally recognizable. But Turing's genius went deeper still. Turing didn't just define computers, he defined computing in its deepest, most cosmic sense.

Turing's work developing the idea of a Turing machine was part of a larger project: defining the very limits of mathematics. It's a story that begins with the Greeks millennia ago but comes into sharp focus in 1900 with the legendary German mathematician David Hilbert. Hilbert had set the agenda for his entire field by tasking mathematicians to express all mathematics in the form of a consistent, complete and decidable "formal" system.

As philosopher Jack Copeland explains it, Hilbert's goal was transcendent:

"A consistent system is one that contains no contradictions; 'complete' means that every true mathematical statement is provable in the system; and 'decidable' means that there is an effective method for telling, of each mathematical statement, whether or not the statement is provable in the system. Hilbert's point was that if we came to possess such a formal system, then ignorance would be banished from mathematics forever."

Since mathematics is the basis of all science, such a system would have given us — at root, at least — a model for perfect knowledge.

But in 1931, Kurt Gödel famously proved that no formal mathematical system rich enough to express arithmetic could be both consistent and complete. Then, in 1936, it was Turing who used his abstract Turing machines to show that decidability was impossible, too. Thus Turing (along with Alonzo Church) played a decisive role in showing that the most ancient human dream of perfect, absolute and axiomatic knowledge was exactly that — just a dream. Only a mind of the highest and most subtle understanding could have achieved such insight.
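
For readers who want the shape of Turing's argument, here is a rough sketch in Python. It assumes, purely for the sake of contradiction, a hypothetical function halts(program, data) that could always predict whether a given program eventually stops; the "paradox" construction below is my paraphrase of the proof, and the whole point is that no such decider can ever actually be written.

```python
# The shape of Turing's undecidability argument (the halting problem).
# Suppose a perfect decider existed:

def halts(program, data):
    """Hypothetical: return True iff program(data) eventually halts.
    Turing's proof shows no such function can exist."""
    raise NotImplementedError("no general halting decider can exist")

def paradox(program):
    # Built from the hypothetical decider: do the opposite of its prediction.
    if halts(program, program):
        while True:        # predicted to halt? then loop forever
            pass
    return                 # predicted to loop forever? then halt at once

# Now ask: does paradox(paradox) halt?
#  - If halts(paradox, paradox) were True, paradox(paradox) loops forever.
#  - If it were False, paradox(paradox) halts immediately.
# Either answer contradicts the decider, so no general halts() can exist —
# and mathematics cannot be fully "decidable," as Hilbert had hoped.
```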

In the modern era, Turing's essential understanding of computation and knowledge has found new applications in cosmology, of all places. Many physicists have come to see the universe as a whole as a kind of giant information-processing system — and have used Turing machine concepts in that work. Thus, even fundamental physics has come to embrace the importance of Turing's insights.

But we lost Alan Turing at age 41 because too many people were uncomfortable with whom he was inclined to love. Think about what might have happened if he had been around to witness the dawn of the personal computer or the Internet. How different might both those revolutions have been with his input?

There is a lot of discussion these days about the need for greater inclusion in the sciences. Whether it's homophobia or sexism or racism or even the barriers poor kids face in getting access to science education, it's become clear that biases and barriers still exist. Sometimes they are blatant and sometimes they are subtle. What Alan Turing's story shows us is just how much we stand to lose when we fail to understand that genius, or just a good scientist, can appear anywhere and in any form.


You can keep up with more of what Adam is thinking on Facebook and Twitter: @adamfrank4.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Adam Frank was a contributor to the NPR blog 13.7: Cosmos & Culture. A professor at the University of Rochester, Frank is a theoretical/computational astrophysicist and currently heads a research group developing supercomputer code to study the formation and death of stars. Frank's research has also explored the evolution of newly born planets and the structure of clouds in the interstellar medium. Recently, he has begun work in the fields of astrobiology and network theory/data science. Frank also holds a joint appointment at the Laboratory for Laser Energetics, a Department of Energy fusion lab.