How Alan Turing Set the Rules for Computing
On Saturday, British mathematician Alan Turing would have turned 100 years old. It is barely fathomable that none of the computing power surrounding us today existed when he was born.
But without Turing's work, computers as we know them today simply would not exist, Robert Kahn, co-inventor of the TCP/IP protocols that run the Internet, said in an interview. Absent Turing, "the computing trajectory would have been entirely different, or at least delayed," he said.
For while the idea of a programmable computer has been around since at least 1837 -- when English mathematician Charles Babbage formulated the idea of his analytical engine -- Turing was the first to do the difficult work of mapping out the physics of how the digital universe would operate. And he did it using a single (theoretical) strip of infinite tape.
"Turing is so fundamental to so much of computer science that it is hard to do anything with computers that isn't some way influenced by his work," said Eric Brown, who was a member of the IBM team that built the "Jeopardy"-winning Watson supercomputer.
A polymath of the highest order, Turing left a list of achievements stretching far beyond the realm of computer science. During World War II, he was instrumental in cracking German encrypted messages, allowing the British to anticipate Germany's actions and ultimately helping to win the war. Using his mathematical chops, he also developed ideas in the field of non-linear biological theory, which paved the way for chaos and complexity theories. And to a lesser extent he is known for his sad demise, an apparent suicide after being persecuted by the British government for his homosexuality.
But it may be computer science where his legacy will be the most strongly felt. Last week, the Association for Computing Machinery held a two-day celebration of Turing, with the computer field's biggest luminaries--Vint Cerf, Ken Thompson, Alan C. Kay--paying tribute to the man and his work.
Turing was not alone in thinking about computers in the early part of the past century. Mathematicians had been thinking about computable functions for some time. Turing drew from colleagues' work at Princeton University during the 1930s. There, Alonzo Church was defining the lambda calculus (which later formed the basis of the Lisp programming language). And Kurt Gödel was working on his incompleteness theorems and recursive function theory. Turing employed the work of both mathematicians to create a conceptual computing machine.
His 1936 paper, "On Computable Numbers, with an Application to the Entscheidungsproblem," described what would later become known as the Turing Machine, or a-machine as he called it. In the paper, he described a theoretical device that used an infinitely long piece of tape containing a series of symbols. A machine head could read the symbols on the tape as well as write symbols of its own. It could move about to different parts of the tape, one symbol at a time.
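The mechanics described above -- a tape, a read/write head, a current state, and a table of rules -- can be sketched in a few lines of Python. The rule table below is a hypothetical example, not one of Turing's: it increments a binary number whose rightmost digit starts under the head.

```python
# A minimal Turing machine simulator: tape, head, state, and a rule table.
# Each rule maps (state, symbol read) -> (symbol to write, move, next state).

def run_turing_machine(tape, rules, state="inc", blank="_"):
    tape = list(tape)
    head = len(tape) - 1          # start at the rightmost symbol
    while state != "halt":
        symbol = tape[head] if 0 <= head < len(tape) else blank
        write, move, state = rules[(state, symbol)]
        if head < 0:              # grow the tape on demand --
            tape.insert(0, blank) # Turing's tape is infinite
            head = 0
        elif head >= len(tape):
            tape.append(blank)
        tape[head] = write
        head += {"L": -1, "R": 1, "N": 0}[move]
    return "".join(tape).strip(blank)

# Hypothetical rule table: binary increment, carrying 1s into 0s leftward.
INCREMENT = {
    ("inc", "1"): ("0", "L", "inc"),   # 1 plus carry -> 0, keep carrying left
    ("inc", "0"): ("1", "N", "halt"),  # absorb the carry and stop
    ("inc", "_"): ("1", "N", "halt"),  # ran off the left edge: new digit
}

print(run_turing_machine("1011", INCREMENT))  # 11 -> 12, prints "1100"
```

Everything the machine does is local -- read one symbol, write one symbol, move one cell -- yet, as Turing showed, tables like this suffice to compute anything a modern computer can.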
"The Turing machine gave some ideas about what computation was, what it would mean to have a program," said James Hendler, a professor of computer science at the Rensselaer Polytechnic Institute and one of the instrumental researchers of the semantic Web. "Other people were thinking along similar lines, but Turing really put it in a formal perspective, where you could prove things about it."
On its own, a Turing Machine could never be implemented. For one, "infinite tapes are hard to come by," Kahn joked. But the concept proved invaluable for the ideas it introduced into the world. "Based on the logic of what was in the machine, Turing showed that any computable function could be calculated," Kahn said.
Today's computers, of course, use binary logic. A computer program can be thought of as an algorithm or set of algorithms that a compiler converts into a series of 1s and 0s. In essence, they operate much as the Turing Machine does, with finite memory standing in for the infinite tape.
"It is generally accepted that the Turing Machine concept can be used to model anything a digital computer can do," explained Chrisila Pettey, who heads the Department of Computer Science at Middle Tennessee State University.
Thanks to Turing, "any algorithm that manipulates a finite set of symbols is considered a computational procedure," Pettey said in an interview via e-mail.
Conversely, anything that cannot be modeled by a Turing Machine cannot run on a computer, which is vital information for software design. The same body of theory also flags problems that are computable in principle but intractable in practice. "If you know that your problem is intractable, and you don't have an exponential amount of time to wait for an answer, then you'd better focus on figuring out a way to find an acceptable alternative instead of wasting time trying to find the actual answer," Pettey said.
"It's not that computer scientists sit around proving things with Turing Machines, or even that we use Turing Machines to solve problems," Pettey said. "It's that how Turing Machines were used to classify problems has had a profound influence on how computer scientists approach problem solving."
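Pettey's point about settling for an acceptable alternative can be illustrated with a small sketch. The problem choice here is ours, not hers: an exact subset-sum search examines all 2^n subsets, so for large inputs a fast heuristic that finds a good-enough answer is often the practical choice.

```python
from itertools import combinations

def exact_subset_sum(nums, target):
    """Brute force: tries all 2^n subsets -- intractable as n grows."""
    best = None
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            s = sum(combo)
            if s <= target and (best is None or s > sum(best)):
                best = combo
    return list(best)

def greedy_subset_sum(nums, target):
    """Heuristic: fast, but returns an acceptable answer, not always the best."""
    chosen, total = [], 0
    for x in sorted(nums, reverse=True):
        if total + x <= target:
            chosen.append(x)
            total += x
    return chosen

nums = [8, 6, 7, 5, 3]
print(exact_subset_sum(nums, 19))   # optimal: [8, 6, 5], sum 19
print(greedy_subset_sum(nums, 19))  # fast approximation: [8, 7, 3], sum 18
```

On five numbers the difference is trivial; on five hundred, the exact search would never finish, while the greedy pass still returns a usable answer instantly. That trade-off is exactly the kind of design decision the theory of computation equips programmers to make.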