Before I get too far ahead of myself, however (and electronic computers are quite far ahead), I will begin at the turn of the 20th century. A remarkably brilliant man named David Hilbert compiled a list of 23 unsolved mathematical problems in 1900. This list in many ways served as a sketched outline to guide theoretical research for the coming century, with a number of the problems remaining unsolved even today. Although many of Hilbert’s problems are fascinating in their own right (I’m sure all of them probably are, but I don’t actually understand a few of them), the one which is relevant to the discussion at hand is his tenth problem. Hilbert's tenth problem is ordinarily written as something along the lines of
Find an algorithm to determine if a given Diophantine equation with integer coefficients has an integer solution.

However, this is not actually how Hilbert posed the problem, as the word 'algorithm' had yet to enter usage (Webster's New World Dictionary, for example, did not include the word 'algorithm' prior to 1957). The actual original statement of the problem was something along the lines of:
Given a Diophantine equation with any number of unknown quantities and with rational integral numerical coefficients: To devise a process according to which it can be determined in a finite number of operations whether the equation is solvable in rational integers.

Before a number of my readers balk at the term 'Diophantine equation', it simply means an indeterminate polynomial equation (for example, x + y = 5 is a Diophantine equation, since there is no single assignment of values that satisfies it; rather, the solutions form the line x = 5 - y).
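To get a concrete feel for why Hilbert's tenth problem is hard, consider the naive approach: simply try integer tuples of increasing size. Here is a small illustrative sketch (my own example, not from any historical source) of such a brute-force search. Notice the asymmetry it exposes: if a solution exists, the search will eventually find it, but if none exists, the unbounded search never halts, which is exactly why Hilbert demanded a process that decides the question in a *finite* number of operations.

```python
from itertools import count, product

def find_integer_solution(poly, num_vars, max_radius=None):
    """Search for an integer solution to poly(...) == 0 by trying
    tuples in shells of increasing absolute value.  If a solution
    exists, this eventually finds one; if none exists and no bound
    is given, the search runs forever."""
    radii = count() if max_radius is None else range(max_radius + 1)
    for r in radii:
        for candidate in product(range(-r, r + 1), repeat=num_vars):
            # Only test tuples whose largest entry has absolute value r,
            # so each tuple is examined exactly once.
            if max(map(abs, candidate)) == r and poly(*candidate) == 0:
                return candidate
    return None

# x + y - 5 == 0 has infinitely many integer solutions; one is found quickly.
print(find_integer_solution(lambda x, y: x + y - 5, 2))   # → (2, 3)

# x^2 + 1 == 0 has no integer solutions; we must bound the search to halt.
print(find_integer_solution(lambda x: x * x + 1, 1, max_radius=50))  # → None
```

This search is a semi-decision procedure: it can confirm solvability but can never confirm unsolvability, and Matiyasevich's theorem (discussed below) shows that no clever improvement can close that gap in general.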
As I mentioned, at the time that Hilbert originally posed this problem the term 'algorithm' was not in use. An algorithm is essentially a process or sequence of instructions with a number of specific properties: it consists of a finite sequence of instructions, and it is well defined (at no point during execution is it unclear what to do next). Informally, algorithms have been around for centuries. One of the most famous (and one that is likely familiar to any computer science student) is Euclid's algorithm for determining the greatest common divisor of two numbers. However, prior to Hilbert's problems, very few people had attempted to analyse the notion of abstract processes. Mathematicians and scientists developed and utilized specific mathematical methods for specific problems, but, for the most part, each method was developed and investigated within the context of its application alone. What changed all of this was that people began to question whether the process that Hilbert's tenth problem asked for actually existed. The notion that some problems might be unsolvable was fairly ground-breaking, and it provided the impetus to explore the general notion of solvability.
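Euclid's algorithm, mentioned above, makes a nice example of both defining properties: it is a finite, unambiguous sequence of steps. In its standard modern formulation, it repeatedly replaces the pair of numbers with the smaller number and the remainder of dividing the larger by the smaller, until the remainder is zero:

```python
def gcd(a, b):
    """Euclid's algorithm: the gcd of a and b equals the gcd of b
    and a mod b, so we shrink the pair until the remainder is zero."""
    while b != 0:
        a, b = b, a % b
    return abs(a)

print(gcd(1071, 462))  # → 21
```

Each iteration strictly decreases the second number, so the loop is guaranteed to terminate after finitely many steps; that termination argument is precisely what the "finite number of operations" clause in Hilbert's tenth problem demands of a general process.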
Although the actual proof that no algorithm can solve Hilbert's tenth problem did not come until 1970, with the publication of Yuri Matiyasevich's doctoral dissertation, the inkling that it might be unsolvable began much earlier, particularly in the field of logic with Kurt Gödel's famous incompleteness theorems. Motivated by the growing questions of solvability and mathematical truth, in 1928 Hilbert proposed another, more general algorithmic challenge called the Entscheidungsproblem (decision problem). The Entscheidungsproblem asks for a procedure that takes a formal language and a mathematical statement in that language as input and determines whether or not the statement is provable. The problem piqued the interest of both Alonzo Church and Alan Turing, motivating the two of them to independently formalize the concept of calculability in the mid-1930s. Both showed that the Entscheidungsproblem could not be solved. The models of computation that Church and Turing used (the λ-calculus and Turing machines, respectively) were subsequently shown to be equivalent, and thus the field of computability was born.
I will discuss the actual material of computability in the next post on this topic, but I thought an historical overview might help illustrate the motivations behind the field as well as provide a somewhat less painful introduction for those not familiar with the terms. I am not sure I was entirely successful with the latter aim, but feel free to leave any comments or send me questions via email for parts that are not clear.