
The most human computer?

By Dennis Baron

Each year there’s a contest at the University of Exeter to find the most human computer. Not the computer that looks most like you and me, or the computer that can beat all comers on Jeopardy, but the one that can convince you that you’re talking to another human being instead of a machine.

To be considered most human, the computer has to pass a Turing test, named after the British mathematician Alan Turing, who suggested that if someone talking to another person and to a computer couldn’t tell which was which, then that computer could be said to think. And thinking, in turn, is a sign of being human.

Contest judges don’t actually talk with the computers; instead, they exchange chat messages with a computer and a human volunteer, then try to identify which of the two is the human. A computer that convinces enough judges that it’s human wins the solid gold Loebner medal and the $100,000 prize that accompanies it, or at least its programmer does.

Here are some excerpts from the 2011 contest rules to show how the test works:

Judges will begin each round by making initial comments with the entities. Upon receiving an utterance from a judge, the entities will respond. Judges will continue interacting with the entities for 25 minutes. At the conclusion of the 25 minutes, each judge will declare one of the two entities to be the human.

At the completion of the contest, Judges will rank all participants on “humanness.”

If any entry fools two or more judges comparing two or more humans into thinking that the entry is the human, the $25,000 and Silver Medal will be awarded to the submitter(s) of the entry and the contest will move to the Audio Visual Input $100,000 Gold Medal level.

Notice that both the computer entrants and the human volunteers are referred to in these rules as “entities,” a word calculated to eliminate any pro-human bias among the judges, not that such a bias exists in the world of Artificial Intelligence. In addition, the computers are called “participants,” which actually gives a bump to the machines, since it’s a term that’s usually reserved for human contestants. Since the rules sound like they were written by a computer, not by a human, passing the Turing test should be a snap for any halfway decent programmer.
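In fact, the round the rules describe is simple enough to sketch in a few lines of Python. Here is a minimal toy version; the names, canned replies, and coin-flip judge are all invented for illustration, not the actual contest software:

```python
import random

# A sketch of the blind exchange the rules describe: a judge trades messages
# with two unlabeled "entities," then must declare one of them the human.
# Everything here is a hypothetical stand-in for the real contest machinery.

def machine_entity(utterance: str) -> str:
    """Stand-in chatbot: deflects with a canned, vaguely human reply."""
    return random.choice([
        "That's an interesting way to put it.",
        "Why do you ask?",
        "Hmm, I'd have to think about that.",
    ])

def human_entity(utterance: str) -> str:
    """Stand-in for the human volunteer, reduced here to another function."""
    return f"Honestly? I'm not sure what to say about {utterance!r}."

def run_round(judge_questions) -> bool:
    """One round: shuffled labels, a fixed exchange, one verdict."""
    entities = [machine_entity, human_entity]
    random.shuffle(entities)           # the judge can't see which is which
    for q in judge_questions:          # stands in for 25 minutes of chat
        for label, entity in zip("AB", entities):
            print(f"{label}: {entity(q)}")
    verdict = random.randrange(2)      # a real judge reads the transcript
    return entities[verdict] is human_entity

if __name__ == "__main__":
    correct = run_round(["What did you have for breakfast?"])
    print("Judge picked the actual human:", correct)
```

The real contest swaps the canned functions for a chatbot and a live volunteer at a keyboard, and the coin flip for 25 minutes of judgment.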

But even though these Turing competitions have been staged since 1991, when inventor Hugh Loebner first offered the Loebner medal for the most human computer, no computer has yet claimed the gold, though one came close to the silver a few years ago. To keep things interesting, each year the most human computer, the one that comes closest to fooling the judges into thinking that they’re chatting with a human, gets a bronze medal and a $4,000 prize, a sort of Miss Congeniality award for the AI set.

When Alan Turing first proposed his test in 1950, “electronic brains” were popularly viewed not as high-speed calculators but as the stuff of science fiction. Even the computer that was supposed to generate questions for the contestants on the 1950s TV quiz show The $64,000 Question turned out to be fictional: it was nothing but an IBM card sorter, an overpriced shuffling machine that generated nothing, since many of the show’s human contestants had been given the answers in advance.

Computers have come a long way since HAL, the computer in 2001: A Space Odyssey, tried to hijack its own spaceship, but although computers have yet to carry off the honors on Turing’s talk-like-a-human day, talking computers greet us at every turn, answering corporate phones (“Listen carefully, because our options have changed”) and offering technical support (“I’m sorry, I didn’t get that. Could you say or touch your account number again?”), announcing elevator floors (“Thirteenth floor. Mind the doors, please”), and giving GPS directions (“Turn left in 200 ft. Turn left in 100 ft. Turn left now. Please turn around and turn right in 200 ft.”). Everywhere we turn, it seems, there’s a computer auditioning for a Turing test.
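Auditioning, but with a thin script: these voices do something far simpler than thinking, matching keywords against canned replies, much as Joseph Weizenbaum’s ELIZA did back in 1966. A toy sketch, with patterns and replies invented for illustration:

```python
import re

# A toy keyword matcher in the spirit of the scripted voices above (phone
# trees, GPS prompts). The rules here are made up; real systems are bigger,
# but often not much deeper.

RULES = [
    (re.compile(r"\boperator\b", re.I), "I'd be happy to connect you."),
    (re.compile(r"\baccount\b", re.I), "Please say or touch your account number."),
    (re.compile(r"\bleft\b", re.I), "Turn left in 200 feet."),
]

def respond(utterance: str) -> str:
    for pattern, reply in RULES:
        if pattern.search(utterance):
            return reply
    return "I'm sorry, I didn't get that."    # the universal fallback

print(respond("I need to check my account balance"))   # matches a script
print(respond("What do you think of Alan Turing?"))    # falls through
```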

Maybe these computers can’t actually think, but even without independent thought they’ve become nearly human enough to take over our lives: we can’t drive our cars without computers, or do the laundry, cook our food or refrigerate it, tune our televisions or record shows for later viewing. We can’t make phone calls without computers. We can’t even write about computers without computers.

OK, maybe we’re not completely fooled by computers referring to themselves in the first person and offering to connect us with the next available operator (“I’d be happy to connect you. Please enter your four-digit PIN number, followed by the pound sign. If you would rather continue using our automated system . . . .”). And maybe “the most human computer” sounds more like a children’s story than a bronze-medal-worthy string of code (“Ned, Sally, and their dog Spot were skipping by the ATM when all of a sudden it started to speak. ‘Hi,’ said the ATM. ‘Holy s***,’ said Ned. Spot yelped and ran away.”).

So, while programmers scramble to develop a computer that can pass the real Turing test, we can look for the most human computer another way: place an infinite number of monkeys at an infinite number of keyboards, and if we wait long enough, eventually one of them will trick a panel of judges into thinking they’re communicating with a human by typing out, not Hamlet, but HamBasic, on the computer screen. And as we wait, we can watch our own prose and that of our computers continue to converge, so that, as computers sound more and more like us, we begin to sound more like them as well. We are experiencing unusually heavy user traffic today, so please be patient. Don’t log off. Your request will be handled in the order in which it was received.
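For anyone inclined to wait on the monkeys, the arithmetic is easy to check, and not encouraging. A quick sketch, assuming a hypothetical 27-key keyboard:

```python
# Back-of-the-envelope monkey arithmetic: on a 27-key keyboard (26 letters
# plus a space), a random keystroke matches a target character with
# probability 1/27, so a specific n-character string shows up roughly once
# every 27**n keystrokes (ignoring overlap effects).

KEYS = 27

def expected_keystrokes(target: str) -> float:
    """Rough expected wait, in keystrokes, before target first appears."""
    return float(KEYS) ** len(target)

print(f"{expected_keystrokes('hambasic'):.1e}")            # ~2.8e11
print(f"{expected_keystrokes('to be or not to be'):.1e}")  # ~5.8e25
```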

Dennis Baron is Professor of English and Linguistics at the University of Illinois. His book, A Better Pencil: Readers, Writers, and the Digital Revolution, looks at the evolution of communication technology, from pencils to pixels. You can view his previous OUPblog posts here or read more on his personal site, The Web of Language, where this article originally appeared. Until next time, keep up with Professor Baron on Twitter: @DrGrammar.

