Volume 30 Number 12
Don't Get Me Started - Alan Turing and Ashley Madison
By David Platt | November 2015
What on earth could these names have to do with each other? One symbolizes our highest intellect, the other our most primal animal instincts. Are they not polar opposites? As always, Plattski notices similarities where no one else does.
No one on the Internet could miss the news that AshleyMadison.com (a Web site claiming to connect consenting adults for extra-marital affairs) got hacked and its list of subscribers published. Many users stupid enough to use their real identities had to scramble for cover. See, for example, the article “My Pastor Is on the Ashley Madison List,” by Ed Stetzer in Christianity Today (bit.ly/1NOHjsn). The rest of the world chuckled with schadenfreude, as you, dear reader, are doing right now. (No? Liar.)
But the real surprise is that, according to Annalee Newitz at Gizmodo.com, most of the male subscribers were corresponding not with actual female people, but with automated conversation bots. Once a user set up a free account, these bots initiated conversations and sent messages, trying to tempt the guy to upgrade to a paid membership. Writes Newitz (bit.ly/1OULHqw): “Ashley Madison created more than 70,000 female bots to send male users millions of fake messages, hoping to create the illusion of a vast playland of available women.” (Now you’re chuckling, and don’t tell me you’re not.)
The notion of a conversation bot that can fool human subjects has been around for a while. You’ve probably seen the phony psychiatrist program Eliza, first developed by Joseph Weizenbaum in 1966, which bluffed its way through the role of a psychotherapist simply by matching patterns in its input and changing verb conjugations. (Patient: “I am unhappy.” Eliza: “Did you come to me because you are unhappy?” One wonders exactly what percentage of human-delivered psychotherapy works this same way.)
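Eliza's trick, pattern matching plus pronoun reflection, fits in a few lines. Here's a minimal sketch in Python with a couple of hypothetical rules of my own, not Weizenbaum's original script:

```python
import re

# Swap first- and second-person words so reflected fragments read naturally.
REFLECTIONS = {"i": "you", "am": "are", "my": "your",
               "you": "I", "your": "my", "me": "you"}

# Each rule: a pattern to match in the input, and a response template
# that reuses the captured fragment. The last rule is a catch-all.
RULES = [
    (re.compile(r"i am (.*)", re.I), "Did you come to me because you are {0}?"),
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"(.*)", re.I), "Please tell me more about that."),
]

def reflect(fragment: str) -> str:
    """Change 'I am' to 'you are', 'my' to 'your', and so on."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def eliza(statement: str) -> str:
    """Return the response for the first rule whose pattern matches."""
    statement = statement.rstrip(".!?")
    for pattern, template in RULES:
        match = pattern.match(statement)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
```

Feed it "I am unhappy." and it dutifully asks, "Did you come to me because you are unhappy?" No understanding required, which was exactly Weizenbaum's point.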
We geeks all know the Turing test, proposed by Turing himself to detect artificial intelligence. Have a conversation over a terminal, and if you can’t tell whether you’re talking to a human or a program, then that program is intelligent. In a 2014 contest sponsored by the Royal Society of London, the chatterbot Eugene Goostman managed to fool 33 percent of the judges in a 5-minute conversation.
But the calculus changes when the human subjects want to be deceived. As Weizenbaum wrote in his original 1966 paper describing the Eliza experiment, subjects actively participated in their own deception (bit.ly/1G6UAGb). The user’s belief that the bot is a real person, Weizenbaum wrote, “serves the speaker to maintain his sense of being heard and understood. The speaker further defends his impression (which even in real life may be illusory) by attributing to his conversational partner [the computer] all sorts of background knowledge, insights and reasoning ability.” In short, some subjects refused to believe Eliza was faking her responses, even after they were told.
The same applies to Ashley Madison. The customers wanted to find something enticing, so find it they did. Hooray! We’ve passed the Turing test, at least for desperate guys being told what they sorely want to hear.
The Ashley Madison bots communicated by written messages. But with large-scale voice recognition now in the mainstream (see Siri and Cortana), I foresee the tipping point at which bots replace human adult phone workers. Imagine: Customer (speaking on phone): “I have this thing for waffle irons.” Audio chatterbot: “Did you come to me because you have this thing for waffle irons?” You’ll be able to choose the bot’s output voice, as you can today for your GPS directions—how about phone calls from Taylor Swift, Kathleen Turner or James Earl Jones? French accent? Mais oui. One wonders how long until supercomputers provide on-demand real-time video synthesis, as the supercomputer Mycroft “Mike” Holmes did with his synthesized human video persona in the Robert Heinlein novel, “The Moon Is a Harsh Mistress.” The 2013 movie “Her,” in which a user falls in love with his digital assistant, shows the logical progression of this idea. (I could develop it toward peripheral devices, but my editor won’t let me. “Hold it right there,” he says. To which I reply, “Precisely.”)
Let’s face it: If we’re looking for intelligence, using one’s real identity on an inherently shady site signals precisely the opposite. Maybe we should think of the Ashley Madison leak as a reverse Turing test.
David S. Platt teaches programming .NET at Harvard University Extension School and at companies all over the world. He’s the author of 11 programming books, including “Why Software Sucks” (Addison-Wesley Professional, 2006) and “Introducing Microsoft .NET” (Microsoft Press, 2002). Microsoft named him a Software Legend in 2002. He wonders whether he should tape down two of his daughter’s fingers so she learns how to count in octal. You can contact him at rollthunder.com.