ELIZA effect


The ELIZA effect, in computer science, is the tendency to unconsciously assume computer behaviors are analogous to human behaviors; that is, anthropomorphisation.

Overview

In its specific form, the ELIZA effect refers only to "the susceptibility of people to read far more understanding than is warranted into strings of symbols—especially words—strung together by computers". A trivial example of the specific form of the ELIZA effect, given by Douglas Hofstadter, involves an automated teller machine that displays the words "THANK YOU" at the end of a transaction. A casual observer might think that the machine is actually expressing gratitude; however, the machine is only printing a preprogrammed string of symbols.
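The point can be made concrete with a minimal sketch (in Python; the function name and details are hypothetical rather than drawn from Hofstadter): the closing message is a hard-coded constant, printed unconditionally, with no internal state corresponding to gratitude.

    # Hypothetical sketch: the "THANK YOU" is a preprogrammed string,
    # printed at the end of every transaction; nothing in the program
    # corresponds to gratitude.
    def complete_transaction(amount_requested: int) -> None:
        # ... dispense cash, update the ledger, eject the card ...
        print("THANK YOU")  # a fixed constant, not an expression of feeling

    complete_transaction(100)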
More generally, the ELIZA effect describes any situation where, based solely on a system's output, users perceive computer systems as having "intrinsic qualities and abilities which the software controlling the [output] cannot possibly achieve", or "assume that [outputs] reflect a greater causality than they actually do". In both its specific and general forms, the ELIZA effect is notable for occurring even when users of the system are aware of the determinate nature of the output produced by the system.

From a psychological standpoint, the ELIZA effect is the result of a subtle cognitive dissonance between the user's awareness of programming limitations and their behavior towards the output of the program. The discovery of the ELIZA effect was an important development in artificial intelligence, demonstrating the principle of using social engineering rather than explicit programming to pass a Turing test.

Origin

The effect is named for the 1966 chatterbot ELIZA, developed by MIT computer scientist Joseph Weizenbaum. When executing Weizenbaum's DOCTOR script, ELIZA parodied a Rogerian psychotherapist, largely by rephrasing the patient's replies as questions:

Human: Well, my boyfriend made me come here.
ELIZA: Your boyfriend made you come here?
Human: He says I'm depressed much of the time.
ELIZA: I am sorry to hear you are depressed.
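Weizenbaum's original implementation (written in MAD-SLIP) worked from ranked keywords with decomposition and reassembly rules; the following minimal Python sketch is an illustration rather than Weizenbaum's method, showing only the core reflection trick of swapping first- and second-person terms and echoing the statement back as a question.

    # Illustrative sketch only: swap person-markers and echo the
    # statement back as a question. ELIZA's real rules were far
    # richer (keyword ranks, decomposition/reassembly templates).
    REFLECTIONS = {
        "i": "you", "me": "you", "my": "your", "am": "are",
        "you": "I", "your": "my", "yours": "mine", "are": "am",
    }

    def reflect(statement: str) -> str:
        words = statement.rstrip(".!?").lower().split()
        swapped = [REFLECTIONS.get(word, word) for word in words]
        response = " ".join(swapped)
        return response[0].upper() + response[1:] + "?"

    print(reflect("My boyfriend made me come here."))
    # Output: Your boyfriend made you come here?

Even this toy version produces the superficially attentive replies that users of the DOCTOR script read understanding into.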
Though designed strictly as a mechanism to support "natural language conversation" with a computer, ELIZA's DOCTOR script was found to be surprisingly successful in eliciting emotional responses from users who, in the course of interacting with the program, began to ascribe understanding and motivation to the program's output. As Weizenbaum later wrote, "I had not realized... that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people." Indeed, ELIZA's code had not been designed to evoke this reaction in the first place. Observing these interactions, researchers found that users unconsciously assumed ELIZA's questions implied interest and emotional involvement in the topics discussed, even when they consciously knew that ELIZA did not simulate emotion.