
ELIZA

ELIZA is an early natural language processing computer program created from 1964 to 1966[1] at MIT by Joseph Weizenbaum.[2][3] Created to explore communication between humans and machines, ELIZA simulated conversation using a pattern-matching and substitution methodology that gave users an illusion of understanding on the part of the program, but it had no representation that could be considered to really understand what was being said by either party.[4][5][6] Whereas the ELIZA program itself was written (originally)[7] in MAD-SLIP, the pattern-matching directives that contained most of its language capability were provided in separate "scripts", written in a Lisp-like representation. The most famous script, DOCTOR, simulated a psychotherapist of the Rogerian school (in which the therapist often reflects the patient's words back to the patient)[8][9][10] and used rules, dictated in the script, to respond to user inputs with non-directional questions. As such, ELIZA was one of the first chatterbots ("chatbots" in modern usage) and one of the first programs capable of attempting the Turing test.[11]

[Figure: A conversation with ELIZA]

ELIZA's creator, Weizenbaum, intended the program as a method to explore communication between humans and machines. He was surprised and shocked that individuals, including his own secretary, attributed human-like feelings to the computer program.[3] Many academics believed that the program could positively influence the lives of many people, particularly those with psychological issues, and that it could aid doctors working on such patients' treatment.[3][12] While ELIZA was capable of engaging in discourse, it could not converse with true understanding.[13] Even so, many early users were convinced of ELIZA's intelligence and understanding, despite Weizenbaum's insistence to the contrary.[6]

The original ELIZA source code was considered lost from the 1960s onward, as it was not common at the time to publish articles that included source code. More recently, however, the MAD-SLIP source code was discovered in the MIT archives and published on various platforms, such as archive.org.[14] The source code is of high historical interest: it demonstrates not only the programming languages and techniques of that era, but also the beginnings of software layering and abstraction as a means of achieving sophisticated software.

Overview
Joseph Weizenbaum's ELIZA, running the DOCTOR script, was created to provide a parody of "the responses of a non-directional psychotherapist in an initial psychiatric interview"[15] and to "demonstrate that the communication between man and machine was superficial".[16] While ELIZA is best known for acting in the manner of a psychotherapist, its speech patterns are due to the data and instructions supplied by the DOCTOR script.[17] ELIZA itself examined the text for keywords, applied values to those keywords, and transformed the input into an output; the script that ELIZA ran determined the keywords, set their values, and set the rules of transformation for the output.[18] Weizenbaum chose to set the DOCTOR script in the context of psychotherapy to "sidestep the problem of giving the program a data base of real-world knowledge",[3] since in a Rogerian therapeutic situation the program had only to reflect back the patient's statements.[3] The algorithms of DOCTOR allowed for a deceptively intelligent response, which misled many individuals when they first used the program.[19]

Weizenbaum named his program ELIZA after Eliza Doolittle, a working-class character in George Bernard Shaw's Pygmalion. According to Weizenbaum, ELIZA's ability to be "incrementally improved" by various users made it similar to Eliza Doolittle,[18] who was taught to speak with an upper-class accent in Shaw's play.[8][20] Unlike the human character in Shaw's play, however, ELIZA is incapable of learning new patterns of speech or new words through interaction alone. Edits must be made directly to ELIZA's active script in order to change the manner in which the program operates.

Weizenbaum first implemented ELIZA in his own SLIP list-processing language, where, depending upon the user's initial entries, the illusion of human intelligence could appear or be dispelled through several interchanges.[2] Some of ELIZA's responses were so convincing that Weizenbaum and several others have anecdotes of users becoming emotionally attached to the program, occasionally forgetting that they were conversing with a computer.[3] Weizenbaum's own secretary reportedly asked him to leave the room so that she and ELIZA could have a real conversation. Weizenbaum was surprised by this, later writing: "I had not realized ... that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people."[21]

[Figure: A conversation between a human and ELIZA's DOCTOR script]

In 1966, interactive computing (via a teletype) was new. It was 15 years before the personal computer
became familiar to the general public, and three decades before most people encountered attempts at natural
language processing in Internet services like Ask.com or PC help systems such as Microsoft Office Clippit.
Although those programs included years of research and work, ELIZA remains a milestone simply because
it was the first time a programmer had attempted such a human–machine interaction with the goal of
creating the illusion (however brief) of human–human interaction.

At ICCC 1972, ELIZA was brought together with another early artificial-intelligence program, PARRY, for a computer-only conversation. While ELIZA was built to speak as a doctor, PARRY was intended to simulate a patient with schizophrenia.[22]

Design
Weizenbaum originally wrote ELIZA in MAD-SLIP for CTSS on an IBM 7094, as a program to make
natural-language conversation possible with a computer.[23] To accomplish this, Weizenbaum identified
five "fundamental technical problems" for ELIZA to overcome: the identification of key words, the
discovery of a minimal context, the choice of appropriate transformations, the generation of responses in the
absence of key words, and the provision of an editing capability for ELIZA scripts.[18] Weizenbaum solved
these problems and made ELIZA such that it had no built-in contextual framework or universe of
discourse.[17] However, this required ELIZA to have a script of instructions on how to respond to inputs
from users.[6]
ELIZA begins responding to a user's input by examining the text for a "keyword".[5] A "keyword" is a word designated as important by the acting ELIZA script, which assigns each keyword a precedence number, or RANK, set by the programmer.[13] If such words are found, they are put into a "keystack", with the keyword of highest RANK at the top. The input sentence is then manipulated and transformed as directed by the rule associated with the keyword of highest RANK.[18] For example, when the DOCTOR script encounters words such as "alike" or "same", it would output a message pertaining to similarity, in this case "In what way?",[4] because these words had high precedence numbers. This also demonstrates how certain words, as dictated by the script, can be manipulated regardless of contextual considerations, such as switching first-person and second-person pronouns and vice versa, as these too had high precedence numbers. Such words with high precedence numbers are deemed superior to conversational patterns and are treated independently of contextual patterns.
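
The keyword scan and keystack can be sketched in a few lines of Python; the keywords and RANK values below are illustrative stand-ins, not the DOCTOR script's actual table:

```python
# Minimal sketch of ELIZA's keyword scan and keystack. The keywords and
# RANK values are illustrative stand-ins, not DOCTOR's actual table.
RANKS = {"alike": 10, "same": 10, "my": 2, "you": 1}

def build_keystack(sentence):
    words = sentence.lower().strip(".!?").split()
    matches = [w for w in words if w in RANKS]
    # Highest RANK first; Python's stable sort keeps input order on ties.
    return sorted(matches, key=lambda w: -RANKS[w])

print(build_keystack("You think my dog and I are alike"))
# -> ['alike', 'my', 'you']
```

The transformation rule applied afterward is the one belonging to the keyword on top of the stack.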

Following this first examination, the next step is to apply an appropriate transformation rule, which has two parts: the "decomposition rule" and the "reassembly rule".[18] First, the input is reviewed for syntactic patterns in order to establish the minimal context necessary to respond. Using the keywords and other nearby words from the input, different decomposition rules are tested until an appropriate pattern is found. Using the script's rules, the sentence is then "dismantled" and arranged into sections of its component parts, as the "decomposition rule for the highest-ranking keyword" dictates. The example Weizenbaum gives is the input "You are very helpful", which is transformed to "I are very helpful" and then broken into (1) empty, (2) "I", (3) "are", (4) "very helpful". The decomposition rule has broken the phrase into four small segments that contain both the keywords and the information in the sentence.[18]

The decomposition rule then designates a particular reassembly rule, or set of reassembly rules, to follow when reconstructing the sentence.[5] The reassembly rule takes the fragments of the input that the decomposition rule had created, rearranges them, and adds in programmed words to create a response. Using Weizenbaum's example above, such a reassembly rule would take the fragments and apply them to the phrase "What makes you think I am (4)", which would result in "What makes you think I am very helpful?". This example is rather simple, since depending upon the decomposition rule the output could be significantly more complex and use more of the user's input. From this reassembly, ELIZA then sends the constructed sentence to the user as text on the screen.[18]
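
Weizenbaum's "You are very helpful" example can be traced with a minimal Python sketch; the pronoun-swap table and the single regex pattern are simplified assumptions, not the actual MAD-SLIP rules:

```python
import re

# Sketch of Weizenbaum's example: "You are very helpful" is first
# pronoun-swapped to "I are very helpful", then decomposed into four
# fragments and reassembled. The swap table and pattern are simplified
# assumptions, not the actual MAD-SLIP rules.
SWAPS = {"you": "I", "i": "you", "your": "my", "my": "your"}

def swap_pronouns(text):
    return " ".join(SWAPS.get(w.lower(), w) for w in text.split())

# Decomposition: (1) anything (2) "I" (3) "are" (4) anything
DECOMP = re.compile(r"(.*)\b(I) (are) (.*)")
REASSEMBLY = "What makes you think I am {4}?"

def respond(sentence):
    swapped = swap_pronouns(sentence.rstrip(".!?"))
    m = DECOMP.match(swapped)
    if m:
        # Pad with a dummy arg so {1}..{4} match the fragment numbers.
        return REASSEMBLY.format("", *m.groups())
    return "Please go on."

print(respond("You are very helpful."))
# -> What makes you think I am very helpful?
```

Fragment (4) is the only one the reassembly template reuses here; richer rules would splice in several fragments.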

These steps represent the bulk of the procedure ELIZA follows to create a response from a typical input, though there are several specialized situations that ELIZA/DOCTOR can handle. One Weizenbaum specifically wrote about was when there is no keyword. One solution was to have ELIZA respond with a remark that lacked content, such as "I see" or "Please go on".[18] The second was to use a "MEMORY" structure, which recorded recent prior inputs and, when an input contained no keywords, would use them to create a response referencing part of the earlier conversation.[24] This was possible due to SLIP's ability to tag words for other usage, which simultaneously allowed ELIZA to examine, store, and repurpose words for use in outputs.[18]
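
The MEMORY fallback can be sketched as a small queue of transformed earlier inputs. The trigger word "my" follows the DOCTOR convention, but the phrasing of the remembered remark is invented for illustration:

```python
from collections import deque

# Sketch of the MEMORY fallback: inputs containing "my" are stored in
# transformed form and replayed when a later input has no keywords.
# The phrasing of the remembered remark is invented here.
memory = deque()

def remember(sentence):
    words = sentence.lower().rstrip(".!?").split()
    if "my" in words:
        swapped = " ".join("your" if w == "my" else w for w in words)
        memory.append("Earlier you said " + swapped + ".")

def no_keyword_response():
    # Prefer a remembered remark; otherwise a contentless reply.
    return memory.popleft() if memory else "Please go on."

remember("My boyfriend made me come here")
print(no_keyword_response())
# -> Earlier you said your boyfriend made me come here.
print(no_keyword_response())
# -> Please go on.
```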

While these functions were all framed in ELIZA's programming, the exact manner in which the program dismantled, examined, and reassembled inputs is determined by the operating script. The script is not static: it can be edited, or a new one created, as necessary for the context at hand. This allows the program to be applied in multiple situations, including the well-known DOCTOR script, which simulates a Rogerian psychotherapist.[14]

A Lisp version of ELIZA, based on Weizenbaum's CACM paper, was written shortly after that paper's publication by Bernie Cosell.[25][26] A BASIC version appeared in Creative Computing in 1977 (although it was written in 1973 by Jeff Shrager).[27] This version, ported to many of the earliest personal computers, appears to have subsequently been translated into many other versions in many other languages. Shrager claims not to have seen either Weizenbaum's or Cosell's versions.

In 2021 Jeff Shrager searched MIT's Weizenbaum archives, together with MIT archivist Myles Crowley, and found files labeled Computer Conversations. These included the complete source code listing of ELIZA in MAD-SLIP, with the DOCTOR script attached. The Weizenbaum estate has given permission to open-source this code under a Creative Commons CC0 public-domain license. The code and other information can be found on the ELIZAGEN site.[26]

Another version of Eliza popular among software engineers is the version that comes with the default
release of GNU Emacs, and which can be accessed by typing M-x doctor from most modern Emacs
implementations.

Pseudocode

From Figure 15.5, Chapter 15 of Speech and Language Processing (third edition).[28]

function ELIZA GENERATOR(user sentence) returns response

    Let w be the word in sentence that has the highest keyword rank
    if w exists
        Let r be the highest-ranked rule for w that matches sentence
        response ← Apply the transform in r to sentence
        if w = 'my'
            future ← Apply a transformation from the 'memory' rule list to sentence
            Push future onto the memory queue
    else (no keyword applies)
        Either
            response ← Apply the transform for the NONE keyword to sentence
        Or
            response ← Pop the oldest response from the memory queue
    Return response
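
The pseudocode above can be fleshed out into a minimal runnable Python sketch; the rule table, templates, and memory phrasing are illustrative placeholders rather than the book's or DOCTOR's actual rules:

```python
import re
from collections import deque

# Runnable sketch of the generator above. The rule table (keyword ->
# (RANK, [(decomposition regex, reassembly template)])) and all
# phrasings are illustrative placeholders, not DOCTOR's actual rules.
RULES = {
    "my": (2, [(r".*\bmy (.*)", "Tell me more about your {0}.")]),
    "you": (1, [(r".*\byou are (.*)", "What makes you think I am {0}?")]),
}
NONE_RESPONSE = "Please go on."
memory = deque()

def eliza_generator(sentence):
    s = sentence.lower().rstrip(".!?")
    present = set(s.split())
    # w: matched keywords, highest RANK first
    keys = sorted((w for w in present if w in RULES),
                  key=lambda w: -RULES[w][0])
    for w in keys:
        for pattern, template in RULES[w][1]:
            m = re.match(pattern, s)
            if m:
                if w == "my":  # queue a follow-up for keyword-less turns
                    memory.append("Earlier you mentioned your "
                                  + m.group(1) + ".")
                return template.format(*m.groups())
    # No keyword applies: pop the oldest memory, else the NONE response
    return memory.popleft() if memory else NONE_RESPONSE

print(eliza_generator("My dog ran away"))  # -> Tell me more about your dog ran away.
print(eliza_generator("Hm"))               # -> Earlier you mentioned your dog ran away.
print(eliza_generator("Hm"))               # -> Please go on.
```

As in the pseudocode, the memory queue is consulted only when no keyword rule fires.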

Response and legacy


Lay responses to ELIZA were disturbing to Weizenbaum and motivated him to write his book Computer Power and Human Reason: From Judgment to Calculation, in which he explains the limits of computers and makes clear his view that anthropomorphic views of computers are a reduction of human beings, or of any life form for that matter.[29] In the independent documentary film Plug & Pray (2010), Weizenbaum said that only people who misunderstood ELIZA called it a sensation.[30]

The Israeli poet David Avidan, who was fascinated with future technologies and their relation to art, desired
to explore the use of computers for writing literature. He conducted several conversations with an APL
implementation of ELIZA and published them – in English, and in his own translation to Hebrew – under
the title My Electronic Psychiatrist – Eight Authentic Talks with a Computer. In the foreword he presented
it as a form of constrained writing.[31]

There are many programs based on ELIZA in different programming languages. In 1980 a company called "Don't Ask Software" created a version called "Abuse" for the Apple II, Atari, and Commodore 64 computers, which verbally abused the user based on the user's input.[32] For MS-DOS computers, some Sound Blaster cards came bundled with Dr. Sbaitso, which functions like the DOCTOR script. Other versions adapted ELIZA around a religious theme, such as ones featuring Jesus (both serious and comedic) and another Apple II variant called I Am Buddha. The 1980 game The Prisoner incorporated ELIZA-style interaction within its gameplay. In 1988 Brian Reffin Smith, a British artist and friend of Weizenbaum, created two art-oriented ELIZA-style programs written in BASIC, one called "Critic" and the other "Artist", running on two separate Amiga 1000 computers, and showed them at the exhibition "Salamandre" in the Musée du Berry, Bourges, France. The visitor was supposed to help them converse by typing into "Artist" what "Critic" said, and vice versa. The secret was that the two programs were identical. GNU Emacs formerly had a psychoanalyze-pinhead command that simulated a session between ELIZA and Zippy the Pinhead.[33] The Zippyisms were removed due to copyright issues, but the DOCTOR program remains.

ELIZA has been referenced in popular culture and continues to be a source of inspiration for programmers
and developers focused on artificial intelligence. It was also featured in a 2012 exhibit at Harvard
University titled "Go Ask A.L.I.C.E.", as part of a celebration of mathematician Alan Turing's 100th
birthday. The exhibit explores Turing's lifelong fascination with the interaction between humans and
computers, pointing to ELIZA as one of the earliest realizations of Turing's ideas.[1]

ELIZA won a 2021 Legacy Peabody Award.

In popular culture
In 1969, George Lucas and Walter Murch incorporated an Eliza-like dialogue interface in their screenplay
for the feature film THX-1138. Inhabitants of the underground future world of THX, when stressed, would
retreat to "confession booths" and initiate a one-sided Eliza-formula conversation with a Jesus-faced
computer who claimed to be "OMM".

ELIZA influenced a number of early computer games by demonstrating additional kinds of interface
designs. Don Daglow wrote an enhanced version of the program called Ecala on a DEC PDP-10
minicomputer at Pomona College in 1973 before writing the computer role-playing game Dungeon (1975).

The 2011 video game Deus Ex: Human Revolution and its 2016 sequel Deus Ex: Mankind Divided feature an artificial-intelligence Picus TV Network newsreader named Eliza Cassan.[34]

In Adam Curtis's 2016 documentary HyperNormalisation, ELIZA was referenced in relation to post-truth.[35]

The twelfth episode of the American sitcom Young Sheldon, aired in January 2018, showed the protagonist "conversing" with ELIZA in the hope of resolving a domestic issue.[36]

On August 12, 2019, independent game developer Zachtronics published a visual novel called Eliza, about
an AI-based counseling service inspired by ELIZA.[37][38]

See also
ELIZA effect
ChatGPT

References
1. "Alan Turing at 100" (http://news.harvard.edu/gazette/story/2012/09/alan-turing-at-100/).
Harvard Gazette. 13 September 2012. Retrieved 2016-02-22.
2. Berry, David M. (2018). "Weizenbaum, ELIZA and the End of Human Reason". In
Baranovska, Marianna; Höltgen, Stefan (eds.). Hello, I'm Eliza: Fünfzig Jahre Gespräche mit
Computern [Hello, I'm Eliza: Fifty Years of Conversations with Computers] (in German)
(1st ed.). Berlin: Projekt Verlag. pp. 53–70. ISBN 9783897334670.
3. Weizenbaum, Joseph (1976). Computer Power and Human Reason: From Judgment to
Calculation. New York: W. H. Freeman and Company. ISBN 0-7167-0464-1.
4. Norvig, Peter (1992). Paradigms of Artificial Intelligence Programming. New York: Morgan
Kaufmann Publishers. pp. 151–154. ISBN 1-55860-191-0.
5. Weizenbaum, Joseph (January 1966). "ELIZA--A Computer Program for the Study of Natural
Language Communication Between Man and Machine" (http://www.universelle-automation.
de/1966_Boston.pdf) (PDF). Communications of the ACM. 9 (1): 36–45.
doi:10.1145/365153.365168 (https://doi.org/10.1145%2F365153.365168). S2CID 1896290
(https://api.semanticscholar.org/CorpusID:1896290) – via universelle-automation.
6. Baranovska, Marianna; Höltgen, Stefan, eds. (2018). Hello, I'm Eliza fünfzig Jahre
Gespräche mit Computern (1st ed.). Bochum: Bochum Freiburg projektverlag. ISBN 978-3-
89733-467-0. OCLC 1080933718 (https://www.worldcat.org/oclc/1080933718).
7. "ELIZAGEN - The Original ELIZA" (https://sites.google.com/view/elizagen-org/the-original-el
iza). sites.google.com. Retrieved 2021-05-31.
8. Dillon, Sarah (2020-01-02). "The Eliza effect and its dangers: from demystification to gender
critique" (https://doi.org/10.1080/14797585.2020.1754642). Journal for Cultural Research.
24 (1): 1–15. doi:10.1080/14797585.2020.1754642 (https://doi.org/10.1080%2F14797585.2
020.1754642). ISSN 1479-7585 (https://www.worldcat.org/issn/1479-7585).
S2CID 219465727 (https://api.semanticscholar.org/CorpusID:219465727).
9. Bassett, Caroline (2019). "The computational therapeutic: exploring Weizenbaum's ELIZA
as a history of the present" (https://doi.org/10.1007%2Fs00146-018-0825-9). AI & Society.
34 (4): 803–812. doi:10.1007/s00146-018-0825-9 (https://doi.org/10.1007%2Fs00146-018-0
825-9).
10. "The Samantha Test" (https://www.newyorker.com/culture/culture-desk/the-samantha-test).
The New Yorker. Retrieved 2019-05-25.
11. Marino, Mark (2006). Chatbot: The Gender and Race Performativity of Conversational
Agents (https://www.proquest.com/openview/3c91805eb882d2a56d58aaa6f809fa50/).
University of California.
12. Colby, Kenneth Mark; Watt, James B.; Gilbert, John P. (1966). "A Computer Method of
Psychotherapy". The Journal of Nervous and Mental Disease. 142 (2): 148–52.
doi:10.1097/00005053-196602000-00005 (https://doi.org/10.1097%2F00005053-19660200
0-00005). PMID 5936301 (https://pubmed.ncbi.nlm.nih.gov/5936301). S2CID 36947398 (http
s://api.semanticscholar.org/CorpusID:36947398).
13. Shah, Huma; Warwick, Kevin; Vallverdú, Jordi; Wu, Defeng (2016). "Can machines talk?
Comparison of Eliza with modern dialogue systems" (https://curve.coventry.ac.uk/open/item
s/d4c5572d-3a8f-4ed1-b085-c88f8124fd74/1/Can+Machines+Talk_+CHB_Shah-Warwick_2
016+(1).pdf) (PDF). Computers in Human Behavior. 58: 278–95.
doi:10.1016/j.chb.2016.01.004 (https://doi.org/10.1016%2Fj.chb.2016.01.004).
14. Shrager, Jeff; Berry, David M.; Hay, Anthony; Millican, Peter (2022). "Finding ELIZA -
Rediscovering Weizenbaum's Source Code, Comments and Faksimiles". In Baranovska,
Marianna; Höltgen, Stefan (eds.). Hello, I'm Eliza: Fünfzig Jahre Gespräche mit Computern
(2nd ed.). Berlin: Projekt Verlag. pp. 247–248.
15. Weizenbaum 1976, p. 188.
16. Epstein, J.; Klinkenberg, W. D. (2001). "From Eliza to Internet: A brief history of computerized
assessment". Computers in Human Behavior. 17 (3): 295–314. doi:10.1016/S0747-
5632(01)00004-8 (https://doi.org/10.1016%2FS0747-5632%2801%2900004-8).
17. Wortzel, Adrianne (2007). "ELIZA REDUX: A Mutable Iteration". Leonardo. 40 (1): 31–6.
doi:10.1162/leon.2007.40.1.31 (https://doi.org/10.1162%2Fleon.2007.40.1.31).
JSTOR 20206337 (https://www.jstor.org/stable/20206337). S2CID 57565169 (https://api.sem
anticscholar.org/CorpusID:57565169).
18. Weizenbaum, Joseph (1966). "ELIZA—a computer program for the study of natural
language communication between man and machine". Communications of the ACM. 9: 36–
45. doi:10.1145/365153.365168 (https://doi.org/10.1145%2F365153.365168).
S2CID 1896290 (https://api.semanticscholar.org/CorpusID:1896290).
19. Wardrip-Fruin, Noah (2009). Expressive Processing: Digital Fictions, Computer Games, and
Software Studies. Cambridge, Massachusetts: MIT Press. p. 33. ISBN 9780262013437.
OCLC 827013290 (https://www.worldcat.org/oclc/827013290).
20. Markoff, John (2008-03-13), "Joseph Weizenbaum, Famed Programmer, Is Dead at 85" (http
s://www.nytimes.com/2008/03/13/world/europe/13weizenbaum.html), The New York Times,
retrieved 2009-01-07.
21. Weizenbaum, Joseph (1976). Computer power and human reason: from judgment to
calculation (https://archive.org/details/computerpowerhum0000weiz). W. H. Freeman. p. 7 (ht
tps://archive.org/details/computerpowerhum0000weiz/page/7).
22. Garber, Megan (Jun 9, 2014). "When PARRY Met ELIZA: A Ridiculous Chatbot
Conversation From 1972" (https://www.theatlantic.com/technology/archive/2014/06/when-pa
rry-met-eliza-a-ridiculous-chatbot-conversation-from-1972/372428/). The Atlantic. Archived
(https://web.archive.org/web/20170118165304/http://www.theatlantic.com/technology/archiv
e/2014/06/when-parry-met-eliza-a-ridiculous-chatbot-conversation-from-1972/372428/) from
the original on 2017-01-18. Retrieved 19 January 2017.
23. Walden, David; Van Vleck, Tom, eds. (2011). "Compatible Time-Sharing System (1961-
1973): Fiftieth Anniversary Commemorative Overview" (https://multicians.org/thvv/compatibl
e-time-sharing-system.pdf) (PDF). IEEE Computer Society. Retrieved February 20, 2022.
"Joe Wiezenbaum's most famous CTSS project was ELIZA"
24. Wardrip-Fruin, Noah (2014). Expressive Processing: Digital Fictions, Computer Games, and
Software Studies. Cambridge: The MIT Press. p. 33. ISBN 9780262013437 – via eBook
Collection (EBSCOhost).
25. "Coders at Work: Bernie Cosell" (http://www.codersatwork.com/bernie-cosell.html).
codersatwork.com.
26. "elizagen.org" (http://elizagen.org/). elizagen.org.
27. Big Computer Games: Eliza – Your own psychotherapist (http://www.atariarchives.org/bigco
mputergames/showpage.php?page=20) at www.atariarchives.org.
28. "Chatbots & Dialogue Systems" (https://web.stanford.edu/~jurafsky/slp3/15.pdf) (PDF).
stanford.edu. Retrieved 6 April 2023.
29. Berry, David M. (2014). Critical theory and the digital (https://www.worldcat.org/oclc/8684889
16). London: Bloomsbury Publishing. ISBN 978-1-4411-1830-1. OCLC 868488916 (https://w
ww.worldcat.org/oclc/868488916).
30. maschafilm. "Content: Plug & Pray Film – Artificial Intelligence – Robots" (http://www.plugan
dpray-film.de/en/content.html). plugandpray-film.de.
31. Avidan, David (2010), Collected Poems, vol. 3, Jerusalem: Hakibbutz Hameuchad,
OCLC 804664009 (https://www.worldcat.org/oclc/804664009).
32. Davidson, Steve (January 1983). "Abuse" (http://www.atarimania.com/magazine_review.aw
p?id=86). Electronic Games. Vol. 1, no. 11..
33. "lol:> psychoanalyze-pinhead" (http://www.ibm.com/developerworks/ibm/library/lol/pinhead.
html). IBM.
34. Tassi, Paul. " 'Deus Ex: Mankind Divided's Ending Is Disappointing In A Different Way" (http
s://www.forbes.com/sites/insertcoin/2016/08/25/deus-ex-mankind-divideds-ending-is-disapp
ointing-in-a-different-way/). Forbes. Retrieved 2020-04-04.
35. "The Quietus | Opinion | Black Sky Thinking | HyperNormalisation: Is Adam Curtis, Like
Trump, Just A Master Manipulator?" (https://thequietus.com/articles/21077-adam-curtis-hyper
normalisation-review-bbc-politics-doom). The Quietus. Retrieved 26 June 2021.
36. McCarthy, Tyler (2018-01-18). "Young Sheldon Episode 12 recap: The family's first computer
almost tears it apart" (http://www.foxnews.com/entertainment/2018/01/18/young-sheldon-epi
sode-12-recap-familys-first-computer-almost-tears-it-apart.html). Fox News. Retrieved
2018-01-24.
37. O'Connor, Alice (2019-08-01). "The next Zachtronics game is Eliza, a visual novel about AI"
(https://www.rockpapershotgun.com/2019/08/01/the-next-zachtronics-game-is-eliza-a-visual-
novel-about-ai/). Rock Paper Shotgun. Retrieved 2019-08-01.
38. Machkovech, Sam (August 12, 2019). "Eliza review: Startup culture meets sci-fi in a
touching, fascinating tale" (https://arstechnica.com/gaming/2019/08/eliza-review-startup-cult
ure-meets-sci-fi-in-a-touching-fascinating-tale/). Ars Technica. Retrieved August 12, 2019.

Bibliography
Norvig, Peter (1992), ELIZA: Paradigms of Artificial Intelligence Programming, San
Francisco: Morgan Kaufmann Publishers, pp. 151–154, 159, 163–169, 175, 181, ISBN 1-
55860-191-0.
Wardrip-Fruin, Noah (2014), Expressive Processing: Digital Fictions, Computer Games, and
Software Studies, Cumberland: MIT Press, pp. 24–36, ISBN 978-0262517539.
Weizenbaum, Joseph (1976), Computer power and human reason: from judgment to
calculation, W. H. Freeman and Company, ISBN 0-7167-0463-3.
Whitby, Blay (1996), "The Turing Test: AI's Biggest Blind Alley?", in Millican, Peter; Clark,
Andy (eds.), Machines and Thought: The Legacy of Alan Turing (https://web.archive.org/we
b/20080619033628/http://www.cogs.susx.ac.uk/users/blayw/tt.html), vol. 1, Oxford University
Press, pp. 53–62, ISBN 0-19-823876-2, archived from the original (http://www.cogs.susx.ac.
uk/users/blayw/tt.html) on 2008-06-19, retrieved 2008-08-11.

Further reading
McCorduck, Pamela (2004), Machines Who Think (2nd ed.), Natick, MA: A. K. Peters, Ltd.,
ISBN 1-56881-205-1

External links
Collection (https://github.com/jeffshrager/elizagen.org) of several source code versions at
GitHub
dialogues with colorful personalities of early AI (https://web.archive.org/web/2013012016183
9/http://www.stanford.edu/group/SHR/4-2/text/dialogues.html) at the Wayback Machine
(archived January 20, 2013), a collection of dialogues between ELIZA and various
conversants, such as a company vice president and PARRY (a simulation of a paranoid
schizophrenic)
Weizenbaum. Rebel at work (http://www.ilmarefilm.org/archive/weizenbaum_archiv_E.html)
– Peter Haas, Silvia Holzinger, Documentary film with Joseph Weizenbaum and ELIZA.
CORECURSIVE #078; The History and Mystery Of Eliza; With Jeff Shrager (https://corecursi
ve.com/eliza-with-jeff-shrager/) – Adam Gordon Bell interviews Jeff Shrager, author of the
1973/77 BASIC ELIZA, and discoverer of the original ELIZA code.

Retrieved from "https://en.wikipedia.org/w/index.php?title=ELIZA&oldid=1166639475"
