2. Embodied conversational agents — history of development

Embodied Conversational Agents (ECAs) can be defined as “computer interfaces that can hold up their end of the conversation, interfaces that realize conversational behaviors as a function of the demands of dialogue and also as a function of emotion, personality, and social conversation” [Cassell, Sullivan, et al., 2000]. In this section, I survey the history of the development of ECAs.
1. Natural Language Dialogue Systems
Early natural language dialogue systems include:

  • Lunar [Woods 1973]
  • Baseball [Green 1961]
  • ELIZA [Weizenbaum 1966]
  • SHRDLU [Winograd 1972]
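
ELIZA's core mechanism was keyword matching against hand-written transformation rules, with a pronoun-reflected fragment of the user's utterance echoed back inside a canned template. The sketch below illustrates that technique in Python; the rules and reflection table are invented for the example and are far smaller than Weizenbaum's original DOCTOR script:

```python
import re

# Illustrative ELIZA-style rules (not Weizenbaum's originals): each pairs a
# keyword pattern with a response template that reuses the matched fragment.
RULES = [
    (re.compile(r"i am (.*)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

# Pronoun reflection applied to the captured fragment ("my job" -> "your job").
REFLECT = {"my": "your", "i": "you", "me": "you", "am": "are"}

def respond(utterance: str) -> str:
    """Return the first matching rule's response, or a default prompt."""
    for pattern, template in RULES:
        m = pattern.search(utterance)
        if m:
            fragment = " ".join(REFLECT.get(w.lower(), w)
                                for w in m.group(1).split())
            return template.format(fragment)
    return "Please go on."  # default when no keyword matches
```

For example, `respond("I am sad about my job")` reflects the fragment and yields "Why do you say you are sad about your job?" — the entire effect rests on surface pattern matching, with no model of meaning.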

2. Spoken Language Dialogue Systems
These were followed by spoken language dialogue systems:

  • Hearsay-II [Erman 1980]
  • Put-that-there [Bolt 1980]

3. Concepts of Embodied Conversational Systems
Apple’s Knowledge Navigator concept video [Apple Computer 1987] had a significant impact on the view of the computer as a social agent.
CASA (Computers Are Social Actors): the observation that people respond to computers as social actors.
Believability in interactive drama: explored in the Oz project [Bates 1994].
From tool-based computing to assistive interfaces: addressed by the Persona project, begun at Microsoft Research in late 1992 and implemented as the animated parrot Peedy [Ball 1997].
Jennifer James is a consumer-friendly, intelligent, interactive 3D ex-NASCAR driver [Hayes-Roth 1998].
Rea is an embodied conversational agent whose verbal and nonverbal behaviors are designed according to the conventions of human face-to-face conversation [Cassell 1999].

References

  • [Apple Computer 1987] http://homepage.mac.com/ericestrada/Movies/iMovieTheater53.html
  • [Ball 1997] Gene Ball, Dan Ling, David Kurlander, John Miller, David Pugh, Tim Skelly, Andy Stankosky, David Thiel, Maarten Van Dantzich, and Trace Wax. Lifelike Computer Characters: The Persona Project at Microsoft Research. In Jeffrey M. Bradshaw (ed.), Software Agents. AAAI/MIT Press, Menlo Park, CA, 1997.
  • [Bates 1994] Joseph Bates. The Role of Emotion in Believable Agents. Communications of the ACM, Vol. 37, No. 7, pp. 122-125, 1994.
  • [Bolt 1980] Richard A. Bolt. “Put-That-There”: Voice and Gesture at the Graphics Interface. Proceedings of the 7th Annual Conference on Computer Graphics and Interactive Techniques (SIGGRAPH ’80), pp. 262-270, 1980.
  • [Cassell 1999] Cassell, J., Bickmore, T., Billinghurst, M., Campbell, L., Chang, K., Vilhjálmsson, H., and Yan, H. Embodiment in Conversational Interfaces: Rea. Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI ’99), pp. 520-527, Pittsburgh, PA, 1999.
  • [Cassell, Sullivan, et al., 2000] Justine Cassell, Joseph Sullivan, Scott Prevost, and Elizabeth Churchill (eds.). Embodied Conversational Agents. MIT Press, Cambridge, MA, 2000.
  • [Erman 1980] Lee D. Erman, Frederick Hayes-Roth, Victor R. Lesser, and D. Raj Reddy. The Hearsay-II Speech-Understanding System: Integrating Knowledge to Resolve Uncertainty. ACM Computing Surveys, Vol. 12, No. 2, pp. 213-253, 1980.
  • [Green 1961] Bert F. Green, Jr., Alice K. Wolf, Carol Chomsky, and Kenneth Laughery. Baseball: An Automatic Question-Answerer. Proceedings of the Western Joint Computer Conference (IRE-AIEE-ACM ’61), 1961.
  • [Hayes-Roth 1998] Barbara Hayes-Roth. Jennifer James, Celebrity Auto Spokesperson. ACM SIGGRAPH 98 Conference Abstracts and Applications, p. 136, 1998.
  • [Weizenbaum 1966] Joseph Weizenbaum. ELIZA — A Computer Program for the Study of Natural Language Communication Between Man and Machine. Communications of the ACM, Vol. 9, No. 1, pp. 36-45, 1966.
  • [Winograd 1972] Terry Winograd. Understanding Natural Language. Academic Press, 1972.
  • [Woods 1973] W. A. Woods. Progress in Natural Language Understanding: An Application to Lunar Geology. Proceedings of the AFIPS ’73 National Computer Conference and Exposition, June 4-8, 1973.