A Real-Time Architecture for Conversational Agents

Consider two people having a face-to-face conversation. They sometimes listen, sometimes talk, and sometimes interrupt each other. They use facial expressions to signal that they are confused. They point at objects. They jump from topic to topic opportunistically. When another acquaintance walks by, they nod and say hello. All the while they have other concerns on their mind, such as not missing the meeting that starts in 10 minutes. Like many other human behaviors, these are not easy to replicate in artificial agents. In this work we look into the design requirements of an embodied agent that can participate in such natural conversations in a mixed-initiative, multi-modal setting. Such an agent needs to understand that participating in a conversation is not merely a matter of sending a message and then waiting to receive a response -- both partners are simultaneously active at all times. This agent should be able to deal with different, sometimes conflicting goals, and always be ready to address events that may interrupt the current topic of conversation. To address those requirements, we have created a modular architecture that includes distributed functional units that compete with each other to gain control over available resources. Each of these units, called a schema, has its own sense-think-act cycle. In the field of robotics, this design is often referred to as "behavior-based" or "schema-based." The major contribution of this work is merging behavior-based robotics with plan-based human-computer interaction.
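The competition described above -- independent schemas, each running its own sense-think-act cycle, with an arbiter granting a contested resource to the most strongly activated unit -- might be sketched as follows. This is a hypothetical illustration only, not the thesis's actual implementation; the `Schema` and `Arbiter` names and the activation formula are assumptions.

```python
class Schema:
    """A functional unit with its own sense-think-act cycle (illustrative)."""

    def __init__(self, name, priority):
        self.name = name
        self.priority = priority

    def sense(self, world):
        # Read the state this schema cares about from a shared world model.
        return world.get(self.name, 0.0)

    def think(self, percept):
        # Derive an activation level; here simply percept scaled by priority.
        return percept * self.priority

    def act(self, world):
        # Take control of the contested resource (e.g. the speech channel).
        world["last_actor"] = self.name


class Arbiter:
    """Each cycle, grants the resource to the schema with highest activation."""

    def __init__(self, schemas):
        self.schemas = schemas

    def step(self, world):
        scored = [(s.think(s.sense(world)), s) for s in self.schemas]
        _, winner = max(scored, key=lambda pair: pair[0])
        winner.act(world)
        return winner.name


# Two competing schemas: a low-urgency greeting and a high-urgency answer.
world = {"greet": 0.2, "answer": 0.9}
arbiter = Arbiter([Schema("greet", 1.0), Schema("answer", 1.5)])
print(arbiter.step(world))  # prints "answer" (activation 1.35 beats 0.2)
```

Because each schema keeps sensing on every cycle, an interrupting event (say, an acquaintance walking by) can raise a dormant schema's activation and win the next arbitration, without the currently active schema having to yield explicitly.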

Language
  • English
Identifier
  • etd-082412-131304
Defense date
  • 2012
Date created
  • 2012-08-24

