Interest estimation based on dynamic Bayesian networks for visual attentive presentation agents

Boris Brandherm*, Helmut Prendinger, Mitsuru Ishizuka

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

12 Citations (Scopus)

Abstract

In this paper, we describe an interface consisting of a virtual showroom where a team of two highly realistic 3D agents presents product items in an entertaining and attractive way. The presentation flow adapts to users' attentiveness, or lack thereof, and may thus provide a more personalized and user-attractive experience of the presentation. In order to infer users' attention and visual interest regarding interface objects, our system analyzes eye movements in real-time. Interest detection algorithms used in previous research determine an object of interest based on the time that eye gaze dwells on that object. However, this kind of algorithm is not well suited for dynamic presentations where the goal is to assess the user's focus of attention regarding a dynamically changing presentation. Here, the current context of the object of attention has to be considered, i.e., whether the visual object is part of (or contributes to) the current presentation content or not. Therefore, we propose a new approach that estimates the interest (or non-interest) of a user by means of dynamic Bayesian networks. Each of a predefined set of visual objects has a dynamic Bayesian network assigned to it, which calculates the current interest of the user in this object. The estimation takes into account (1) each new gaze point, (2) the current context of the object, and (3) preceding estimations of the object itself. Based on these estimations the presentation agents can provide timely and appropriate responses.
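To make the per-object estimation scheme concrete, the following is a minimal Python sketch of the kind of update the abstract outlines: one two-state dynamic Bayesian network per predefined visual object, revised with each new gaze point while taking the object's presentation context and the preceding estimate into account. It is not the authors' implementation; class names, the context-weighting scheme, and all probability values are illustrative assumptions.

    # Minimal sketch (assumed parameters, not the paper's model): a two-state
    # DBN over "interested" / "not interested" for one visual object.
    class ObjectInterestDBN:
        def __init__(self, p_stay=0.85, p_gain=0.10,
                     p_gaze_if_interested=0.7, p_gaze_if_not=0.1):
            self.belief = 0.5                 # P(interested) before any evidence
            self.p_stay = p_stay              # P(interested_t | interested_{t-1})
            self.p_gain = p_gain              # P(interested_t | not interested_{t-1})
            self.p_gaze_if_interested = p_gaze_if_interested
            self.p_gaze_if_not = p_gaze_if_not

        def update(self, gaze_on_object, object_in_context):
            # (3) temporal prior rolled forward from the preceding estimation
            prior = self.belief * self.p_stay + (1.0 - self.belief) * self.p_gain

            # (1) likelihood of the new gaze point under each hypothesis
            p_obs_if_interested = (self.p_gaze_if_interested if gaze_on_object
                                   else 1.0 - self.p_gaze_if_interested)
            p_obs_if_not = (self.p_gaze_if_not if gaze_on_object
                            else 1.0 - self.p_gaze_if_not)

            # (2) context: if the object is part of the current presentation
            # content, a gaze hit is weaker evidence of genuine interest, so
            # the likelihoods are discounted toward being uninformative.
            weight = 0.5 if object_in_context else 1.0
            p_obs_if_interested = weight * p_obs_if_interested + (1.0 - weight)
            p_obs_if_not = weight * p_obs_if_not + (1.0 - weight)

            # Bayes update of the interest belief
            num = p_obs_if_interested * prior
            den = num + p_obs_if_not * (1.0 - prior)
            self.belief = num / den
            return self.belief

    # Usage: one DBN per visual object, updated for every gaze sample.
    dbn = ObjectInterestDBN()
    for gaze_hit, in_context in [(True, False), (True, False), (False, True)]:
        interest = dbn.update(gaze_hit, in_context)
    print(f"estimated interest: {interest:.2f}")

In such a setup, the presentation agents would poll the per-object beliefs and react when an estimate crosses a threshold, which is one way the "timely and appropriate response" mentioned above could be triggered.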

Original language: English
Title of host publication: Proceedings of the 9th International Conference on Multimodal Interfaces, ICMI'07
Pages: 346-349
Number of pages: 4
DOIs
Publication status: Published - 2007
Externally published: Yes
Event: 9th International Conference on Multimodal Interfaces, ICMI 2007 - Nagoya
Duration: 2007 Nov 12 - 2007 Nov 15

Other

Other: 9th International Conference on Multimodal Interfaces, ICMI 2007
City: Nagoya
Period: 07/11/12 - 07/11/15

Keywords

  • Human factors

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Graphics and Computer-Aided Design
  • Computer Vision and Pattern Recognition
  • Human-Computer Interaction
