Gaze shift reflex in a humanoid active vision system

Authors

  • Ansgar R. Koene
  • Jan Morén
  • Vlad Trifa
  • Gordon Cheng

DOI:

https://doi.org/10.2390/biecoll-icvs2007-170

Keywords:

Vision, Audio, multi-modal, sensory integration, modeling, DDC: 004 (Data processing, computer science, computer systems)

Abstract

Full awareness of sensory surroundings requires active attentional and behavioural exploration. In visual animals, visual, auditory and tactile stimuli elicit gaze shifts (head and eye movements) aimed at optimising visual perception of the stimuli. Such gaze shifts can be either top-down, attention-driven movements (e.g. visual search) or reflex movements triggered by unexpected changes in the surroundings. Here we present a model active vision system with a focus on multi-sensory integration and the generation of desired gaze shift commands. Our model is based on recent data from studies of the primate superior colliculus and is developed as part of the sensory-motor control of the humanoid robot CB.
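
The sketch below is purely illustrative and is not the authors' implementation: it shows one simple way to combine visual and auditory saliency maps on a shared retinotopic grid and to select a desired gaze shift by winner-take-all, in the spirit of the superior-colliculus-style multi-sensory integration described in the abstract. The map size, modality weights, Gaussian stimulus shapes and function names are all assumptions introduced for illustration.

```python
# Minimal sketch (assumptions, not the authors' model): weighted fusion of
# modality-specific saliency maps and winner-take-all selection of a gaze
# shift target on a shared retinotopic map.
import numpy as np

MAP_SHAPE = (64, 64)                 # assumed retinotopic grid (rows = elevation, cols = azimuth)
FIELD_OF_VIEW_DEG = (120.0, 120.0)   # assumed angular extent covered by the map


def gaussian_blob(center, sigma, shape=MAP_SHAPE):
    """Place a Gaussian 'stimulus' on the map at the given (row, col) centre."""
    rows, cols = np.mgrid[0:shape[0], 0:shape[1]]
    d2 = (rows - center[0]) ** 2 + (cols - center[1]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))


def integrate_saliency(visual_map, auditory_map, w_vis=1.0, w_aud=0.8):
    """Weighted sum of modality-specific saliency maps (weights are assumptions)."""
    combined = w_vis * visual_map + w_aud * auditory_map
    return combined / (combined.max() + 1e-9)   # normalise for comparability


def gaze_shift_command(saliency, shape=MAP_SHAPE, fov=FIELD_OF_VIEW_DEG):
    """Winner-take-all: convert the most salient location into a desired gaze
    shift (azimuth, elevation) in degrees relative to the current fixation."""
    row, col = np.unravel_index(np.argmax(saliency), shape)
    elevation = (0.5 - row / (shape[0] - 1)) * fov[0]
    azimuth = (col / (shape[1] - 1) - 0.5) * fov[1]
    return azimuth, elevation


if __name__ == "__main__":
    # A dim visual stimulus up and to the left, a stronger auditory event to the right.
    visual = 0.6 * gaussian_blob(center=(20, 15), sigma=3.0)
    auditory = 1.0 * gaussian_blob(center=(32, 55), sigma=6.0)

    saliency = integrate_saliency(visual, auditory)
    azimuth, elevation = gaze_shift_command(saliency)
    print(f"Desired gaze shift: azimuth {azimuth:+.1f} deg, elevation {elevation:+.1f} deg")
```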

Published

2007-12-31

Section

International Cognitive Vision Workshop - ICVW 2007