8th Annual Meeting of the International Multisensory Research Forum

Multisensory Integration in Virtual Environments
Single Paper Presentation (invited conference abstract)

Heinrich Bülthoff
Max Planck Institute for Biological Cybernetics

     Abstract ID Number: 129
     Full text: Not available
     Last modified: June 14, 2007
     Presentation date: 07/06/2007 4:30 PM in Quad General Lecture Theatre

Many experiments that study how the different senses interact in humans focus on perception. In most natural tasks, however, sensory signals are not ultimately used for perception but for action. The effects of that action are in turn sensed again, so that perception and action form complementary parts of a dynamic control system. In our cybernetics research group at the Max Planck Institute in Tübingen, we use psychophysical, physiological, modeling, and simulation techniques to study how cues from different sensory modalities are integrated by the brain to perceive and act in the real world. In psychophysical studies, we have shown that humans can integrate multimodal sensory information in a statistically optimal way, with each cue weighted according to its reliability. A better understanding of multimodal sensory fusion will allow us to build new virtual reality platforms in which the design effort for simulating each of the relevant modalities (visual, auditory, haptic, vestibular, and proprioceptive) is matched to the weight of that modality. In this talk we discuss which of these characteristics are necessary to enable valuable improvements in high-fidelity simulator design.
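The statistically optimal integration scheme the abstract refers to can be sketched as a reliability-weighted (inverse-variance) average of independent Gaussian cue estimates. The code below is an illustrative sketch only, not the authors' implementation; the cue names and numeric values are hypothetical assumptions.

```python
import numpy as np

def fuse_cues(estimates, variances):
    """Maximum-likelihood fusion of independent Gaussian cue estimates:
    each cue is weighted by its reliability (inverse variance)."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    reliabilities = 1.0 / variances                 # w_i proportional to 1/sigma_i^2
    weights = reliabilities / reliabilities.sum()   # normalize weights to sum to 1
    fused = np.dot(weights, estimates)              # reliability-weighted estimate
    fused_variance = 1.0 / reliabilities.sum()      # no worse than the best single cue
    return fused, fused_variance

# Hypothetical example: a precise visual cue and a noisier haptic cue
# giving estimates of the same quantity (arbitrary units).
visual, haptic = 10.0, 12.0         # single-cue estimates
var_visual, var_haptic = 1.0, 4.0   # visual cue is 4x more reliable
size, var = fuse_cues([visual, haptic], [var_visual, var_haptic])
# The fused estimate (10.4) lies closer to the more reliable visual cue,
# and its variance (0.8) is lower than either single cue's variance.
```

The fused variance being smaller than either individual cue's variance is the signature of optimal integration that such psychophysical studies test for.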

