
Update (MM/DD/YYYY): 04/03/2003

The brain knows when sound reaches us.

Highlights

  • Auditory and visual inputs are coordinated to perceive real-world objects and events.
  • Auditory inputs are received much later than visual inputs because sound travels much more slowly than light. However, the delay is seldom noticed.
  • Audio and visual inputs stay coordinated because the brain actively shifts the temporal window for auditory integration according to the distance to the visible sound source.


Outline

Fig. 1. The points of subjective equality (PSE, filled circles) plotted against viewing distance. The 25% (lower open circles) and 75% (upper open circles) levels of "light-first" responses are also plotted, indicating the thresholds for detecting asynchrony. The dashed line represents the actual sound-arrival time in the real world.

In perceiving the sound produced by the movements of a visible object, the brain coordinates the auditory and visual inputs so that no delay is noticed even though the sound arrives later (for distant source objects, such as aircraft or firework displays, this coordination is less efficient). We found that this coordination occurs because the brain uses distance information supplied by the visual system to calibrate simultaneity.

A sound burst delivered via headphones and a light flash were presented with various stimulus onset asynchronies. The farther the light was from the observers, the longer the sound delay they could tolerate while still maintaining the impression of a common source object. These results indicate that real-world constraints, such as the fact that sound travels much more slowly than light, are built into audio-visual integration processes.

Of special interest, the rate of this increase was roughly consistent with the speed of sound. Sound takes approximately 30 ms to travel 10 meters through air at sea level and normal temperature. Correspondingly, the maximum temporal separation increased by approximately 30 ms for every 10-meter increase in distance. This relationship held at least up to a distance of 20 meters.
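As a quick check of this arithmetic, the following minimal Python sketch (assuming a speed of sound of roughly 340 m/s, consistent with the ~30 ms per 10 m figure above) computes the expected sound-arrival delay at several viewing distances:

    # Expected sound-arrival delay as a function of source distance.
    # Assumption: speed of sound of about 340 m/s (air at sea level,
    # normal temperature); the text above quotes ~30 ms per 10 m.
    SPEED_OF_SOUND_M_PER_S = 340.0

    def sound_delay_ms(distance_m: float) -> float:
        """Milliseconds for sound to travel distance_m through air."""
        return distance_m / SPEED_OF_SOUND_M_PER_S * 1000.0

    for d in (5, 10, 20):
        print(f"{d:>2} m -> {sound_delay_ms(d):.0f} ms")
    # Output: 5 m -> 15 ms, 10 m -> 29 ms, 20 m -> 59 ms,
    # i.e. roughly 30 ms per 10 m, matching the observed growth
    # of the tolerated sound delay.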

Temporal proximity of visual and auditory stimuli is necessary for audio-visual integration. The present study provides evidence that the maximum temporal separation of auditory and visual stimuli that still permits integration grows with the distance to the common source object. Hence, the brain can integrate audio-visual information across a wide range of temporal gaps and correctly match sounds to their visual sources among various distracting stimuli. This work was published in Nature 421, 911 (2003).
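To make the proposed mechanism concrete, here is an illustrative sketch of a distance-calibrated simultaneity judgment. The 100 ms tolerance and the simple threshold rule are assumptions for illustration, not parameters from the paper; the essential point is that the integration window is centered on the expected sound travel time (distance divided by the speed of sound) rather than on zero:

    # Illustrative model of a distance-calibrated audio-visual
    # integration window. The 100 ms half-width below is a placeholder
    # assumption, not a value reported in the paper.
    SPEED_OF_SOUND_M_PER_S = 340.0
    WINDOW_HALF_WIDTH_MS = 100.0

    def perceived_as_common_source(audio_lag_ms: float,
                                   distance_m: float) -> bool:
        """True if a sound lagging the flash by audio_lag_ms is still
        bound to a visible source at distance_m, given a window
        centered on the expected sound travel time rather than zero."""
        expected_delay_ms = distance_m / SPEED_OF_SOUND_M_PER_S * 1000.0
        return abs(audio_lag_ms - expected_delay_ms) <= WINDOW_HALF_WIDTH_MS

    # A 120 ms audio lag breaks integration for a nearby source
    # but not for a distant one:
    print(perceived_as_common_source(120.0, distance_m=1.0))   # False
    print(perceived_as_common_source(120.0, distance_m=20.0))  # True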





