Access Restriction
Open

Author Salah, Albert Ali
Source CiteSeerX
Content type Text
File Format PDF
Subject Domain (in DDC) Computer science, information & general works ♦ Data processing & computer science
Abstract Humans perceive the world through different perceptual modalities, which are processed in the brain by modality-specific areas and structures. However, there also exist multimodal neurons and areas, specialized in integrating perceptual information to enhance or suppress the brain's response. The particular way the human brain fuses crossmodal (or multimodal) perceptual information manifests itself first in behavioural studies. These crossmodal interactions are widely explored for some modalities, especially auditory and visual input, and less explored for others, such as taste and olfaction, yet it is known that these effects can occur with any two modalities. The integration of sensory data is an important research area in computer science and stands to benefit from studies of brain function; many biological processes serve as models for computer algorithms. On the other hand, computer models of sensor integration are built on mathematical principles and provide normative insights into the functioning of the brain. This paper surveys the psychological and neurological findings pertaining to human multisensory […]
Educational Role Student ♦ Teacher
Age Range Above 22 years
Educational Use Research
Education Level Undergraduate and Postgraduate ♦ Career/Technical Study
Learning Resource Type Article