Multisensory integration of vision and touch in nonspatial feature discrimination tasks

Research output: Contribution to journal › Article › peer-review



Multisensory integration of nonspatial features between vision and touch was investigated by examining the effects of redundant visual and tactile signals. In the present experiments, visual letter stimuli and/or tactile letter stimuli were presented, and participants were asked to identify them as quickly as possible. Experiment 1 demonstrated faster reaction times for bimodal stimuli than for unimodal stimuli (the redundant signals effect; RSE). The RSE was due to coactivation of figural representations from the visual and tactile modalities. This coactivation did not occur in a simple stimulus detection task (Experiment 2) or for bimodal stimuli carrying the same semantic information but different physical stimulus features (Experiment 3). The findings suggest that the integration process occurs at a relatively early stage of object identification, prior to the decision level.
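In redundant-signals studies, a coactivation account is standardly distinguished from a parallel "race" of the two modalities by testing Miller's (1982) race-model inequality: coactivation is inferred where the bimodal cumulative RT distribution exceeds the sum of the two unimodal ones. The abstract does not spell out the analysis used here, so the sketch below is only an illustration of that general test, with hypothetical reaction-time data (all values and variable names are assumptions, not data from the paper):

```python
import numpy as np

def race_model_violations(rt_visual, rt_tactile, rt_bimodal,
                          quantiles=np.linspace(0.05, 0.95, 19)):
    """Return the probe times at which Miller's race-model inequality
    P(RT <= t | bimodal) <= P(RT <= t | visual) + P(RT <= t | tactile)
    is violated, i.e. where coactivation would be inferred."""
    # Probe times taken as quantiles of the bimodal RT distribution.
    ts = np.quantile(rt_bimodal, quantiles)

    def ecdf(sample, t):
        # Empirical CDF of `sample` evaluated at each time in `t`.
        return np.mean(np.asarray(sample)[:, None] <= t, axis=0)

    g_bimodal = ecdf(rt_bimodal, ts)
    race_bound = ecdf(rt_visual, ts) + ecdf(rt_tactile, ts)
    return ts[g_bimodal > race_bound]

# Hypothetical RTs in milliseconds; the bimodal condition is made
# markedly faster than either unimodal condition to mimic coactivation.
rng = np.random.default_rng(0)
rt_v = rng.normal(520, 60, 200)
rt_t = rng.normal(540, 60, 200)
rt_b = rng.normal(450, 50, 200)

violations = race_model_violations(rt_v, rt_t, rt_b)
print(len(violations) > 0)  # expected to be True for this synthetic pattern
```

A race model would allow the bimodal distribution to be faster than each unimodal one, but never faster than their probability sum; only violations of that bound license the coactivation interpretation described in the abstract.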

Original language: English
Pages (from-to): 12-22
Number of pages: 11
Journal: Japanese Psychological Research
Issue number: 1
Publication status: Published - 2010 Mar


Keywords:
  • Nonspatial feature
  • Redundant signals effect
  • Visual-tactile integration
