How do users generate an illusion of presence in a rich and consistent virtual environment from an impoverished, incomplete, and often inconsistent set of sensory cues? We conducted an experiment to explore how multimodal perceptual cues are integrated into a coherent experience of virtual objects and spaces, and specifically whether intermodal integration contributes to generating the illusion of presence in virtual environments. To discover whether intermodal integration might play a role in presence, we looked for evidence of it in the form of cross-modal interactions: perceptual illusions in which cues from one sensory modality "fill in" the "missing" components of perceptual experience. One form of cross-modal interaction, the cross-modal transfer, is a form of synesthesia: a perceptual illusion in which stimulation of a sensory modality connected to the interface (such as the visual modality) is accompanied by perceived stimulation of an unconnected sensory modality that receives no apparent stimulation from the virtual environment (such as the haptic modality). Users of our experimental virtual environment who manipulated the visual analog of a physical force, a virtual spring, reported haptic sensations of "physical resistance," even though the interface included no haptic displays. A path model of the data suggested that this cross-modal illusion was correlated with, and dependent upon, the sensation of spatial and sensory presence. We conclude that this is evidence that presence may derive from the process of multimodal integration and may therefore be associated with other illusions, such as cross-modal transfers, that result from constructing a coherent mental model of the space.
Finally, we suggest that this perceptual phenomenon might be used to improve user experiences with multimodal interfaces, specifically by supporting limited sensory displays (such as haptic displays) with appropriate synesthetic stimulation to other sensory modalities (such as visual and auditory analogs of haptic forces).