Augmented reality flavor: cross-modal mapping across gustation, olfaction, and vision
Gustatory display research is still in its infancy, despite taste being one of the essential everyday senses that humans exercise while eating and drinking. Indeed, foraging and feeding are among the most important and frequent tasks that our brain handles every day. Recent studies by psychologists and cognitive neuroscientists have revealed how flavor experiences rely on the complex multisensory integration of cues from all the human senses. The perception of flavor is multisensory and involves combinations of gustatory and olfactory stimuli. The cross-modal mapping between these modalities needs further exploration in virtual environments and simulations, especially for liquid foods. In this paper, we present a customized wearable Augmented Reality (AR) system and an olfaction display to study the effect of vision and olfaction on the gustatory sense. A user experiment and extensive analysis were conducted to study the influence of each stimulus on the overall flavor, including other factors such as age, previous experience with Virtual Reality (VR)/AR, and beverage consumption. The results showed that smell contributes strongly to flavor, while vision contributes less. However, the combination of these stimuli can deliver a richer experience and a higher belief rate. Beverage consumption had a significant effect on the flavor belief rate. Experience is correlated with stimulus and age is correlated with belief rate, and both indirectly affected the belief rate. © 2021, The Author(s).