Multi-Sensory Consistency Experience: a 6DOF Motion System Based on Video Automatically Generated Motion Simulation

EasyChair Preprint no. 10813

14 pages. Date: August 31, 2023

Abstract

Because we perceive the real world by integrating multiple senses, experiencing consistent cues across multiple senses in a virtual environment may enhance our sense of presence. In this paper, we present a 6DOF motion system with consistent multi-sensory perception. The system automatically extracts the virtual camera's trajectory from a video as motion data, maps that data onto a 6DOF Stewart motion platform through a washout algorithm based on human perception, and adds visual, auditory, tactile, and proprioceptive simulations that keep the perceived motion effect consistent across the senses. The results of a user study showed that the system effectively enhanced participants' sense of realism and reduced subjective simulator discomfort. In addition, the system lets users create their own motion virtual environments from video, so that the general public can become designers of motion-experience content in the metaverse.
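The abstract does not detail the perception-based washout algorithm, but the classical washout approach it builds on high-pass filters the reference acceleration so that motion onsets are reproduced while sustained acceleration is "washed out," letting the limited-travel platform return to neutral. A minimal sketch under that assumption (the filter structure, sample rate, and time constant below are illustrative, not the paper's actual parameters):

```python
def washout_high_pass(accel, dt, tau):
    """First-order high-pass washout filter: reproduces acceleration
    onsets while attenuating sustained acceleration, so the platform
    drifts back toward its neutral pose within its limited travel.

    accel: acceleration samples (m/s^2), e.g. from the extracted
           virtual-camera trajectory; dt: sample period (s);
    tau:   washout time constant (s)."""
    alpha = tau / (tau + dt)
    out = [accel[0]]
    for i in range(1, len(accel)):
        # Discrete first-order high-pass: passes changes, decays constants.
        out.append(alpha * (out[-1] + accel[i] - accel[i - 1]))
    return out

# A sustained 1 m/s^2 forward acceleration (e.g. a long speed-up in the
# video) is cued at onset, then washed out so the platform recenters.
dt, tau = 0.01, 0.5          # 100 Hz samples, 0.5 s washout constant (assumed)
sustained = [1.0] * 500      # 5 s of constant acceleration
cue = washout_high_pass(sustained, dt, tau)
```

In a full classical washout scheme, the low-frequency component removed here would additionally be rendered by slowly tilting the platform so gravity mimics sustained acceleration (tilt coordination), which is omitted from this sketch.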

Keyphrases: motion platform, multi-sensory consistency, simulation experience, virtual reality

BibTeX entry
BibTeX has no entry type for preprints; the following is a workaround that produces the correct reference:
@Booklet{EasyChair:10813,
  author = {Hongqiu Luan and Yu Wang and Li Huang and Lutong Wang and Gaorong Lv and Wei Gai and Xiaona Luan and Chenglei Yang},
  title = {Multi-Sensory Consistency Experience: a 6DOF Motion System Based on Video Automatically Generated Motion Simulation},
  howpublished = {EasyChair Preprint no. 10813},
  year = {EasyChair, 2023}}