Saliency-Driven Tactile Effect Authoring for Real-Time Visuotactile Feedback

  • Myongchan Kim
  • Sungkil Lee
  • Seungmoon Choi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7282)

Abstract

New-generation media such as 4D film have recently emerged to deliver immersive physical experiences, yet authoring their effects has relied on content artists, impeding the popularization of such media. An automated authoring approach is increasingly crucial for lowering production costs and reducing manual effort. This paper presents a fully automated framework for authoring tactile effects from existing video images to render synchronized visuotactile stimuli in real time. The spatiotemporal features of video images are analyzed in terms of visual saliency and translated into tactile cues that are rendered on tactors installed on a chair. A user study was conducted to evaluate the usability of visuotactile rendering against visual-only presentation. The results indicated that visuotactile rendering can make a movie more interesting, immersive, appealing, and understandable.
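The pipeline the abstract describes, per-frame saliency analysis pooled into drive signals for a grid of chair-mounted tactors, can be sketched roughly as follows. This is an illustrative toy model only, not the paper's actual saliency algorithm: the contrast/motion combination, the weights, and the 3×3 tactor layout are all assumptions for demonstration.

```python
import numpy as np

def saliency_map(frame, prev_frame):
    """Toy spatiotemporal saliency: luminance contrast plus frame difference.

    frame, prev_frame: (H, W, 3) arrays of RGB values in [0, 1].
    Returns an (H, W) saliency map normalized to [0, 1].
    (Illustrative only; not the model used in the paper.)
    """
    lum = frame.mean(axis=2)                      # crude luminance
    contrast = np.abs(lum - lum.mean())           # spatial conspicuity
    motion = np.abs(lum - prev_frame.mean(axis=2))  # temporal change
    s = contrast + 2.0 * motion                   # assumed motion weight
    return s / (s.max() + 1e-9)                   # normalize to [0, 1]

def tactor_intensities(saliency, grid=(3, 3)):
    """Pool the saliency map into a coarse grid matching the tactor
    layout on the chair; each cell's mean drives one tactor."""
    h, w = saliency.shape
    gh, gw = grid
    out = np.zeros(grid)
    for i in range(gh):
        for j in range(gw):
            cell = saliency[i * h // gh:(i + 1) * h // gh,
                            j * w // gw:(j + 1) * w // gw]
            out[i, j] = cell.mean()
    return out
```

Running this per frame yields one intensity in [0, 1] per tactor, which a renderer could map to vibration amplitude; the real system would also need temporal smoothing and synchronization with video playback.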

Keywords

Tactile Effect · Visual Saliency · 4D Film · Multimedia


Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Myongchan Kim (1)
  • Sungkil Lee (2)
  • Seungmoon Choi (1)
  1. Pohang University of Science and Technology, Korea
  2. Sungkyunkwan University, Korea