
Multimedia Tools and Applications, Volume 76, Issue 9, pp 11407–11428

Development of simulator for invoked reality environmental design

  • Sohyun Sim
  • Seoungjae Cho
  • Wei Song
  • Simon Fong
  • Yong Woon Park
  • Kyungeun Cho
Article

Abstract

Invoked Reality (IR) research uses a variety of sensors and projection devices for Natural User Interface (NUI) interaction. Each device requires an appropriate action region, which must be taken into account when an IR environment is implemented; consequently, arranging this equipment makes implementing an IR environment difficult. The research presented in this paper overcomes this difficulty by proposing the "IR Simulator," which facilitates the design of, and experimentation within, IR environments in a virtual setting. The IR Simulator comprises an "experimental space editor" and a "simulation manager." Its GUI-based experimental space editor enables the design of diverse IR environments, and an interworking interface provides interaction with external IR libraries. Moreover, a connected library can be simulated using the virtual sensor data generated by the simulator. In experiments with the proposed simulator, a variety of IR environments were generated and an arm motion recognition library was simulated. Furthermore, the environment designed in the simulator was easily reproduced in the real world on the basis of the simulator's design results. Testing the arm motion recognition library in the real environment produced results similar to those obtained in simulation.
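The abstract's library-interworking idea can be illustrated with a minimal sketch: the simulator generates virtual sensor frames and forwards them to any external recognition library that implements a common interface. All class and method names below (`IRLibrary`, `process_frame`, `ArmRaiseLibrary`, `simulate`) are hypothetical illustrations, not the paper's actual API.

```python
# Hypothetical sketch of the library-interworking interface described in
# the abstract: virtual sensor data flows from the simulator into a
# pluggable external recognition library. Names are illustrative only.

from abc import ABC, abstractmethod
from typing import Optional


class IRLibrary(ABC):
    """Interface an external IR library would implement."""

    @abstractmethod
    def process_frame(self, frame: dict) -> Optional[str]:
        """Consume one sensor frame; return a recognized gesture or None."""


class ArmRaiseLibrary(IRLibrary):
    """Toy recognizer: reports 'arm_raised' when the hand is above the head."""

    def process_frame(self, frame: dict) -> Optional[str]:
        if frame["hand_y"] > frame["head_y"]:
            return "arm_raised"
        return None


def simulate(library: IRLibrary, frames: list) -> list:
    """Feed virtual sensor frames to the library, as the simulator would."""
    events = []
    for frame in frames:
        result = library.process_frame(frame)
        if result is not None:
            events.append(result)
    return events


# Virtual sensor data: the hand rises past the head over three frames.
virtual_frames = [
    {"hand_y": 1.0, "head_y": 1.7},
    {"hand_y": 1.5, "head_y": 1.7},
    {"hand_y": 1.9, "head_y": 1.7},
]
print(simulate(ArmRaiseLibrary(), virtual_frames))  # ['arm_raised']
```

The point of the abstract-base-class pattern here is that the same simulated frame stream can drive any library that implements the interface, which is what allows a library tested in simulation to be reused unchanged in the real environment.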

Keywords

Invoked reality · Natural user interface · Virtual simulation

Notes

Acknowledgments

This research was supported by the MSIP(Ministry of Science, ICT and Future Planning), Korea, under the ITRC(Information Technology Research Center) support program (IITP-2016-H8501-16-1014) supervised by the IITP(Institute for Information & communications Technology Promotion).


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  • Sohyun Sim (1)
  • Seoungjae Cho (1)
  • Wei Song (2)
  • Simon Fong (3)
  • Yong Woon Park (4)
  • Kyungeun Cho (1)
  1. Department of Multimedia Engineering, Dongguk University-Seoul, Seoul, Republic of Korea
  2. Department of Digital Media Technology, College of Information Engineering, North China University of Technology, Beijing, China
  3. Department of Computer and Information Science, University of Macau, Macau, China
  4. Agency for Defense Development, Daejeon, Republic of Korea
