Development of simulator for invoked reality environmental design

Abstract

Invoked Reality (IR) research uses a variety of sensors and projection devices for Natural User Interface (NUI) interaction. Each device requires an appropriate action region, which must be considered when implementing an IR environment; arranging this equipment therefore makes such implementations difficult. The research presented in this paper overcomes this difficulty by proposing the "IR Simulator," which allows an IR environment to be designed and experimented with in a virtual environment. The IR Simulator comprises an "experimental space editor" and a "simulation manager." Its GUI-based experimental space editor supports the design of diverse IR environments, and its interworking interface connects the simulator to external IR libraries. A connected library can then be simulated using the virtual sensor data generated by the simulator. In experiments with the proposed simulator, a variety of IR environments were generated and an arm motion recognition library was simulated. Furthermore, the environment designed in the simulator was easily reproduced in the real world, and the results of running the arm motion recognition library in that real environment were similar to those obtained in simulation.
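The architecture described above, in which a simulation manager feeds virtual sensor data to an external IR library, can be sketched with a callback-style interworking interface. This is a minimal illustration only; all class and function names here are hypothetical, as the paper does not publish an API.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class VirtualSensor:
    """A sensor placed in the virtual experimental space (hypothetical)."""
    name: str
    position: tuple  # (x, y, z) placement in metres

    def read(self, t: float) -> dict:
        # Stand-in for the simulator's generated virtual sensor data:
        # a synthetic right-hand joint position that sweeps over time.
        return {"sensor": self.name, "t": t,
                "joints": {"right_hand": (0.1 * t, 1.0, 0.5)}}

@dataclass
class SimulationManager:
    """Forwards virtual sensor frames to an external library via a callback."""
    sensors: List[VirtualSensor] = field(default_factory=list)
    library: Callable[[dict], str] = lambda frame: "none"

    def step(self, t: float) -> List[str]:
        # One simulation tick: read every sensor, pass each frame
        # to the connected recognition library, collect its labels.
        return [self.library(s.read(t)) for s in self.sensors]

def toy_arm_recognizer(frame: dict) -> str:
    """Toy stand-in for an arm motion recognition library."""
    x, _, _ = frame["joints"]["right_hand"]
    return "arm_raised" if x > 0.5 else "idle"

manager = SimulationManager(
    sensors=[VirtualSensor("kinect_front", (0.0, 1.0, 2.0))],
    library=toy_arm_recognizer)
print(manager.step(1.0))   # → ['idle']
print(manager.step(10.0))  # → ['arm_raised']
```

The design point being illustrated is the decoupling: the library sees only sensor frames, so the same recognition code can later consume real sensor data in the physical environment, which is how the paper's simulated and real experiments can be compared.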




Acknowledgments

This research was supported by the MSIP (Ministry of Science, ICT and Future Planning), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2016-H8501-16-1014) supervised by the IITP (Institute for Information & communications Technology Promotion).

Author information

Corresponding author

Correspondence to Kyungeun Cho.


About this article

Cite this article

Sim, S., Cho, S., Song, W. et al. Development of simulator for invoked reality environmental design. Multimed Tools Appl 76, 11407–11428 (2017). https://doi.org/10.1007/s11042-016-4116-5

Keywords

  • Invoked reality
  • Natural user interface
  • Virtual simulation