Mixed Reality Stock Trading Visualization System

  • Dariusz Rumiński
  • Mikołaj Maik
  • Krzysztof Walczak
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10850)

Abstract

In this paper, we present a novel mixed reality system for supporting stock market trading. The system is designed to enhance traders’ working environment by displaying an array of virtual screens that visualize financial stock data and related news feeds within the user’s surroundings. We combined the nVisor ST50 headset with the InertiaCube4 and Leap Motion devices to enable head-orientation tracking and hand-based control of the VR/AR environment. Users can create and control the virtual screens directly with their hands in 3D space.
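To illustrate the kind of interaction the abstract describes, the following is a minimal, hypothetical sketch (not the authors’ implementation) of two building blocks such a system might use: detecting a pinch gesture from fingertip positions, as a Leap Motion device reports them, and placing a virtual screen in front of the user based on head position and yaw from an orientation tracker. All function names, coordinate conventions, and thresholds are assumptions for illustration.

```python
import math

# Hypothetical pinch threshold: fingertips closer than 3 cm count as a pinch.
PINCH_THRESHOLD = 0.03  # metres

def is_pinch(thumb_tip, index_tip, threshold=PINCH_THRESHOLD):
    """Detect a pinch when thumb and index fingertips are close together.

    Fingertip positions are (x, y, z) tuples in metres, as a hand tracker
    such as the Leap Motion might report them.
    """
    return math.dist(thumb_tip, index_tip) < threshold

def place_screen(head_position, head_yaw_deg, distance=1.5):
    """Place a virtual screen `distance` metres in front of the user.

    Uses the head position and yaw angle (degrees) from an orientation
    tracker; y is up, yaw 0 faces +z in this sketch's convention.
    """
    yaw = math.radians(head_yaw_deg)
    x = head_position[0] + distance * math.sin(yaw)
    z = head_position[2] + distance * math.cos(yaw)
    return (x, head_position[1], z)

# Example: fingertips 1 cm apart -> pinch; spawn a screen facing the user.
if is_pinch((0.0, 0.0, 0.0), (0.01, 0.0, 0.0)):
    screen_pos = place_screen((0.0, 1.7, 0.0), head_yaw_deg=0.0)
    # screen_pos == (0.0, 1.7, 1.5)
```

In a real system the pinch test would run per frame against live tracking data, with hysteresis to avoid flicker at the threshold; the sketch only shows the geometric core of gesture-triggered screen placement.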

Keywords

Mixed reality · AR · VR · Visualization · Stock data · Natural interaction · Leap Motion

Acknowledgements

This research work has been supported by the Polish National Science Centre (NCN) Grant No. DEC-2016/20/T/ST6/00590.

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Dariusz Rumiński¹
  • Mikołaj Maik¹
  • Krzysztof Walczak¹

  1. Poznań University of Economics and Business, Poznań, Poland