Abstract
A major concern in mixed-reality (MR) environments is supporting intuitive and precise user interaction. Various input modalities have been proposed and used, including gesture, gaze, voice, hand tracking, and even dedicated devices such as external controllers. However, these modalities can feel unfamiliar and physically demanding to end-users, leading to difficulties and fatigue. One solution worth investigating further is to use an everyday object, such as a smartphone, as an external device for interacting with MR. In this paper, we present the design of a framework for developing an external smartphone controller that extends user input in MR applications, and we use it to implement a new interaction modality: a tap on the phone. We also report the findings of a user study (n=24) examining the performance and user experience of the proposed input modality through a comparative evaluation task. The findings suggest that incorporating a smartphone as an external controller has the potential to enhance user interaction in MR tasks requiring high precision, and they highlight the value of offering alternative means of user input in MR applications, depending on the given task and the personalization needs of the end-user.
Acknowledgment
This work has been financially supported by the Hellenic Foundation for Research & Innovation (HFRI) under the 2nd Call for proposals for H.F.R.I. Research Projects to Support Faculty Members and Researchers, under the project entitled Electroencephalography and Eye Gaze driven Framework for Intelligent and Real-Time Human Cognitive Modelling (CogniX) with Proposal ID 3849.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Papadoulis, G., Sintoris, C., Fidas, C., Avouris, N. (2023). Extending User Interaction with Mixed Reality Through a Smartphone-Based Controller. In: Abdelnour Nocera, J., Kristín Lárusdóttir, M., Petrie, H., Piccinno, A., Winckler, M. (eds) Human-Computer Interaction – INTERACT 2023. INTERACT 2023. Lecture Notes in Computer Science, vol 14142. Springer, Cham. https://doi.org/10.1007/978-3-031-42280-5_27
DOI: https://doi.org/10.1007/978-3-031-42280-5_27
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-42279-9
Online ISBN: 978-3-031-42280-5
eBook Packages: Computer Science (R0)