Abstract
Despite the spread of augmented reality (AR) systems and their applications across many areas, the adoption of AR in industrial contexts remains relatively limited. We conducted an exploratory user study to identify the specific barriers to HMD AR technology adoption in industrial settings, as recent work has highlighted potential benefits of its application to the interpretation of 3D measurement data. The task-based study was designed to engage users in interaction with volumetric data of both static and time-series nature. We compared the actions users performed in lab vs. in situ conditions, simulating real process-tomography measurement data visualisations for granular bulk-solids flow in large containers. The study results reveal concrete directions for further work on gestural interaction and visualisation techniques that might eventually enable wider adoption of HMD AR systems in the industrial context.
1 Introduction
Technological progress significantly influences the development of systems supporting access to information and data visualization. Augmenting the visual channel and changing the perception of the real environment by overlaying computer-generated information on the natural world is crucial for applying augmented reality (AR) across different domains of life [1,2,3]. Studying the ways of communicating and interacting with augmented reality elements is an essential subject of research across many levels and fields of AR application. The use of this technology is changing the way we work. AR supports physically performed activities by providing the user with visual perception beyond the real world [4, 5]. It enriches the real world with additional content, such as video or images, allowing the user to act simultaneously in the real and the digital world. Special attention is given to the use of AR in industrial applications [6].
There is potential to further enrich the experience and usability of AR head-mounted devices (HMD AR) in the industrial context with gestural interaction and input. This would be especially helpful for monitoring and maintaining ongoing processes through measurement data, where HMD AR adds the value of freeing the hands from constantly holding a device, as required by smaller AR-enabled equipment such as tablets. Gesture interaction with hand-held devices, such as touch-screen surfaces, is widely explored in the literature [7]. However, since understanding how the context of use influences gesture interaction with 3D data remains challenging, we designed an experiment conducted both under lab conditions and in situ, in a real industrial flow-rig setting. This exploratory study investigates how users interact with measurement data of a 3D nature with the aid of a HoloLens HMD in different settings.
2 Related Work
Interaction with AR systems has been investigated extensively. From this perspective, we can analyse how users interact with objects and data visible in augmented reality. Most researchers focus only on visualizing objects or data and on developing better methods for gesture recognition. However, interaction with the augmented world also includes manipulating AR objects or datasets directly, influencing their form in order to analyse the information hidden within them.
2.1 Gestural Interaction and Data Visualization in AR Systems
A comparison of hand gesture-based and multi-touch interaction in AR systems, across different visual contexts, shows an advantage for hand gestures [8]: hand gesture interaction is faster than multi-touch interaction in terms of task completion time. Studies have also been conducted to determine which gestures are most intuitive for users [9]. Movements such as scaling, moving, deleting and approving are often used in such user studies. However, the common approach is to show the user directly what they should achieve, and then ask them to make the move by which they would achieve that goal. Additionally, gestural interaction is typically investigated on generic objects with no special purpose, so tasks are devoted to object manipulation without a wider, determined aim. Such gesture studies can miss users' full interaction with the problem they are asked to solve and limit natural user interaction with AR data. Another study worth noting examined stock-exchange data visualization in AR [10]: a 3D representation of financial data with hand-gesture interaction was evaluated only with respect to data analysis under limited time and given tasks.
An interesting gesture study involved manipulating objects of different scales, rotating a house, and rearranging its rooms [11]. The authors explored how the scale of AR content affects the gestures people expect to use when interacting with 3D holograms. It was shown that one- or two-handed gestures were applied depending on the size of the manipulated object: for large objects, participants used both hands, while for small objects they used two fingers. The tasks were not complex and consisted of sequences of separate gestures; only the gestures were analysed, not the objects or the work performed with them.
2.2 Augmented Reality in an Industrial Setting
The direct application of AR systems in industry is widely researched, yet seldom implemented. The possibility of using popular modern AR systems based on mobile devices such as smartphones/tablets and smart glasses (Apple ARKit, Google ARCore, and Microsoft HoloLens) in an industrial context has been investigated in terms of localisation quality across a large industrial area [12]. The impact of using AR systems during device assembly instead of a paper manual has also been widely examined [13,14,15]. The most important limitation to note is the effort involved in creating a manual in AR compared to a paper one, and the possibility of a serious mistake. Augmented reality has also been tested in monitoring industrial flow processes, finding application in drug diagnosis and simulations [16] and in in-situ analysis and monitoring of measurement data [17]. As shown in [18], most AR applications in industry involve assembly processes, providing instructions to users on how to perform scheduled activities. Other uses include remote assistance, improved user safety in industrial spaces, and on-site inspection and monitoring of industrial processes. The technologies involved range across displays of different sizes (primarily tablets), projected AR views, and HMDs.
3 Experimental Study Description
The main goal of the study is to reveal what kinds of gestures participants use when conducting 3D data analysis in augmented reality, and whether there are differences depending on the context; n = 20 participants were recruited. The prototype AR app was based on 3D data model visualisation supporting baseline performance and enabling basic manipulations helpful for fundamental tasks performed with the data under normal conditions [19]. The chosen datasets were of the electrical capacitance tomography (ECT) type [20, 21], covering the gravitational flow in a silo-discharging process. Figure 1 presents the in-lab and in situ experimental spaces, as well as the types of projected AR visuals, which can be treated as two types of interface alignment [22]. Observations and interviews were conducted during 20 experiment sessions: 15 were held under lab conditions, in an empty classroom space, and the remaining 5 took place in situ, at the semi-industrial tomography flow measurement lab. Each session started with a brief introduction to the tomography system and to image interpretation in the context of the process.
The participants were required to perform 4 tasks (described in Table 1) with no time constraints. They were encouraged to think aloud about what they wanted to do and how, which allowed the researchers to gather more data by taking notes on their comments. Afterwards, semi-structured interviews were recorded, and the observations were archived.
4 Results Overview
The results of the 4 conducted tasks lead us to identify 4 main aspects of interaction: (i) the user's location and position relative to the projected object, (ii) projected object displacement, (iii) object rotation, and (iv) slicing & extracting sub-elements to gain deeper insight into the projected visuals (as illustrated in Fig. 2).
Each of the identified gesture and interaction groups was analysed further to look for patterns throughout the study session and form conclusions and recommendations:
- (i) localisation and positioning: the results showed that participants preferred to look at the cylinder from where they stood rather than move around it. Occasionally they moved closer to the visualization, yet no significant differences between lab and in situ industrial conditions were observed.
- (ii) displacement: most object-displacement moves were one-handed, except under real in situ conditions, where users tended to use both hands for grabbing and manipulating the object. Notably, we identified individual cases where participants used index-finger pointing gestures to move the object to the desired destination.
- (iii) rotation: most participants performed this task with the implemented gesture of rotation by grabbing the object with two hands, yet a considerable portion of users also tried to rotate the object with one hand.
- (iv) slicing & extracting: extracting single images from the stack of images (3D data) was observed in the more complex tasks (T2 and T3). Again, users tended to use both hands in situ, while one-handed interaction was generally preferred otherwise. Notably, slicing was the most diverse action with regard to the gestures users proposed: single-finger gestures were observed in the empty lab condition, but none were observed in situ.
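The slicing and extracting operation described above can be illustrated with a minimal sketch. This is not the authors' implementation; the array shapes and axis conventions are hypothetical, standing in for a stack of reconstructed ECT cross-section images along the silo axis:

```python
import numpy as np

# Hypothetical volume: 100 stacked cross-sections along the silo axis,
# each a 32x32 reconstructed ECT image (illustrative shapes only).
stack = np.random.rand(100, 32, 32)

def extract_slice(volume, axis, index):
    """Return a single 2D slice cut from a 3D image stack along one axis."""
    return np.take(volume, index, axis=axis)

# A horizontal cross-section (one ECT frame) vs. a vertical axial cut.
horizontal = extract_slice(stack, axis=0, index=50)
vertical = extract_slice(stack, axis=1, index=16)
print(horizontal.shape, vertical.shape)  # (32, 32) (100, 32)
```

In the study, a user's slicing gesture would map to choosing `axis` and `index`; the extracted 2D plane is then what the HMD renders as a detached sub-element.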
Figure 3 shows the mean rating of each category evaluated by users together with its mean weight, the width of each bar indicating the weight [23]. The graph shows that, on average, physical and temporal demand were not rated highest. Performance is the most highly rated and most heavily weighted category, indicating that most users were preoccupied with their performance during the tasks. Frustration is also heavily weighted, indicating that frustration mattered to many users. Overall, mental demand is not rated highly, which is promising for implementation of the technology, as it suggests that task complexity did not increase because of the HoloLens, although effort is one of the two highest-rated categories.
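The weighted ratings behind a figure of this kind follow the standard NASA-TLX procedure [23]: six subscales rated 0–100, weights obtained from 15 pairwise comparisons, and an overall score given by the weight-scaled mean. A minimal sketch, using made-up example values rather than the study's data:

```python
# Example subscale ratings on the 0-100 TLX scale (illustrative values only).
ratings = {"mental": 35, "physical": 20, "temporal": 25,
           "performance": 60, "effort": 55, "frustration": 45}
# Weights = number of times each category won a pairwise comparison;
# across 6 categories there are 15 comparisons, so weights sum to 15.
weights = {"mental": 2, "physical": 1, "temporal": 1,
           "performance": 5, "effort": 3, "frustration": 3}

assert sum(weights.values()) == 15
overall = sum(ratings[k] * weights[k] for k in ratings) / 15
print(round(overall, 1))  # 47.7
```

In the figure, each bar's height corresponds to a mean rating and its width to the mean weight, so wide, tall bars (performance, frustration) dominate the overall workload score.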
5 Discussion and Conclusions
This exploratory study demonstrated patterns of possible use of HMD AR technology for a specific industrial flow-inspection application. Initial results revealed that the gestural behavior of users interacting with a virtual 3D data visual may differ between safe lab conditions and real industrial settings. In open-space situations, with little to no potential risk while using the application, users tended to focus on optimal solutions such as operating with one hand. Analysis of the object-movement gestures shows that, when creating an AR application for an industrial environment, it is important to implement grabbing functionality for both one and two hands.
Some minor problems revealed in the study may have been associated with factors such as: the large-scale silo and the size of its projections, coupled with the limited HoloLens display area; and habits formed by everyday touchscreen gestures designed for flat, tangible surfaces, which do not fully transfer to in-the-air interaction. In the context of designing for HoloLens 2, the focus should be on improving the comfort of work. Some users, who wore the goggles for about 10–15 min, reported that they were uncomfortable; longer work in the goggles may be problematic for users due to the uncomfortable head mounting.
The most frustrating problem for users was the failure to recognize the gestures they wanted to use, which made their subsequent attempts more nervous. Despite a few shortcomings, users were enthusiastic about AR and its use, suggesting that as augmented reality becomes part of everyday life, users will show a desire to immerse themselves in it. The main conclusions derived from this study are as follows:
- A noticeable difference between the gestural interactions performed by the participants in a neutral versus an industrial environment was observed.
- There are physical and technical limitations of HoloLens HMDs; hence the context of working within industrial settings must be considered when designing a particular solution.
- There may be benefits to implementing several different types of gestures for the same operation/command/task simultaneously, to accommodate the needs of different users.
These results suggest further exploratory research on this topic. The revealed patterns highlight the need to combine single- and two-handed gestures when building applications for industrial use. Furthermore, it was noticed that some users treat working with a 3D dataset as working with a physical object, while others treat it as a flat, touchscreen-like visual. Some differences observed between users' behavior in the semi-industrial setting and in the empty room remain to be investigated in the future. All the examined cases suggest the need to consider both the surroundings and the context when designing augmented reality applications for industrial settings. Finally, it would be interesting to explore combinations of gestural interaction with other sensing technologies, such as EMG [24] or ultrasound-based sensing [25], to involve machine learning algorithms [26, 27] for optimising the mixture of gestural, voice and traditional input [28], and to further explore the eye-tracking modality for tracking users' attention and performance [29,30,31].
References
Zhang, Y., Nowak, A., Romanowski, A., Fjeld, M.: On-site or remote working?: an initial solution on how COVID-19 pandemic may impact augmented reality users. In: 2022 International Conference on Advanced Visual Interfaces (AVI 2022), Article no. 65, p. 3. ACM, New York (2022). https://doi.org/10.1145/3531073.3534490
Gerup, J., Soerensen, C.B., Dieckmann, P.: Augmented reality and mixed reality for healthcare education beyond surgery: an integrative review. Int. J. Med. Educ. 18(11), 1–18 (2020). https://doi.org/10.5116/ijme.5e01.eb1a
Juanes, J.A., Hernández, D., Ruisoto, P., García, E., Villarrubia, G., Prats, A.: Augmented reality techniques, using mobile devices, for learning human anatomy. In: Conference on Technological Ecosystems for Enhancing Multiculturality (TEEM 2014), pp. 7–11. ACM, New York (2014). https://doi.org/10.1145/2669711.2669870
Dubois, E., Nigay, L.: Augmented reality: which augmentation for which reality?. In: Designing Augmented Reality Environments (DARE 2000), pp. 165–166. ACM, New York (2000). https://doi.org/10.1145/354666.354695
Aromaa, S., Aaltonen, I., Kaasinen, E., Elo, J., Parkkinen, I.: Use of wearable and augmented reality technologies in industrial maintenance work. In: Proceedings of the 20th International Academic Mindtrek Conference (AcademicMindtrek 2016), pp. 235–242. ACM, New York (2016). https://doi.org/10.1145/2994310.2994321
Gattullo, M., Evangelista, A., Uva, A.E., Fiorentino, M., Boccaccio, A., Manghisi, V.M.: Exploiting augmented reality to enhance piping and instrumentation diagrams for information retrieval tasks in industry 4.0 maintenance. In: Bourdot, P., Interrante, V., Nedel, L., Magnenat-Thalmann, N., Zachmann, G. (eds.) EuroVR 2019. LNCS, vol. 11883, pp. 170–180. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-31908-3_11
Wobbrock, J.O., Ringel Morris, M., Wilson, A.D.: User-defined gestures for surface computing. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI 2009, Boston, MA, USA, pp. 1083–1092. ACM, New York (2009). https://doi.org/10.1145/1518701.1518866
Kim, M., Choi, S.H., Park, K.-B., Lee, J.Y.: User interactions for augmented reality smart glasses: a comparative evaluation of visual contexts and interaction gestures (2019)
Piumsomboon, T., Clark, A., Billinghurst, M., Cockburn, A.: User-defined gestures for augmented reality. In: Kotzé, P., Marsden, G., Lindgaard, G., Wesson, J., Winckler, M. (eds.) INTERACT 2013. LNCS, vol. 8118, pp. 282–299. Springer, Heidelberg (2013). https://doi.org/10.1007/978-3-642-40480-1_18
Rumiński, D., Maik, M., Walczak, K.: Visualizing financial stock data within an augmented reality trading environment. Acta Polytechnica Hungarica 16(6), 223–239 (2019)
Pham, T., Vermeulen, J., Tang, A., MacDonald Vermeulen, L.: Scale impacts elicited gestures for manipulating holograms: implications for AR gesture design. In: Proceedings of the 2018 Designing Interactive Systems Conference (DIS 2018), pp. 227–240. ACM, New York (2018). https://doi.org/10.1145/3196709.3196719
Feigl, T., Porada, A., Steiner, S., Löffler, C., Mutschler, C., Philippsen, M.: Localization limitations of ARCore, ARKit, and Hololens in dynamic large-scale industry environments. In: Proceedings of the 15th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - GRAPP, pp. 307–318 (2020). https://doi.org/10.5220/0008989903070318
Redžepagić, A., Löffler, C., Feigl, T., Mutschler, C.: A sense of quality for augmented reality assisted process guidance. In: 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR), pp. 129–134 (2020). https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00046
Büttner, S., Prilla, M., Röcker, C.: Augmented reality training for industrial assembly work - are projection-based AR assistive systems an appropriate tool for assembly training? In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI 2020), pp. 1–12. ACM, New York (2020). https://doi.org/10.1145/3313831.3376720
Hebenstreit, M., Spitzer, M., Eder, M., Ramsauer, C.: An industry 4.0 production workplace enhanced by using mixed reality assembly instructions with Microsoft HoloLens. In: Hansen, C., Nürnberger, A., Preim, B. (Hrsg.) Mensch und Computer 2020 - Workshopband., Gesellschaft für Informatik e.V. (2020). https://doi.org/10.18420/muc2020-ws116-005
Meyer, S.: Augmented reality in the pharmaceutical industry a case study on HoloLens for fully automated dissolution guidance (2021)
Nowak, A., Zhang, Y., Romanowski, A., Fjeld, M.: Augmented reality with industrial process tomography: to support complex data analysis in 3D space. In: 2021 ACM International Symposium on Wearable Computers (UbiComp 2021), pp. 56–58. ACM, New York (2021). https://doi.org/10.1145/3460418.3479288
Fernando de Souza Cardoso, L., Martins Queiroz Mariano, F.C., Zorzal, E.R.: A survey of industrial augmented reality. Comput. Ind. Eng. 139, 106159 (2020). https://doi.org/10.1016/j.cie.2019.106159. ISSN 0360-8352
Sulikowski, P., Zdziebko, T.: Deep learning-enhanced framework for performance evaluation of a recommending interface with varied recommendation position and intensity based on eye-tracking equipment data processing. Electronics 9(2), 266 (2020). https://doi.org/10.3390/electronics9020266
Hampel, U., et al.: A review on fast tomographic imaging techniques and their potential application in industrial process control. Sensors 22(6) (2022). https://doi.org/10.3390/s22062309
Rymarczyk, T., Kłosowski, G., Kozłowski, E., Tchórzewski, P.: Comparison of selected machine learning algorithms for industrial electrical tomography. Sensors 19(7) (2019). https://doi.org/10.3390/s19071521
Sulikowski, P., Zdziebko, T.: Horizontal vs. vertical recommendation zones evaluation using behavior tracking. Appl. Sci. 11(1), 56 (2021). https://doi.org/10.3390/app11010056
Hertzum, M.: Reference values and subscale patterns for the task load index (TLX): a meta-analytic review. Ergonomics 64, 869–878 (2021). https://doi.org/10.1080/00140139.2021.1876927
Woźniak, M., Pomykalski, P., Sielski, D., Grudzień, K., Paluch, N., Chaniecki, Z.: Exploring EMG gesture recognition-interactive armband for audio playback control. In: 2018 Federated Conference on Computer Science and Information Systems, pp. 919–923 (2018)
Soleimani, M., Rymarczyk, T.: A tactile skin system for touch sensing with ultrasound tomography. TechRxiv. Preprint (2022). https://doi.org/10.36227/techrxiv.21332655.v1
Rymarczyk, T., Król, K., Kozłowski, E., Wołowiec, T., Cholewa-Wiktor, M., Bednarczuk, P.: Application of electrical tomography imaging using machine learning methods for the monitoring of flood embankments leaks. Energies 14, 8081 (2021). https://doi.org/10.3390/en14238081
Romanowski, A., et al.: Interactive timeline approach for contextual spatio-temporal ECT data investigation. Sensors 20, 4793 (2020). https://doi.org/10.3390/s20174793
Pomykalski, P., Woźniak, M.P., Woźniak, P.W., Grudzień, K., Zhao, S., Romanowski, A.: Considering wake gestures for smart assistant use. In: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (CHI EA 2020), pp. 1–8. ACM, New York (2020). https://doi.org/10.1145/3334480.3383089
Sulikowski, P., Ryczko, K., Bąk, I., Yoo, S., Zdziebko, T.: Attempts to attract eyesight in e-commerce may have negative effects. Sensors 22, 8597 (2022). https://doi.org/10.3390/s22228597
Sulikowski, P., Kucznerowicz, M., Bąk, I., Romanowski, A., Zdziebko, T.: Online store aesthetics impact efficacy of product recommendations and highlighting. Sensors 22, 9186 (2022). https://doi.org/10.3390/s22239186
Schrader, A., et al.: Toward eye-tracked sideline concussion assessment in eXtended reality. In: ACM Symposium on Eye Tracking Research and Applications (ETRA 2021), pp. 1–11, Article no. 7. ACM, New York (2021). https://doi.org/10.1145/3448017.3457378
Rights and permissions
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Copyright information
© 2023 The Author(s)
Cite this paper
Walczak, N. et al. (2023). Towards Gestural Interaction with 3D Industrial Measurement Data Using HMD AR. In: Biele, C., Kacprzyk, J., Kopeć, W., Owsiński, J.W., Romanowski, A., Sikorski, M. (eds) Digital Interaction and Machine Intelligence. MIDI 2022. Lecture Notes in Networks and Systems, vol 710. Springer, Cham. https://doi.org/10.1007/978-3-031-37649-8_21
DOI: https://doi.org/10.1007/978-3-031-37649-8_21
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-37648-1
Online ISBN: 978-3-031-37649-8
eBook Packages: Intelligent Technologies and Robotics (R0)