1 Introduction

Virtual reality (VR) describes technologies that aim to simulate a virtual environment as accurately as possible using virtual-reality headsets, also called head-mounted displays (VRHMDs). These devices typically consist of a screen with lenses mounted in front of the user’s eyes and additional motion sensors inside the device. The headset is worn like a pair of ski goggles and is often connected to a computer that renders a three-dimensional (3D) world to the user’s eyes. The main difference to “conventional” 3D movies is that the computer uses the input of the motion sensors (and, in newer devices, additional external tracking hardware) to continuously update the user’s head position and orientation and adjust the displayed image accordingly. Modern devices like the HTC Vive perform these updates roughly every 22 ms (Luckett 2018). Taken together, these factors allow the user to look around naturally, walk around and even interact with a virtual world, which can create a feeling of being “present” in a different location, immersing the user in the displayed content.

While the most obvious use case for this technology is probably the entertainment and gaming sector, VR has also gained significant traction in professional fields such as medicine.

There are numerous use cases for VR in medicine; among the most exciting are realistic training of surgical procedures, more realistic 3D imaging, preoperative planning and even remote surgical operations (Boedecker et al. 2021; Ghaednia et al. 2021; Mishra et al. 2022; Verhey et al. 2020).

Furthermore, VR can assist in phobia treatment by exposing patients to controlled anxiety-inducing scenarios (Park et al. 2019; Salehi et al. 2020). For example, patients with a fear of spiders can be exposed to virtual spiders (Hinze et al. 2021; Lindner et al. 2020). Patients with a fear of heights can virtually experience standing on the edge of a skyscraper to become more comfortable with the situation while knowing that they are actually safe on the ground (Rimer et al. 2021). It has also been shown that VR can help diagnose specific phobias more easily and more objectively by having patients navigate an automated VR program and analyzing their behavior (Binder et al. 2022; Lindner et al. 2020).

VR has also been shown to be beneficial in the rehabilitation of stroke patients, with balance improving significantly when VR therapy is included in the rehabilitation process (Jung et al. 2012; Kim et al. 2015, 2016; Voinescu et al. 2021).

Apart from medical applications, other professional examples are construction companies that use the technology to visualize building concepts (Ashgan et al. 2023; Ghobadi and Sepasgozar 2020) and car manufacturers and designers who assess the overall impact of a planned car design without the need to construct expensive prototypes (de Clerk et al. 2019; Gong et al. 2020; Lawson et al. 2016), among many others.

As this exciting technology matures and becomes more broadly available, its use cases, and therefore its adoption, are expected to keep increasing. Recently, Apple, one of the largest computer manufacturers and the largest company in the world (Randewich and Datta 2023; Statista 2023), announced its first VRHMD with a clear focus on productivity applications (Apple 2023; Nieva and Cai 2023). With increasing adoption, it also becomes more important to be aware of the potential risks and problems associated with this technology.

A key aspect of using virtual reality is movement: the ability to move around objects, look at them from different perspectives and possibly interact with them. This quickly becomes a problem if the physical space available to a user is smaller than the space the simulation requires. For example, if a surgical operation is to be simulated but only a small room is available, the user could not walk around the virtual operating table without colliding with real-life objects such as walls or a desk, unless a large space were dedicated solely to the VR demonstration. To bypass this restriction, artificial movement (AM) is often used in VR, meaning that the simulated viewpoint is moved in a way that does not correspond 1:1 to a physical movement in the real world.

There are many different approaches to implement AM in VR. The most common methods are floating movement and teleportation.

Floating movement describes a gradual transfer of the simulated viewpoint from one point to another while the user is standing still.

Teleportation, on the other hand, refers to an instantaneous change of the camera position from one point to another.

While solving the problem of limited space, AM can introduce new problems such as Simulator Sickness (SS).

SS is a complex of 16 symptoms, such as nausea, headaches and vomiting, that was originally described for users of flight simulators, but it can also occur when using VR applications (Kennedy et al. 1993). Previous research suggests that the occurrence of SS in VR is the result of a variety of factors, with one of the main factors being artificial movement (Bimberg et al. 2020; Christou and Aristidou 2017; Duzmanska et al. 2018; Kolasinski 1995; Rebenitsch and Owen 2016; Saredakis et al. 2020; So et al. 2001; Stanney et al. 2020; Vlahovic et al. 2018).

Previous research has also found that the effects of SS can last well beyond the initial exposure, especially if users experience high levels of SS. Notably, Bos et al. reported in 2005 that some users took more than 2 h to recover from SS symptoms after being exposed to virtual ship movement (Bos et al. 2005).

The medical significance of VR therefore lies not only in its use cases in the medical field: with the expected increase in adoption and the resulting growth of its user base, it becomes increasingly important to be aware of the potential adverse side effects the technology might have on users.

Especially with VR being increasingly used in critical environments such as healthcare, it is important to conduct proper testing of the causes of SS in VR. For example, it would obviously be unfavorable if a surgeon were to use VR as a last-minute preparation for a surgery only to be impaired by symptoms of SS afterwards while attempting to perform the procedure.

Avoiding SS in VR becomes even more critical when developing tools for patient intervention. For example, as mentioned above, VR is being used in stroke patient rehabilitation (Kim et al. 2015; Montoya et al. 2022; Voinescu et al. 2021). However, it is well known that stroke patients often struggle with impaired balance (Bonan et al. 2013; Yelnik et al. 2006). While this might be a reason why VR interventions benefit stroke patients in the first place, it is important to control the level of challenge to the vestibular system created by the VR software. This way the benefits for patients can be maximized while avoiding an unwanted occurrence of SS, which might reduce patient acceptance or worsen treatment outcomes.

This leads to the question: what is the most suitable way to implement artificial movement without causing SS in users, and how can it be properly evaluated?

Most of the research done on the topic of SS in VR uses commercially available games or assets intended for action games (Farmani and Teather 2020; Saredakis et al. 2020; Yildirim 2020). This poses a potential source of error, since games are meant to entertain and excite the user; the excitement or stress of playing might skew the results, as stress symptoms caused by the game content can be mistaken for symptoms of SS (Saredakis et al. 2020).

Saredakis et al. conducted a large study on the effects of content on the occurrence of SS in VR applications. They found that content which they classified as scenic or minimalistic produced significantly lower levels of SS than gaming content (Saredakis et al. 2020).

The aim of this study was to develop a software platform for VR testing that provides as little stimulation as possible apart from the actual movement, in order to overcome the bias introduced by testing with entertainment software. Such a platform can help shed light on the causes of SS and thereby aid the adoption of VR in the medical field. Furthermore, the platform should be easy to modify so that it can be adapted to ongoing developments in VR and fit as many use cases as possible.

2 Methodology

2.1 Software requirements

To test the tolerability of different movement types, a software application had to be developed, and the following criteria were defined for its development:

1. The software must be easy to use for a single examiner.

2. The software should provide ways to test teleportation and floating movement. Both should be implemented in a passive way, meaning that they are controlled by a timer rather than by the participant. This ensures that testing is conducted in a repeatable manner and that the potential user base is as wide as possible, since VRHMDs that do not integrate motion controllers are supported as well.

3. Functionality for motion controllers should be implemented as well, including tracking and a 3D model, so that the program can easily be modified to add more (user-controlled) movement types with little effort.

4. Natural walking should be integrated for compatible headsets. Natural walking means that the user walks physically and the real movements are translated into the virtual environment. It is therefore not an artificial movement, but it can be used as a control against which the artificial movement types are evaluated.

5. Based on the findings of Saredakis et al., the software should feature only simple interactions for the user, with a scene that is not “exciting” (Saredakis et al. 2020), in order to reduce the probability that the software’s content induces symptoms of SS independently of the movement method used.

6. Aside from the form of movement itself, the test should be identical for all movement types, including natural walking.

2.2 Virtual reality hardware and test system

The software was developed and tested using the Vive virtual reality system by HTC. It offers a high resolution to provide a clear image for the user and is capable of very precise, low-latency tracking by utilizing external laser stations called “Lighthouse” (Luckett 2018). Furthermore, the system includes two motion controllers which are tracked by the same laser system to follow the user’s hand positions and enable the user to interact with the environment (Luckett 2018; Niehorster et al. 2017).

The software was used on a computer running Windows 10 with an Intel Core i7-6700K processor, 32 GB of RAM and an Nvidia GeForce GTX 1080 Ti graphics card.

2.3 Software development environment

The software was created using Unreal Engine 4, version 4.18. Unreal Engine is a development environment primarily used to create video games but can be utilized to create other (3D) software as well. As an advantage, Unreal Engine 4 includes native support for SteamVR, which makes the software compatible with a wide variety of VRHMDs through the OpenVR standard (Takahashi 2015). Additionally, the Unreal Engine 4 development environment features a functionality called ‘Blueprints’, which allows users to assemble program logic from visual building blocks, lowering the entry barrier. This makes it easier for less experienced programmers to understand and modify software developed in Unreal Engine 4. (Epic Games, n.d.-a, n.d.-b; Epic Games and Cowley 2015)

2.4 Validation

The proposed software is currently being used in an upcoming study on the effects of different artificial movements in VR. Part of the data gathered in this study will be used to determine whether the software was successful in its goal of not inducing SS symptoms in users through the displayed content.

In order to measure the amount of SS experienced by participants, the study uses the Simulator Sickness Questionnaire (SSQ) developed by Kennedy et al., one of the most widely used tools for evaluating simulator sickness in VR (Bimberg et al. 2020; Kennedy et al. 1993; Saredakis et al. 2020). It is a standardized questionnaire that records 16 symptoms experienced by participants on a scale of 0 to 3 (no, slight, moderate, severe symptoms) and calculates a total simulator sickness score as well as scores for three subscales (nausea, oculomotor and disorientation) (Bimberg et al. 2020; Kennedy et al. 1993).
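As an illustration of the scoring procedure, the following minimal C++ sketch applies the weighting factors published by Kennedy et al. (1993) to the raw subscale sums; the function and variable names are ours and not part of the questionnaire itself, and the raw sums are assumed to follow the item-to-subscale assignment of the original publication.

```cpp
#include <iostream>

// Minimal sketch of SSQ scoring as described by Kennedy et al. (1993).
// The raw subscale values are the sums of the 0-3 ratings of the items
// assigned to each subscale in the original publication (some items
// contribute to more than one subscale).
struct SSQScores {
    double nausea;
    double oculomotor;
    double disorientation;
    double total;
};

SSQScores scoreSSQ(int nauseaRaw, int oculomotorRaw, int disorientationRaw) {
    SSQScores s;
    s.nausea         = nauseaRaw         * 9.54;   // weighting factors from Kennedy et al.
    s.oculomotor     = oculomotorRaw     * 7.58;
    s.disorientation = disorientationRaw * 13.92;
    s.total          = (nauseaRaw + oculomotorRaw + disorientationRaw) * 3.74;
    return s;
}

int main() {
    // Example: a participant reporting a few slight to moderate symptoms.
    const SSQScores s = scoreSSQ(2, 3, 1);
    std::cout << "Total SSQ score: " << s.total << "\n";
}
```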

In the mentioned study, participants filled out a copy of the questionnaire before testing (more precisely, before even putting on the VRHMD) and then completed a test run using the software’s natural walking mode. Immediately after completing the test, the participants filled out another copy of the questionnaire.

The SSQ scores reported in the pretest will be compared with the scores from the natural walking test run. If the software does not induce SS symptoms, there should be no significant difference between the scores. To confirm the absence of a difference, equivalence testing will be conducted.
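For readers unfamiliar with equivalence testing, the following sketch outlines the decision rule of a paired two one-sided tests (TOST) procedure; the equivalence margin, the exact test variant used in the study and the names in the code are assumptions for illustration. The critical value would be taken from the t distribution with n - 1 degrees of freedom.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Sketch of a paired TOST equivalence test.
// pre, post: SSQ scores of the same participants before and after the test run.
// margin:    pre-specified equivalence margin (smallest difference considered relevant).
// tCrit:     one-sided critical value of the t distribution with n - 1 degrees of freedom.
// Returns true if both one-sided tests reject, i.e. the scores are statistically equivalent.
bool pairedTOST(const std::vector<double>& pre, const std::vector<double>& post,
                double margin, double tCrit) {
    const std::size_t n = pre.size();
    double meanDiff = 0.0;
    for (std::size_t i = 0; i < n; ++i) meanDiff += post[i] - pre[i];
    meanDiff /= static_cast<double>(n);

    double var = 0.0;
    for (std::size_t i = 0; i < n; ++i) {
        const double d = (post[i] - pre[i]) - meanDiff;
        var += d * d;
    }
    const double se = std::sqrt(var / (n - 1)) / std::sqrt(static_cast<double>(n));

    const double tLower = (meanDiff + margin) / se;  // H0: true difference <= -margin
    const double tUpper = (meanDiff - margin) / se;  // H0: true difference >= +margin
    return tLower > tCrit && tUpper < -tCrit;
}
```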

The study required two additional movement types to be added to the software: a user-controlled floating movement using the HTC Vive’s motion controllers and a passive floating movement with an added vertical component.

3 Results

3.1 Software

For the actual implementation of the virtual testing environment, it was decided that the program should feature a rectangular room of 3.3 m × 3 m with white walls and minimal decoration, keeping the environment simple while still providing some reference points to prevent disorientation (Fig. 1). The room size was chosen to give users enough space to walk a few steps towards each target location while keeping the physical space required to use the software small. A non-square, asymmetrical footprint was chosen to make orientation easier. Colors and materials were loosely based on the physical test room where the tests would be conducted.

Fig. 1

Virtual test room as perceived by the user in the VRHMD (left) and the physical test room (right)

The test starts with the user standing in the middle of the room, tasked with walking across predetermined target locations within the room. For this, the next target is indicated to the user by a marking on the ground, so that the user only has to follow this marking to complete the test. The marking was implemented as a white plane (as seen in Figs. 1 and 2) which moves to the next target location once an overlap with the user’s head position is registered.
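The underlying logic of the target marking is straightforward; the following engine-agnostic C++ sketch illustrates it. Class and parameter names (e.g. TargetMarker, overlapRadius_) and the numeric values are illustrative, as the actual software implements this behaviour in Unreal Engine 4 Blueprints.

```cpp
#include <array>
#include <cmath>
#include <cstddef>

// 2D position on the room floor (height is ignored for the overlap check).
struct Vec2 { float x, y; };

// Sketch of the target-marker logic: the white plane sits on one of the
// predefined target locations and jumps to the next one as soon as the
// user's head position (projected onto the floor) overlaps with it.
class TargetMarker {
public:
    explicit TargetMarker(const std::array<Vec2, 26>& sequence)
        : sequence_(sequence) {}

    // Called every frame with the tracked head position projected onto the floor.
    void advanceIfReached(const Vec2& headOnFloor) {
        const Vec2& target = sequence_[index_];
        const float dx = headOnFloor.x - target.x;
        const float dy = headOnFloor.y - target.y;
        if (std::sqrt(dx * dx + dy * dy) < overlapRadius_ &&
            index_ + 1 < sequence_.size()) {
            ++index_;  // move the white plane to the next target location
        }
    }

    Vec2 currentTarget() const { return sequence_[index_]; }

private:
    std::array<Vec2, 26> sequence_;  // 26 targets across 5 room positions
    std::size_t index_ = 0;
    float overlapRadius_ = 0.3f;     // illustrative overlap radius in metres
};
```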

For the test, a total of 26 target locations across 5 different positions in the room were implemented. The 5 different positions are located in the four room corners as well as in the middle of the room.

Fig. 2

Test workflow (X = user, blue arrow = user movement, white arrow = target movement, grey dot = target)

3.2 AM implementation and software control

Teleportation was implemented based on a timer that teleports the user across the different target locations at 5-second intervals.

For floating movement, two different options were implemented: an automated movement, again at 5-second intervals to allow users to reorient themselves, and a user-controlled movement, in which a trigger on the motion controller gives the user control over the movement towards the target locations. User-controlled movement uses the controller orientation as the movement direction: when the trigger button is pushed, floating movement in the controller’s pointing direction is initiated, allowing the user to steer intuitively towards the target location.
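To make the difference between the movement modes explicit, the following engine-agnostic C++ sketch outlines the per-step logic of passive teleportation, passive floating and controller-steered floating. The function names, the constant-speed interpolation and the parameters are illustrative assumptions; the actual implementation uses Unreal Engine 4 Blueprints.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Passive teleportation: called by a 5-second timer, the viewpoint jumps
// instantly to the next target location.
Vec3 teleportStep(const Vec3& target) {
    return target;  // instantaneous change of the camera position
}

// Passive floating: called every frame, the viewpoint glides towards the
// target at a constant speed until it arrives.
Vec3 floatStep(const Vec3& position, const Vec3& target,
               float speed, float deltaSeconds) {
    const Vec3 diff{target.x - position.x, target.y - position.y, target.z - position.z};
    const float dist = std::sqrt(diff.x * diff.x + diff.y * diff.y + diff.z * diff.z);
    const float step = speed * deltaSeconds;
    if (dist <= step || dist == 0.0f) return target;
    return {position.x + diff.x / dist * step,
            position.y + diff.y / dist * step,
            position.z + diff.z / dist * step};
}

// User-controlled floating: while the trigger is held, move along the
// controller's pointing direction (assumed to be a unit vector).
Vec3 userFloatStep(const Vec3& position, const Vec3& controllerForward,
                   bool triggerHeld, float speed, float deltaSeconds) {
    if (!triggerHeld) return position;
    const float step = speed * deltaSeconds;
    return {position.x + controllerForward.x * step,
            position.y + controllerForward.y * step,
            position.z + controllerForward.z * step};
}
```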

Control of the software by the examiner was performed using predefined keyboard shortcuts to execute different sections of the software.

The default state of the software at startup is the natural walking mode without AM.

The examiner has the following options to control the software by pressing the associated keys on the keyboard (a sketch of this key-to-mode mapping is shown after the list):

R: (re)starts the software in natural walking mode and resets the progression through the target locations.

1: enables user float mode; the participant can now use the triggers on the motion controller to float towards the target locations.

2: starts passive teleportation: the user is teleported to the target locations at 5-second intervals.

3: starts passive floating movement: the user is passively floated towards the target locations at 5-second intervals.

4: starts an additional test mode: a slope is brought into the room, making the floor uneven. Any movement performed within this area therefore involves additional vertical movement. As mentioned in 2.4, movement types 1 and 4 were added for an upcoming study evaluating the effects of different artificial movements. Though not part of the initial requirements, these movement types were left in place to facilitate future use of the software and to demonstrate its easy customizability.
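The control scheme thus amounts to a simple key-to-mode mapping, sketched below in C++; the mode names are illustrative, as the actual key handling is implemented in Blueprints.

```cpp
// Sketch of the examiner-facing keyboard control: each hotkey switches the
// software into one of the implemented movement modes.
enum class MovementMode {
    NaturalWalking,   // R: restart in natural walking mode, reset target progression
    UserFloat,        // 1: trigger-controlled floating movement
    PassiveTeleport,  // 2: teleportation at 5-second intervals
    PassiveFloat,     // 3: floating movement at 5-second intervals
    SlopeFloat        // 4: movement over an uneven floor with a vertical component
};

MovementMode modeForKey(char key, MovementMode current) {
    switch (key) {
        case 'R': case 'r': return MovementMode::NaturalWalking;
        case '1':           return MovementMode::UserFloat;
        case '2':           return MovementMode::PassiveTeleport;
        case '3':           return MovementMode::PassiveFloat;
        case '4':           return MovementMode::SlopeFloat;
        default:            return current;  // ignore unmapped keys
    }
}
```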

3.3 Collision avoidance

Usually, VR applications that involve movement require the use of virtual boundaries to avoid possible collisions.

For this, the first initialization of a VR headset requires the definition of a use area (playspace) that should be free of obstacles. After defining this area, the user can walk around freely. As soon as the user moves close to the borders of the playspace, a collision with real-life objects must be assumed to be possible, with the user not necessarily aware of the danger, since the headset blocks their view of the real world. To alert the user to possible dangers, the VR software can display virtual boundaries to prevent the user from walking out of the playspace or colliding with objects (Norrie 2017).

While there is little research on whether these virtual boundaries can affect the occurrence of SS in users, it seems likely, as they represent an artificial-looking element in the virtual world, similar to other game mechanics that might trigger SS (Saredakis et al. 2020).

We therefore wanted to create a solution to remove the need for such boundaries.

To achieve this, the starting position in the software is tied to the center position of SteamVR. This means that if the participant stands in the middle of the physical test area, they will also be placed in the middle of the virtual room when the software is executed. If the user stands one meter to the right of the center of the test area, the starting position will be shifted by the same amount to the right of the virtual room’s center. This eliminates the need for possibly disorientating virtual boundaries. Due to this mechanism, it is sufficient to instruct a participant not to walk through the walls of the virtual room in order to avoid collisions with real-life obstacles. The only requirements are that the physical test space is at least as large as the virtual room and is set up in SteamVR accordingly.
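Conceptually, this corresponds to a simple offset mapping between the SteamVR playspace and the virtual room, as sketched below; the function and parameter names are illustrative, and in the actual software the effect results simply from placing the virtual room around the SteamVR origin.

```cpp
// 2D position on the floor plane.
struct Vec2 { float x, y; };

// Sketch of the mapping between the physical playspace and the virtual room:
// the virtual room's center is anchored to the SteamVR playspace center, so a
// user's offset from the physical center carries over 1:1 into the virtual room.
Vec2 virtualPosition(const Vec2& roomCenter, const Vec2& playspaceCenter,
                     const Vec2& trackedUserPosition) {
    const Vec2 offset{trackedUserPosition.x - playspaceCenter.x,
                      trackedUserPosition.y - playspaceCenter.y};
    return {roomCenter.x + offset.x, roomCenter.y + offset.y};
}
```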

It should be noted that AM inherently changes the relationship between the virtual room and the physical space: as soon as AM is used, the displayed walls no longer match the edges of the physical test space. It is therefore important that the examiner ensures that participants walk back to the middle of the room before starting any of the AM scenarios and watches the participant closely while conducting AM tests. Since virtual boundaries are handled through SteamVR (or other VRHMD-specific software interfaces), it is still possible to enable them for additional safety, for example if the examiner is not able to watch the participant for the entire duration of testing.

3.4 Test results

The software was used to conduct testing with 111 participants.

The participants received no compensation for participating in the study. 41% (n = 46) of the participants were male and 59% (n = 65) female. The median age was 26 years, with the youngest participant being 18 and the oldest 65 years old. The resulting SSQ scores for natural walking show little to no change compared to the pretest scores (Fig. 3; Table 1). The equivalence test also showed significant equivalence (and therefore no significant difference) between the two scores, suggesting that the test run using natural walking did not induce SS symptoms in the users (Table 2).

Fig. 3

SSQ results; Pre = pretest, 1 = natural walking

Table 1 SSQ results; MD = median, MW = mean, SD = standard deviation
Table 2 Results of equivalence testing, pretest vs. natural walking

4 Discussion

Since the adoption of VR has been increasing over the last years and is expected to continue to do so in the future, it is necessary to develop more structured test procedures that allow systematic assessment of the tolerability of artificial movements as well as of the VRHMDs themselves. Such tests can only be performed in a meaningful way with software that allows potential causes of SS to be evaluated in as isolated a manner as possible. As outlined before, this is especially important considering the increasing adoption in professional fields, where the occurrence of SS might present a serious problem instead of being just a nuisance.

Therefore, we developed a software platform integrating the findings of other researchers to provide an environment that eliminates other causes of SS (as introduced by gaming-based content) and provides a robust test basis for motion-induced SS.

The software allows different artificial movement methods to be tested in a comparable manner. Furthermore, it is easy to modify and extend, and the environment is simple enough to be adapted to different testing conditions with little effort.

Since the software uses SteamVR, it is compatible with virtually all popular VR headsets. Researchers can therefore benefit from the falling prices of commercial VRHMDs when acquiring adequate test setups. It is also possible to use motion controllers if needed and to easily swap in fitting 3D models. This also enables meaningful comparisons between different VRHMDs.

In preparation for a study, additional functionality was added to the software with very little effort, demonstrating the easy customizability of the platform and its suitability for simple adaptations to new testing scenarios in the future.

In our testing, natural walking showed no significant change in the SSQ scores reported by the users when compared to the SSQ scores gathered before testing. This suggests that the software was successful in its goal of not inducing symptoms of simulator sickness through the displayed content.

For future development, it is planned to add a graphical user interface (GUI) to the software, giving the examiner not only the options to select functionality that are currently available via hotkeys, but also the ability to make simple changes to the room dimensions and target locations. While these customizations can currently be achieved easily using Blueprints, a GUI will further simplify setup and allow modifications without requiring any knowledge of Unreal Engine or software development on the examiner’s part.

An additional aim of this project was to make the software easy to use for one examiner. In principle this worked well. By using hotkeys on the keyboard – instead of having to type in commands – the examiner could quickly access the software’s different functionalities.

However, during testing it became apparent that the examiner had to keep track of the cable connecting the VRHMD to the PC. This meant that the examiner had to step to the computer to execute inputs for the software and then quickly return to the participant to keep the cable out of the way. This was less problematic with the artificial movement methods (because the participant barely moved and effectively only turned in place), but especially with natural walking the examiner needed to be well acquainted with the software and the target locations to manage this alone. This could be made easier in the future by adding an option for a delay of a few seconds between pressing the hotkey and the beginning of the test run. Other approaches to mitigate this problem are to include a second examiner, use a wireless VRHMD, or use a VR backpack (essentially a PC mounted on the user’s back) (Harkönen 2022; notebookcheck.com 2016) as the test system.

The resulting software can be used to evaluate the tolerability of different movement types relative to one another and thus contribute to a better understanding of how to design software that is less likely to cause SS in users. It can further be used to evaluate different VRHMDs against each other.

An important scenario would be, for example, a hospital that plans to implement VR in its workflow and needs to choose an appropriate headset for training or for interventions such as balance training for stroke patients. Numerous studies have shown that patients can benefit greatly from VR interventions; however, it has also been shown that VR applications can lead to severe SS symptoms. To increase patient acceptance, it is therefore necessary to develop and select VR systems and software that minimize the effect of SS on patients.

In conclusion, we believe that the software we developed can be an effective platform for conducting repeatable and reliable testing of simulator sickness in VR while minimizing potential content-based influences on SS such as action-game elements. The current version is already easy to use, and usability can be further improved in the future by adding a GUI and more movement types to adapt to ongoing developments in the VR space. The software can also serve as a basis for developing new software for medical treatments that is less prone to induce SS in patients, thus making the benefits of VR treatments accessible to more patients.