
Challenges in passenger use of mixed reality headsets in cars and other transportation

Abstract

This paper examines key challenges in supporting passenger use of augmented and virtual reality headsets in transit. These headsets will allow passengers to break free from the restraints of physical displays placed in constrained environments such as cars, trains and planes. Moreover, they have the potential to allow passengers to make better use of their time by making travel more productive and enjoyable, supporting both privacy and immersion. However, there are significant barriers to headset usage by passengers in transit contexts. These barriers range from impediments that would entirely prevent safe usage and function (e.g. motion sickness) to those that might impair their adoption (e.g. social acceptability). We identify the key challenges that need to be overcome and discuss the necessary resolutions and research required to facilitate adoption and realize the potential advantages of using mixed reality headsets in transit.

Introduction

Advances in transportation are such that there will be a significant conversion from drivers to passengers over the coming years, i.e. in transit but not necessarily responsible for the continuous navigation or control of a vehicle. This is due both to the increasing availability/usage of public transport and the impending adoption of autonomous vehicles. As a consequence, the not inconsiderable time spent across the population as passengers is expected to continue to rise. A recent report showed that long commutes in the UK have risen (Trades Union Congress 2015) to the extent that UK car journeys typically last \(\sim\) 22 min (Department for Transport 2016) and commutes last \(\sim\) 55 min (Press Association 2015), whilst in the USA drivers spend \(\sim\) 56 min per day in transit (U.S. Department of Transportation 2009). For flights, Heathrow alone reported 78 million passengers in 2017 (Heathrow 2017), whilst train journeys in the UK have doubled since the 1990s, with 145 million long-distance journeys in 2018 (Office of Rail 2018). And, more broadly, 3.7 million workers in the UK travel more than 2 h every weekday (TUC 2016). In effect, a considerable portion of our lives is spent as passengers, undergoing journeys that can be perceived as repetitive and whose duration is often considered wasted time (Gardner and Abraham 2007; Watts and Urry 2008).

Given this, an increasingly important question is how best to support passengers to make the most of this travel time, regardless of the mode of transportation they undertake. Observations of passenger behaviours have tended to reveal that modern passengers now alternate between the expected, productive activities—reading, watching videos—and looking ahead, or out of a window (Russell et al. 2011), with such behaviours also observed in autonomous driving scenarios (Hecht et al. 2020). The conversion of drivers to passengers, fuelled in part by autonomous car adoption, has the potential to better enable passengers to perform these productive activities, with considerable benefits for society economically.

However, there are open questions regarding how technology can best assist passengers in making the most of their time in transit. For example, how should displays be integrated or embedded into the given mode of transit: via dashboard surfaces (Ng and Brewster 2016; Ng et al. 2017) or seatback displays (Prince 2014), head-up displays (HUDs) (Pauzie 2015), on (Häuslschmid et al. 2015) or over (Matt Kamen 2016) windows, etc.? And what are the consequences of this ubiquity of displays? It has been noted in air travel for example that “already there are too many screens in the plane with monitors on the seats and passengers bringing on their range of personal devices. When people are trying to rest it is already difficult with the glare of all these devices. It would be good to focus some work on individual private spaces” (Frangakis et al. 2014). There are also issues regarding how interaction should be supported in transit (Ng and Brewster 2016; Shakeri et al. 2016), whilst, most fundamentally of all, there is the “elephant in the room” (Diels et al. 2016) of motion sickness which, particularly with road-based transit, has the potential to significantly impede passenger adoption and usage of technology (McGill et al. 2017). Looking out of windows is a common, necessary and often enjoyable way to minimize motion sickness—albeit at the expense of disrupting any non-travel-related activities. There is a pressing need to examine such issues, given that the act of driving helps to prevent motion sickness, and viewing stable content (e.g. reading a book, watching a movie on a tablet) in a motion environment can trigger motion sickness symptoms (Diels and Bos 2016, 2015; Sivak and Schoettle 2015). And, despite the fact that autonomous cars will allow for a radical redesign of the interior (e.g. rear-facing front seats, windows becoming displays (Diels and Bos 2016)), as with other modes of transport such as planes, trains or boats, passengers will still fundamentally perceive themselves as being in a constrained, and often overly familiar, space for prolonged periods of time during their journey.

Against the context of passenger adoption of technology, we suggest that there is sufficient justification to explore the use of mixed reality (MR) headsets, encompassing both virtual reality (VR) and augmented reality (AR), for use by passengers across a variety of transportation, with a particular focus on in-car and in-flight usage (Fig. 1). Assuming that autonomous vehicles reach the level of full autonomy (SAE Levels 4/5 (International 2016)), trust in said vehicles is sufficient (Häuslschmid et al. 2017), and comfort/security concerns can be assuaged, usage of MR headsets by all passengers in transit becomes a feasible possibility. MR headsets offer a number of benefits over existing in-transit displays. They have the potential to improve both the usability and ergonomics of existing passenger activities by comfortably rendering and placing 2D and 3D content with depth anywhere around the passenger. The nature of the medium is such that content is innately personal and private, unless shared through software. And the headsets themselves are low power, with consumption similar to that of a smartphone, growing with the fidelity of rendering and the consequent graphics processing required. VR headsets in particular also allow users to appear to entirely escape the confines of the vehicle and become present and immersed in a different, entirely virtual, environment (often referred to as “place illusion”, a component of presence (Slater 2009)). Notably, because MR headsets have the capability to track head orientation and render content on that basis, they also potentially have complete control over how motion is visually perceived. This has significant implications for combating motion sickness (McGill et al. 2017), commonly resulting from a mismatch between how motion is physically (e.g. via the vestibular system) and visually perceived (Reason and Brand 1975; Zhang et al. 2016). Consequently, we might envisage the passenger experience improving in terms of:

  • Productivity The user sits down in their autonomous vehicle for the daily commute. Their AR headset renders a wide virtual workspace around them, allowing them to begin work immediately, and look forward to leaving work earlier as a result.

  • Entertainment Gazing at the real landscape in front of them, the train passenger’s AR headset highlights landmarks of note and generates new games and experiences out of the available landscape and location, e.g. rendering characters running alongside the vehicle in a high-speed platformer game.

  • Isolation As the seatbelt sign turns off after take-off, the passenger puts on their VR headset, entirely occluding the sights and sounds of reality, finding themselves in a relaxing virtual home cinema, modelled after their own home for comfort.

Fig. 1

Two use cases of MR displays in transit of particular note are (left) in-car (McGill et al. 2017) and (right) in-flight (Williamson et al. 2019), as both use cases have the potential to feature private/secure surroundings with journeys of long durations

However, there are a number of challenges that could impede the adoption and usage of MR headsets in transit, and consequently the realization of such visions. This paper outlines what we consider to be the most significant of these, based on our previous research regarding VR headset usage in-car and a review of the emerging literature intersecting the domains of VR/AR headsets, in-car interactions and human factors challenges in autonomous cars. Firstly, we briefly discuss the problems with existing in-vehicle display technology that justify the deployment of MR headsets in transportation. Secondly, we consider the most pressing problems regarding supporting MR headsets in transit, which are split into two categories:

  • Functional impediments to MR usage in-motion These are fundamental problems that effectively rule out safe and proper passenger usage of headsets currently, such as headsets being unable to retain a forward bearing in-motion, being untested with respect to crashworthiness and being likely to induce motion sickness.

  • Impediments to acceptance/adoption Assuming MR headsets can function correctly in-motion, and are safe to use, barriers to their acceptance and adoption by passengers should be considered. For example, the social acceptability of use across different forms of transport would seem likely to vary significantly, dictated by issues such as managing awareness (of fellow passengers and the travel environment), ensuring passengers are still approachable and accessible, etc. Conversely, providing compelling reasons to use headsets in transit might positively influence adoption, e.g. gamified experiences that rely on motion could provide more exciting and enjoyable journeys. And achieving/maintaining parity with headset features that users may come to know as standard, such as telepresence, will be important to meet existing user expectations of the technology.

In this paper, we focus predominantly on passenger usage of MR headsets for tasks other than driving. We explicitly exclude driver-assistive AR (Gabbard et al. 2014; Kun et al. 2016)—whilst some of the problems discussed are applicable to both passenger and driver MR headset use, such use cases bring with them a host of new considerations regarding distraction, trust, communication and safety and should be considered as separate from the passenger MR headset use case. We also concentrate on headsets (both auditory and visual) rather than integrated or mobile displays, for the aforementioned potential benefits (privacy, ergonomics, etc.) and because they offer the most transportation-agnostic way of enabling MR passenger experiences. If, as anticipated, such headsets see further adoption by consumers for everyday usage (e.g. immersive VR for entertainment, or wearable AR for mobile spatial computing), it is reasonable to consider how we can support and take advantage of these headsets when they are inevitably transposed to transit contexts by users. Finally, we also exclude discussion of the network infrastructure required to deliver live/streamed mobile MR experiences, with a number of challenges identified in the mobile delivery of low-latency, high-fidelity MR content (Mangiante et al. 2017; Elbamby et al. 2018). This problem will likely be resolved by the advent of 5G networks (Orlosky et al. 2017), and the usage of offline content means we do not consider this to be a significant impediment to initial adoption.

Broadly, the intention of this paper is to outline both why support for MR headsets in transit should be considered, and in what areas concerted research effort across academia (with an emphasis on usability and HCI) and industry is required to bring about safe and compelling MR headset experiences for passengers, regardless of the mode of transport.

The case for passenger MR headsets

The problems with in-vehicle displays

Phones, tablets and laptops are a common sight inside planes, trains, boats, buses and cars, offering passengers a distraction from the sights and sounds of their journey. This has long been a necessary feature of travel, with entertainment providing a “diversion from speed and the risk of the catastrophic accident via screens”, first noted during “nineteenth-century railroad journey(s), in which the new practice of ‘panoramic vision’ and the reading of newspapers by passengers onboard railcars served as a ‘stimulus shield”’ (Groening 2013). As a more contemporary example, in-flight entertainment has been suggested to act as “an intermediary, screening out the fact of flight and the events of travel...crucial to keeping passengers calm, occupied and content” (Groening 2013) and has been noted to significantly improve the perceived comfort of passengers (Ahmadpour et al. 2014a; Patel and D’Cruz 2017).

For the modern passenger, in addition to personal devices, any number of in-built displays are often available, e.g. dashboard displays in-car and seatback displays for passengers seated in the rear (Wilfinger et al. 2011). Recent advancements include projection-based AR HUDs (Pauzie 2015) and windows (Rao et al. 2014a; Häkkilä et al. 2014; Haeuslschmid et al. 2016; Rao et al. 2014b), whilst technologies such as flexible OLED displays mean that every surface could effectively become a display, as demonstrated in concept cars (Mercedes-Benz 2016). The necessity of windows may even be questioned, and they may just become displays, occluding outside visibility for presentation (Matt Kamen 2016). (See Fig. 2 for examples.) This technological integration and adoption has led to the modern passenger experience of distraction being very different to that of railroad passenger journeys a century ago. Now, passengers can find themselves ensconced in a “techno-cocoon” with technology acting as a “sensory filter...crucial for sensory privacy and exertion of control” (Groening 2016).

Fig. 2

Examples of different in-vehicle display concepts. Top left: seatback displays (Thales 2018); top middle: Ford windshield movie screen; top right: Toyota AR windows (Toyota 2011); bottom: Mercedes-Benz concept car with every surface as a display (Mercedes-Benz 2016)

Indeed, as Groening (2016) notes, “at least since the 1970s, passengers have sought more forms of separation between themselves and those other passengers perceived as ‘undesirable’ ”. However, all of the aforementioned displays fall short of this underlying aim of separation. This is because they are effectively integrated into the shared physical environment, and consequently they share common problems and drawbacks. They are often limited in terms of size, and thus immersion (Cummings et al. 2015). Privacy is rarely assured, with displays frequently gaze-accessible to other passengers. They are subject to glare and reflections from the ever-changing lighting of the outside environment and can require a gaze angle that, at the very least, can be sub-optimal in terms of comfort (e.g. staring downwards, or staring at one fixed place continually for long periods, noted to cause neck problems due to a lack of variety of head movement (Farias Zuniga and Côté 2017)). At worst, they are also more likely to induce motion sickness, termed nauseogenic visual displays (Golding and Gresty 2015). Given passengers often require visual awareness of the motion of the car to avoid motion sickness (Elbanhawi et al. 2015; Diels and Bos 2015), some view of the outside world or the motion of the vehicle may be necessary, with any restrictions of the view having further consequences for motion sickness. Fundamentally, these displays have to work around, and within, their physical context, whilst passengers still perceive themselves as being in the constrained and repetitive space of a car, plane, train or bus.

The benefits of MR headsets: immersion, limitless display spaces, and visual perception of motion

Conversely, MR headsets can potentially overcome many of these problems. They are unrivalled in terms of immersion/presence (Cummings et al. 2015), with privacy dictated by software constraints rather than physical visibility. Occlusion issues are no longer relevant as any view can be presented in the headset, whilst interactions can move with the user (given the headset is subject to the same oscillations as the body). Healthy, comfortable gaze angles can be enforced as content can be displayed anywhere around the user, and moved at any time. New possibilities for interaction (e.g. via gaze (Lucero and Vetek 2014) or direct touch (Chan et al. 2010)) and communication (e.g. telepresence where those you are addressing appear in your local environment (McGill et al. 2016; Orts-Escolano et al. 2016)) can be supported. And stereoscopic MR headsets can also render content with depth.

The VR Hyperspace EU project, which finished prior to the advent of new, cheap, consumer VR headsets such as the Oculus Rift, noted the benefits of VR usage in transit, with their stated aim being to “enhance the passenger comfort through...(the) adoption of virtual and mixed reality technologies in the future air cabin” (Cappitelli et al. 2014). More recently, airlines have also tested VR headsets for use in-flight (Air France 2017; Qantas 2015), with benefits even shown for safety knowledge transfer, such as learning how to wear an aircraft life preserver (Chittaro et al. 2018). There have also been a number of recent industry explorations of VR headset usage in both planes (Kuchera 2015; Gulliver 2017; Holly 2017) and autonomous cars, such as by Renault (Dent 2017) and Apple (Rober et al. 2018). As a blogger for The Economist noted: “virtual reality headsets on planes mean we can isolate ourselves from irritating cabin-mates” (Gulliver 2017).

MR headsets allow for immersive gaming and entertainment experiences (Cummings et al. 2015) and will inevitably also support general productivity applications (McGill et al. 2015). But they can also alter the passenger experience in more subtle ways. The VR Hyperspace project previously noted that such technology could change the perception of self and space, improving passenger experience and comfort through new virtual “surroundings, real imagery and live flight data in addition to fantasy environments” (Frangakis et al. 2014). Indeed, it has been noted that:

Virtual environments can fully or partially distract people from sources of discomfort, becoming more effective when they are interesting. They are also more effective at distracting people from discomfort caused by restricted space than noise disturbances (Lewis et al. 2016)

MR headsets can also better take into account passenger attitudes towards their environment. For example, Ahmadpour et al. (2016) classified these attitudes in terms of: adjusting, avoiding, approaching and shielding, linked to “passengers’ concerns for control, privacy, social connectedness and/or social tolerance”. MR experiences can facilitate many of these behaviours. For example, they allow content to be positioned optimally based on user comfort and ergonomics. They can avoid violations of personal space caused by others’ digital activity by providing a private virtual personal space. And they can shield passengers from undesirable behaviours such as auditory or visual noise. Creating the perception of a more personal and private space, immune to invasion by others, is a significant benefit of MR headsets (Lewis et al. 2017). On this basis, we could envisage that an MR headset user/passenger might, for example, retreat to their private, virtual workspace, with a similar layout and interactions in-car as in their workplace.

The case against passenger MR headsets

However, for such a scenario to become a reality, a number of problems must first be overcome. If we consider existing consumer MR headsets outside of their use in transit, they still have some notable limitations. In the case of VR headsets, use is limited with respect to field of view (typically ranging from \(\sim\) 90\(^{\circ }\) to \(\sim\) 130\(^{\circ }\) in current consumer models) and fidelity, comfort and weight, especially on-the-move where mobile headsets are often powered by smartphones. For AR headsets, some of these issues are more pressing; for example, the Microsoft Hololens 2 currently supports only a \(\sim\) 70\(^{\circ }\) horizontal field of view. However, we can reasonably assume that lightweight, high-fidelity MR headsets will become a consumer reality over the coming years, given the recent leaps made in terms of hardware, software and processing power. For example, consider the advances made between the release of the Oculus Rift DK1 in 2013 and the Oculus Quest in 2019. Over the space of 6 years, there were significant leaps in resolution, refresh rate, positional tracking, mobile use, fidelity, tracked controller interaction, hand tracking, etc.

Assuming that mobile consumer headsets reach the anticipated level of maturity and utility, consideration should be given towards anticipating the future impediments to the use of MR headsets in transit, so they can be dealt with in parallel with the development and mass adoption of these headsets. The VR Hyperspace project offered the most comprehensive discussion of use in transit thus far (Frangakis et al. 2014). They suggested that the most problematic barriers to the use of VR in-flight were those of:

  • Cost To passengers, transportation companies and manufacturers

  • Reluctance to be immersed in VR For example, due to mistrust in technology, being unaware of safety-related conditions, etc.

  • Standardization In terms of headset platforms and interoperability with vehicle environments

  • Security and privacy For example, supporting the sharing of personal data to drive VR experiences

Our paper builds upon this work. Based on our review of the nascent literature on passenger MR experiences, and our prior research into VR headset usage by passengers both in-car (McGill et al. 2017) and in-flight (Williamson et al. 2019), we identify and outline the most significant research challenges, classified in terms of being either functional impediments to use, or design challenges that, when overcome, might increase acceptance and adoption of these new technologies.

Functional impediments to MR usage in-motion

By functional impediments, we refer to challenges and problems that would prevent or limit normal usage of MR headsets in transit. We discuss three functional impediments that we suggest require concerted effort and consideration to overcome: how existing IMU-based head orientation tracking is confounded by the presence of external motion; the unknown safety of MR headsets in the event of a crash/airbag impact; and how discrepancies between what motion is physically perceived (e.g. via the vestibular system) and visually perceived (e.g. being present in a stationary virtual room) will lead to the onset of motion sickness. For each of these challenges, we discuss the state of the art in terms of solutions, and what further research is necessary to overcome this problem. We also categorize transportation in terms of the motion experienced by the passenger, grouped into three common types:

  • Motion type 1 Stable/constant velocity, infrequent vehicle orientation changes, little-to-no oscillations, e.g. large commercial planes or cruise ships.

  • Motion type 2 Frequent orientation changes, e.g. autonomous car journeys on motorways, intercity trains.

  • Motion type 3 Frequent vehicle velocity and orientation changes with oscillations, e.g. autonomous car journeys in cities, inner city trains/buses.

Challenge 1: maintaining a forward bearing in a moving world

Most affected: motion type 2/3 Transportation that involves frequent and/or significant changes in orientation and acceleration (e.g. cars, buses, boats).

To understand the difficulties in performing head tracking in transit, first we must discuss how MR headsets typically support rotational and positional tracking. This is achieved through IMU-based sensor fusion of a gyroscope and accelerometer, captured at a high sampling rate (\(\sim\) 1000 Hz in the latest headsets) and low latency for dead reckoning (LaValle et al. 2014). To compensate for sensor bias/drift over time, additional sensing is used for frequent corrections, so that what the user perceived as forward when they started using the headset is still in the same physical direction after prolonged use. For simple rotationally tracked headsets (e.g. the Gear VR), the magnetometer is responsible for providing this correction factor, offering a constant bearing for magnetic north. For positionally tracked headsets such as the Oculus Quest or HTC Vive and their successors, optical tracking is used to provide this correction. There are two common approaches to this end, referred to as “inside-out” and “outside-in” tracking. Inside-out tracking refers to headset-based sensing for determining the position in the world. In the case of the HTC Vive and successors such as the Vive Focus, “Lighthouse” beacons in the physical environment broadcast pulses of IR light which the headset detects (Buckley 2015). However, recent consumer headsets such as the Oculus Quest now typically use SLAM-type (Durrant-Whyte and Bailey 2006) inside-out tracking enabled by depth cameras integrated into the headsets, using computer vision to track the surrounding environment, and consequently the position of the headset within said environment. Outside-in tracking typically refers to tracking systems where the corrective sensing is embedded in the environment. The Oculus Constellation system for example uses multiple external IR cameras with IR emitters embedded in the tracked objects (Feltham 2015). Regardless of the tracking technology used, positionally tracked headsets have the benefit of knowing their absolute position in 3D space and adapting their presentation accordingly, minimizing the discrepancy between what is visually and physically perceived, and consequently minimizing simulator sickness (Davis et al. 2014) whilst better facilitating presence (Cummings et al. 2015).
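The fusion-plus-correction loop described above can be illustrated with a deliberately simplified complementary filter. This is a sketch only: production headsets use more sophisticated proprietary filters (typically Kalman-style) and fuse full 3-DoF orientations rather than a single heading angle.

```python
def fuse_heading(yaw, gyro_rate, ref_yaw, dt, alpha=0.98):
    """One complementary-filter step for the heading (yaw) axis.

    yaw:       current estimated heading (rad)
    gyro_rate: gyroscope angular rate about the vertical axis (rad/s)
    ref_yaw:   absolute heading from the corrective sensor, e.g. a
               magnetometer or optical tracking (rad)
    dt:        time step (s), ~0.001 at a 1000 Hz sampling rate
    alpha:     weighting of fast gyro integration vs. slow correction
    """
    # Dead reckoning: integrate the gyro reading over the time step...
    predicted = yaw + gyro_rate * dt
    # ...then blend in the absolute reference to cancel accumulated drift.
    return alpha * predicted + (1 - alpha) * ref_yaw
```

With a biased gyro (say a constant 0.01 rad/s error) and a stationary user, pure integration would drift by 0.01 rad every second, whereas the blended estimate remains bounded near the reference heading; the in-motion problem discussed next arises precisely because, in a vehicle, neither the gyro signal nor the corrective reference behaves as this model assumes.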

Problematically, due to the current reliance on IMU-based sensor fusion, both rotationally and positionally tracked headsets share a common problem when in-motion: any in-built IMU sensing is no longer detecting head movement alone. Instead, it is now detecting a combination of user head movement and vehicle accelerations/rotations (Fig. 3), and then applying periodic corrections based on additional, usually optical, sensing. In the case of rotationally tracked headsets (e.g. Samsung Gear VR), they will lose track of the forward bearing of the user, meaning that the MR headset will not be able to maintain a stable focus on 360\(^{\circ }\) content under motion, with the user’s view turning with the vehicle. In the case of positionally tracked headsets that rely on additional optical tracking, external motion will lead to the user experiencing judder as the corrective optical sensing contradicts the IMU sensing. And future MR headsets are likely to exhibit the same problems so long as they rely on IMU-based sensor fusion and/or use corrective tracking technology that has not been adapted for use in the uniquely problematic environment of a moving vehicle.

Fig. 3

The MR headset undergoes translations (indicated by the blue/green/red arrows indicating translations in x/y/z) and rotations (indicated by the blue/green/red circles) relative to the interior of the vehicle. However, the vehicle also undergoes translations and rotations relative to the world, which may also be detected/conflated by the headset sensing (colour figure online)

Resolving this challenge

Consequently, providing a general-purpose solution for enabling rotational and/or positional tracking for consumers across a variety of motion environments, from planes and trains to autonomous vehicles, represents a significant research problem, requiring additional sensing to either correct the headset inertial sensors and provide a stable forward bearing, or to replace the reliance on IMU-based inertial sensing for tracking headset orientation. Figure 4 details the potential sensing solutions we envisage.

Fig. 4

Existing (delineated in red) and anticipated permutations of head-mounted displays (HMDs) and vehicle sensing for providing MR passenger experiences across different types of transportation

Examples of correcting the inertial sensors can be seen in both McGill et al. (2017) and Hock et al. (2017), where a car-mounted gyroscope was used to allow car orientation changes to be subtracted from a rotationally tracked headset’s orientation. However, in this case, gyroscopic drift still occurs, so any such vehicle-based corrective IMU sensing would still require periodic corrections itself, e.g. the user explicitly instructing the headset to reorient when facing forward, or using additional sensing such as GPS or optical tracking as a means to measure the exhibited drift.
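For the heading axis, the subtraction described by McGill et al. (2017) and Hock et al. (2017) can be sketched as follows. This is a simplified illustration; the published systems operate on full orientation data and handle calibration and drift in their own ways.

```python
def cabin_relative_yaw(headset_yaw_deg, vehicle_yaw_deg):
    """Remove the vehicle's heading from the headset IMU's heading so
    the rendered view stays fixed relative to the cabin rather than
    turning with the vehicle. Assumes both IMUs were zeroed against a
    common 'forward' at calibration time and share a vertical axis.
    Result is wrapped into the range [-180, 180) degrees.
    """
    return (headset_yaw_deg - vehicle_yaw_deg + 180.0) % 360.0 - 180.0

# If the car turns 90 degrees while the passenger's head stays still
# relative to the cabin, the headset IMU also reads 90 degrees, but
# the rendered view should not move:
# cabin_relative_yaw(90.0, 90.0) -> 0.0
```

The wrap-around step matters in practice: without it, a vehicle heading crossing the 0/360 boundary would cause the rendered view to snap through a full rotation.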

Ideally, we might perform such corrections on positionally tracked headsets, given the necessity of positional tracking for avoiding simulator sickness. As Fig. 4 details, there are a number of possible permutations of sensing that can provide support for all three of the motion types discussed, and it is by no means clear what the optimal solution might be here for any given transportation. This is because positionally tracked headsets face additional challenges in-motion. For outside-in headsets, cameras or sensors need to be mounted in the vehicle environment and need to be compatible for use in-motion. For example, HTC Vive Lighthouse beacons (both V1 and V2) cannot yet be used in-motion, relying on a hard drive motor (Skarredghost 2017) that shuts down to prevent damage when external motion is detected—a solid state equivalent would be required. Passenger seating and movement may pose challenges in terms of occlusion and lighting, whilst such solutions would only be viable for a subset of transportation where such technology could be pre-installed or quickly deployed.

Inside-out headsets might appear preferable then, requiring no hardware additions to the external environment. However, depth camera-based optical tracking solutions come with their own technical challenges, being particularly suited to enclosed and stable environments with little-to-no outside visibility such as in planes. This is in contrast to environments such as car interiors that feature a large proportion of refractive and translucent surfaces such as windows (problematic for any IR-based depth camera), and a constantly shifting visual environment (problematic for binocular depth imaging and tracking features in the world). In effect, any visible external motion (e.g. moving landscape through windows) would be likely to impair tracking, requiring such headsets to be able to fixate on known, stable interior features for tracking. Nonetheless, we could perhaps envisage a peripheral which combined a single optical tracking camera or optically trackable marker and an IMU, which could be temporarily mounted on the vehicle within visibility of the MR-headset-wearing passenger. Such a device could facilitate positional tracking in a range of environments and provide corrective data regarding vehicle orientation.

At the extreme, we could envisage future tracking solutions that no longer rely on IMUs. For example, instead of headset optical tracking solutions providing corrective pose updates and positional tracking, they could be used to exclusively provide headset orientation data. However, on top of the previously discussed problems with using optical tracking in-motion, removing the reliance on IMUs poses significant technical challenges regarding latency, accuracy and processing requirements for generating the headset pose, particularly when considering Timewarp-type solutions that require high-frequency/low-latency sensor data to function (Antonov 2015).

Much as with the variety of tracking technology in current consumer MR headsets, this is a research problem with a number of viable potential solutions, each potentially better suited to different forms of transport and different use cases. For example, if we consider the case of autonomous cars, research collaborations will likely be required between headset manufacturers and the automotive industry to arrive at car designs which provide adequate support for this use case. This may be in the form of designing standards for sharing car-bearing telemetry at low latency, integrating IMU or positional tracking sensors or creating car interiors that are conducive to being optically tracked by passenger headsets (e.g. integrated IR LEDs, markers or other discrete optically trackable features). Conversely, for usage in planes, given plane bearing changes are relatively infrequent and head movement is largely restricted to rotations only, it may be that software-only solutions (such as periodic user-driven re-calibration/zeroing) or initial versions of consumer inside-out tracking (capable of tracking the plane interior given the lack of windows/consistent lighting conditions) may be sufficient to enable widespread usage with little-to-no disruption.
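The correction strategies discussed above share a common core: given world-frame orientations from both a headset IMU and a vehicle-mounted IMU (or equivalent vehicle telemetry), the vehicle's rotation can be cancelled to recover the head's pose relative to the cabin. The following is a minimal sketch in Python, assuming synchronized, drift-free sensors; all function names are ours, and a real system would also need latency compensation and positional fusion:

```python
import math

def quat_mul(a, b):
    # Hamilton product of quaternions given as (w, x, y, z)
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def quat_conj(q):
    # Inverse of a unit quaternion
    w, x, y, z = q
    return (w, -x, -y, -z)

def head_in_cabin(q_headset_world, q_vehicle_world):
    """Cancel vehicle rotation: the head's orientation relative to the cabin."""
    return quat_mul(quat_conj(q_vehicle_world), q_headset_world)

def yaw_quat(deg):
    # Rotation about the vertical (z) axis
    half = math.radians(deg) / 2.0
    return (math.cos(half), 0.0, 0.0, math.sin(half))
```

For example, if the car turns 90\(^{\circ }\) whilst the wearer's head also turns 90\(^{\circ }\) in the world frame, `head_in_cabin` yields the identity: the passenger has not moved relative to the cabin, so no rotation should be applied to the virtual scene.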

Challenge 2: physical crash safety

Most affected Transportation where recoverable crashes can occur without warning (e.g. cars), as opposed to where incidents may be forewarned and the headset removed (e.g. planes).

In the event of a crash, the seat belt restrains the passenger’s body whilst, in cars at least, the airbag is deployed to slow the change in momentum of the head. However, use of an MR headset during a crash may reduce the effectiveness of these safety measures. Firstly, MR headsets increase the mass of the user’s head (VR headsets typically weigh around 0.5 kg (The 360 Guy 2019), whilst the average human head weighs approximately 5 kg (Gekhman 2006), meaning a \(\sim\) 10% increase), and will consequently increase the force exerted on the neck during a crash. The consequences of this (e.g. regarding whiplash incidence) are yet to be established. Secondly, the user may be denied any warning cues or anticipatory information regarding an imminent crash, and thus will be unlikely to brace safely. Thirdly, where an airbag is present, it will inevitably make contact with the protruding headset. At this point, the headset could break, resulting in broken glass from the lenses and display, plastic debris and, in the case of mobile headsets, ruptured batteries. Or it could remain intact, concentrating the additional impact force of the crash onto the smaller surface area where the headset meets the wearer’s face.
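The mass figures above translate directly into extra inertial load on the neck. The following back-of-envelope sketch illustrates this, where the 30 g deceleration pulse is an illustrative assumption rather than crash-test data:

```python
# Extra neck load from headset mass during deceleration.
# All figures are illustrative assumptions, not crash-test data.
HEAD_MASS_KG = 5.0      # approximate average human head (Gekhman 2006)
HEADSET_MASS_KG = 0.5   # typical consumer VR headset (The 360 Guy 2019)
G = 9.81                # m/s^2

def neck_force(decel_g, extra_mass_kg=0.0):
    """Inertial force (N) the neck must resist at a given deceleration."""
    return (HEAD_MASS_KG + extra_mass_kg) * decel_g * G

bare = neck_force(30)                        # head alone at a 30 g pulse
with_hmd = neck_force(30, HEADSET_MASS_KG)   # head plus headset
increase = (with_hmd - bare) / bare          # equals the 10% mass ratio
```

The relative increase in neck load equals the headset-to-head mass ratio regardless of the pulse magnitude; what such an increase means for whiplash incidence remains, as noted, an open question.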

Resolving this challenge

The deformability characteristics of current consumer headsets are untested and do not appear to be considered by manufacturers. Whilst this is understandable, given that the passenger use case has yet to see significant adoption or commercial endorsement, it will need to be addressed to allow for safe, legal usage of MR headsets by passengers. There is a body of research examining the risks of eyewear such as glasses being worn during such events. Notably, whilst the additional risk found was small (Koisaari et al. 2017; Tervon and Sulander 2014), the severity of individual cases (Tsuda et al. 1999), combined with the dimensions, materials and form factors of current MR headsets, suggests that crash safety should be evaluated for any headset likely to be used in transit, such that users are at least made aware of any discovered risks. And, in time, it could be expected that crashworthiness be considered at the design stage of headsets, acknowledging their potential usage in transit and safeguarding wearers in the event of a crash.

Challenge 3: motion sickness

Most affected Motion type 2/3 in VR; transportation that involves frequent and/or significant changes in orientation, acceleration or oscillation (e.g. cars, buses, boats).

The problem of motion sickness has remained ever-present in transportation. Whilst the fundamental cause is unknown, a number of theories have helped to expand our understanding of the varied pathways causing motion sickness. For example, it is commonly theorized that a sensory mismatch between what the vestibular and visual systems perceive results in motion sickness (Reason and Brand 1975; Zhang et al. 2016). Others suggest that discrepancies in determining the subjective vertical (i.e. gravity) (Bles et al. 1998), changes in acceleration (Sawabe et al. 2017), conflict between the canal and otolith systems (Guedry and Benson 1978), lateral oscillations (Hosseini and Farahani 2015), postural instability (Riccio and Stoffregen 1991) and sway (Owen et al. 1998) all contribute to motion sickness.

Regardless of the underlying causes, if we examine road transportation in particular, autonomous cars will likely increase the incidence of motion sickness (Diels and Bos 2016, 2015; Sivak and Schoettle 2015). This is due to an increase in behaviours that induce motion sickness: greater use of displays and devices; changes to seating, with concept cars often proposing front seats that rotate to face backwards to form a more social interior; and changes to the visibility of the external environment (Elbanhawi et al. 2015; Diels and Bos 2015), leading to increased sensory mismatch. In addition, the act of driving itself provides anticipatory cues which prevent motion sickness from arising in drivers (Bertolini and Straumann 2016; Golding and Gresty 2015; Rolnick and Lubow 1991), so as drivers become passengers, more people will become sick. As Diels puts it:

...the use cases that are being envisaged for automated driving are also those we know to lead to increased levels of car sickness (Diels 2014)

MR headsets also have the potential to contribute to new forms of sensory mismatch, particularly in the case of VR headsets where reality is entirely occluded (McGill et al. 2017). Consider playing a VR game where the player controls their virtual movement independent of that of the vehicle. This could result in conveying no motion visually when the vehicle is moving, conveying motion visually that is entirely different to what is physically perceived (e.g. the car turns right but in the VR scene the view turns left) or even conveying motion which matches the physically perceived motion but at a different magnitude. The effects of such circumstances are largely unknown, but it would seem likely that, for some users, such usage will present a new test of their resilience to motion sickness, a consequence of experiencing stable/moving virtual content that may or may not be aligned with vehicle motion in reality.

Resolving this challenge

Notably, headsets also have the potential to contribute towards both understanding the causes of motion sickness and fixing the sensory mismatches that result in said sickness. As McGill et al. demonstrated (McGill et al. 2017), if the motion that is visually conveyed via the headset is congruent with what is physically perceived, motion sickness will be minimized, and it may be possible to blend visual motion cues with other stable content, allowing for a general-purpose presentation of motion for any content type. Headsets can also visualize unseen motions, for example showing orientation changes when below decks on passenger ferries/cruise liners (Carter et al. 2018a; Stevens and Butkiewicz 2019).
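In sketch form, a content-agnostic peripheral cue of the kind McGill et al. examined amounts to displacing a field of peripheral particles opposite to the vehicle's velocity each frame, so that the optic flow seen matches the motion felt. The following is a hypothetical Python fragment; the names, wrap bounds and simple per-frame update are our assumptions:

```python
def step_particles(particles, vehicle_velocity, dt, bounds=10.0):
    """Advance peripheral cue particles opposite to vehicle motion, wrapping
    within a cube of +/- bounds metres so the field appears endless."""
    vx, vy, vz = vehicle_velocity  # m/s, cabin frame
    out = []
    for (x, y, z) in particles:
        moved = (x - vx * dt, y - vy * dt, z - vz * dt)
        # wrap each coordinate back into [-bounds, bounds)
        out.append(tuple(((c + bounds) % (2 * bounds)) - bounds for c in moved))
    return out
```

Rendered only in the periphery, such a field can convey vehicle motion visually without otherwise altering the foreground content.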

In addition, if the vehicle is moving uniformly (e.g. no orientation changes, constant velocity), such as in an aircraft, motion sickness may not arise (Wienrich et al. 2017). Provoking anticipatory responses to motion through vection illusion has also shown some promise (Sawabe et al. 2017). And Soyka et al. incorporated motion into VR in the form of a magic carpet ride, finding that “brief exposure to turbulent motions [does] not get participants sick”, suggesting that VR use in-flight may not pose the same motion sickness risks as in other modes of transport (Soyka et al. 2015). VR has even been used to reduce seasickness (Carter et al. 2018b) by providing passengers below decks with a visual awareness of the orientation changes of the boat.

Head-worn equipment also suggests new possibilities for deploying vestibular counter-stimulation or desensitization, for example through Galvanic Vestibular Stimulation (Cevette et al. 2012), in conjunction with visual manipulations to reduce motion sickness. Moreover, entirely virtual displays can be presented anywhere around the user, not only potentially maximizing comfort but also minimizing car sickness, given that positioning in-car displays at eye height has been shown to have a significantly beneficial effect on motion sickness (Kuiper et al. 2018), in part due to the head being aligned with gravity.

We suggest there are significant open research questions regarding:

  • Visual motion cues What motions need to be conveyed (orientation, acceleration, velocity, etc.)? What are the design parameters (e.g. perceived magnitude of motion (Wilson et al. 2018b), portraying accelerations versus velocity, the necessary perception of optic flow (Diels 2008; Redlick et al. 2001))? How can visual motion cues be presented in virtual experiences, either interleaved with content or shown in a content-agnostic manner (e.g. as with the peripheral motion cue examined in (McGill et al. 2017))?

  • Anticipatory motion cues For example displaying impending motion (Karjanto et al. 2018) or tricking the MR passenger into performing anticipatory motion actions.

  • When motion cues should be presented For example based on user preferences or motion thresholds in an attempt to minimize the visual disruption or distraction the cues might cause.

  • The importance of rest frames Having previously been shown to be important for simulator sickness as a result of vection (Hock et al. 2017; LaViola 2000) (see Sect. 5.4 for examples of usage for immersion), what role should rest frames play in perceiving real-world motion? Is it disconcerting or motion sickness inducing to experience motion without some form of rest frame?

  • The feasibility of reducing vestibular sensitivity by employing techniques such as GVS (Cevette et al. 2012) or tDCS (Arshad et al. 2015).
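The question of when motion cues should be presented could be prototyped as a simple magnitude threshold with hysteresis, so that cues appear only under significant motion and do not flicker on and off near the threshold. The following is a sketch with illustrative, unvalidated threshold values:

```python
class MotionCueGate:
    """Show motion cues only when vehicle acceleration exceeds a user-set
    threshold, with hysteresis so the cue does not flicker on and off.
    Threshold values are illustrative assumptions, not validated figures."""

    def __init__(self, on_thresh=1.5, off_thresh=0.8):
        self.on_thresh = on_thresh    # m/s^2 at which cues become visible
        self.off_thresh = off_thresh  # m/s^2 at which cues are hidden again
        self.visible = False

    def update(self, accel_magnitude):
        # Hysteresis: a higher bar to turn on than to turn off
        if not self.visible and accel_magnitude >= self.on_thresh:
            self.visible = True
        elif self.visible and accel_magnitude <= self.off_thresh:
            self.visible = False
        return self.visible
```

The on/off thresholds could themselves be exposed as user preferences, trading visual disruption against sickness protection.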

MR headsets offer new possibilities for manipulating our perception of motion, capable of presenting content anywhere within the vehicle interior, and may assist in both understanding, and resolving, motion sickness. However, until motion can be adequately integrated into the MR rendering, or the vestibular system can be appropriately stimulated or suppressed, MR headsets will be likely to contribute to increased levels of motion sickness in the passenger use case.

Impediments to acceptance and adoption

At this point, we assume that the MR headset can effectively discriminate between user and vehicle motion, has been certified as safe to use in the event of a crash and provides effective measures for delaying or preventing the onset of motion sickness. As such, the headset is comparable in terms of physical safety to usage in non-passenger contexts, and functions correctly in-motion. The challenge now is to provide acceptable and compelling MR experiences that would justify adoption and usage in-motion. Whilst the scope of this challenge is near-limitless, we focus on the four challenges we consider most pressing: understanding, and designing for, acceptable use in shared transit; supporting interaction in constrained, confined spaces; supporting bidirectional social at-a-distance experiences (e.g. holoportation-style telepresence (Orts-Escolano et al. 2016)); and exploiting vehicle motion and context for entertainment.

Challenge 4: acceptable use in shared transit

Most affected Transportation shared with members of public (e.g. planes, trains, carpooling).

Our use of MR headsets in shared transit is challenged by a multitude of factors: the social and cultural norms regarding our fellow (known and unknown) passengers, the context of our travel environment and the practical challenges introduced by occluding reality (particularly relevant to VR headset usage). These factors are ever-evolving, with subsequent exposure to, and adoption of, MR technology in other contexts likely to influence passenger acceptance and adoption over time. Regarding social acceptability, consider a VR game that uses grabbing gestures in mid-air. Whilst likely acceptable at home or in private, on a public bus such behaviour could be highly unacceptable and even disruptive to other passengers. When interacting with technology that is highly visible to others, such as a headset occluding the face, the sustained spectatorship of other passengers creates a potentially uncomfortable situation for users and onlookers alike (Williamson et al. 2011).

Our travel context in particular would appear likely to affect our willingness to wear an MR headset or engage in an MR experience. For example, travelling with intimacy groups such as friends or family would likely bring with it social pressures to converse or engage in shared activities to mutually pass the time. Conversely, passengers might exhibit very different attitudes towards MR headset use when sharing a flight with collocated strangers (where indeed the VR headset might even be provided by the airline (Air France 2017; Qantas 2015)), versus during a daily commute when carpooling or on the train, versus a late-night bus journey. A multitude of contextual factors could contribute to varying attitudes towards usage here, from the nature of the physical environment and the proximity/relationship to other passengers, to the duration of travel and perceived personal safety within the mode of transport. And there will invariably be cultural effects, it having repeatedly been noted that different cultures often hold varying attitudes and expectations when it comes to socializing with other passengers in-transit (Baseel 2014; Studarus 2018; Smith 2016). Will the person you are seated next to on your long-haul flight resent your immediate escape into the solitude and isolation of an immersive VR experience?

Occluding reality whilst in transit could even result in a variety of issues that could make headset use unpalatable or even unsafe. Loss of awareness of other passengers means that VR/AR users may accidentally disrupt others or physically invade their space (Williamson et al. 2019). Fellow passengers may not know if they are visible to a headset user and be unsure how or if they can interact with that person (e.g. needing to squeeze past them to stand up). Emergent situations more generally—turbulence on a plane, the entry of drunk passengers to the bus—might vary willingness to engage in MR activities, if the user is even aware that such situations are occurring. As a blogger noted of their VR in-flight experience:

Once it becomes clear that you can see someone through the hardware, even though they can’t see your eyes, people don’t seem to know how to react. I turned on the camera once or twice just to look around, and a few people were openly gawking at me...It also felt way too strange to play any game that forces you to look around in an active way. It seemed almost weird to be sitting in tight space, whipping my head around to look at things only I could see. (Kuchera 2015)

Interaction with other passengers and staff is often unavoidable, for example to ask for directions or to move out of the way. In this anecdotal account, the flight attendant reacted to the VR user by “pass[ing] by without asking if I wanted anything”, given there was no obvious/acceptable way of interrupting the VR user. Safety becomes an issue if headset-wearing passengers are unaware of safety announcements and cannot react quickly to dangerous situations. There are also practical reasons to need awareness of one’s immediate surroundings, for example to protect belongings or to know when to get off a bus. The fact that users would actively choose to occlude reality and accept illusions may also be unacceptable to others (Frangakis et al. 2014), leading to tensions between passengers.

Resolving this challenge

Aspects of this challenge may resolve in time, given changing attitudes and increasing exposure to MR headsets. However, research can make it easier for users to conceive of using these headsets in shared spaces. For example, social acceptability can be addressed through more discreet, wearable and even fashionable form factors (e.g. the Bose Frames (Bose 2019)), and the design of interactions that are equally unobtrusive to carry out (see Challenge 5). A reliance on gaze-based selection, for instance, might not be appropriate if it gives the appearance that the user is staring at another passenger, perhaps requiring a different input modality better suited to the plane environment.

Social and cultural norms will inevitably evolve, and we would expect new norms to arise. But tensions here could be eased, particularly for VR usage, if we could appropriately tackle the issue of occlusion of, and awareness of, reality. The ongoing integration of cameras into VR headsets, primarily for inside-out tracking, appears likely to lead to headsets that can provide mediated awareness of the surrounding sights, sounds and emergent situations of reality, potentially improving social acceptability. For example, co-located people can be visualized in a virtual scene using depth sensing for a mixed reality experience (McGill et al. 2015). Cameras incorporated into headsets can also be used for mixed reality, for example the “Pass Through” views on the Oculus Quest or Gear VR, which use a front-facing camera to provide a view of the real world in virtual reality. Some travel contexts may be better suited to early adoption of VR/AR, for example air travel (Williamson et al. 2019; Qantas 2015), where passengers must spend extended periods of time in an enclosed and monitored space and where other passengers have been previously security screened.
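Mediated awareness of this kind could, for instance, take the form of a proximity-triggered passthrough blend, fading a camera view of reality in as another person approaches the headset wearer. The following is a hypothetical sketch; the distance thresholds are assumptions, not validated values:

```python
def passthrough_opacity(nearest_person_m, near=0.8, far=1.5):
    """Blend factor for a passthrough view of reality: fully opaque when a
    detected person is within `near` metres, fading to invisible by `far`
    metres. Distances are illustrative assumptions."""
    if nearest_person_m <= near:
        return 1.0
    if nearest_person_m >= far:
        return 0.0
    # linear fade between the two thresholds
    return (far - nearest_person_m) / (far - near)
```

Such a blend could let a flight attendant or neighbouring passenger become gradually visible within the virtual scene, supporting interruption without the wearer removing the headset.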

Of particular note are solutions initially proposed by Williamson et al. for in-flight VR (Williamson et al. 2019). They identified key mechanisms by which the acceptability of in-flight VR usage could be improved, by both facilitating easier transitions between virtual and physical environments by utilizing mixed reality and supporting interruption from co-located “outsiders” such as other passengers or staff. In effect, supporting awareness of the real world and providing mechanisms by which the real world can acceptably encroach upon the VR experience would appear key to providing VR passenger experiences that could see future adoption. Further research will be required to understand how acceptability varies by passenger context and what more the headset can do to make experiences acceptable using the available sensing and peripherals of consumer MR headsets, with significant open research questions regarding:

  • Providing awareness of proximate persons How should MR headset wearers be informed of the actions, attention and proximity of other passengers or staff?

  • Providing external awareness More broadly, how can passengers be kept aware of external events occurring within the vehicle in ways that do not necessarily break presence/immersion, e.g. if announcements are being made on the train or plane?

  • Facilitating interaction between passengers and proximate persons How can other passengers or staff gain the attention of, or communicate with, the headset wearer? How should necessary interruptions be facilitated?

Challenge 5: interaction in constrained spaces

Most affected Transportation with restricted seating in close proximity to others (e.g. economy airline seating).

It can reasonably be expected that MR users in homes and offices will be accustomed to rich support for interacting with virtual content. Currently, it is standard for VR headsets to support either on-headset interactions using buttons or touch-sensitive surfaces, or hand-based interactions using controller peripherals that provide haptic feedback as well as capacitive touch input, and further work is ongoing to incorporate necessary elements of reality into VR experiences, e.g. physical keyboards (McGill et al. 2015; Boland and McGill 2015; VIVE 2018). Conversely, AR headsets such as the HoloLens have demonstrated peripheral-free, touchless interactions using hand tracking technology. However, existing interaction paradigms for VR and AR headsets do not take into account:

  • The physical constraints of a seated MR passenger

  • The capabilities and affordances of a given instrumented, connected, interactive vehicle environment (e.g. a car dashboard or plane cabin with seatback display)

With reference to the constraints, the most obvious is that of the physical seating. Regardless of whether the user sees it as socially acceptable to perform body-based gestures (Rico and Brewster 2010) or use tracked hand-held peripherals or controllers, the physical environment, seat belts and the proximity of those seated nearby would likely dictate that more discreet gestures or interactions be performed. Regarding the existing capabilities and affordances of a given passenger vehicle, the way drivers and passengers use in-vehicle infotainment systems is ever-changing. For example, physical buttons and dials on automotive centre consoles have been substantially reduced with the introduction of touchscreens and touch-sensitive surfaces, already a common sight on long-haul plane journeys. In some cases, touchscreens have replaced all buttons, dials and switches on the centre stack, e.g. the interior of a Tesla Model 3.

Resolving this challenge

Research will be required to explore the suitability of existing MR interactions transposed to constrained, in-motion contexts (Marshall et al. 2016). For example, hand-based gesturing may be impaired by the physical constraints of the environment (e.g. available space, restraint, motion), social acceptability and limitations regarding headset-based sensing. More unobtrusive, eyes-free interactions could bridge this gap, for example the NotifyEye eyes-free rub pad (Lucero and Vetek 2014). Supporting such interactions could require integration of sensing (e.g. Leap Motion (Toppan and Chiesa 2015), Soli (Wang et al. 2016)) and feedback (e.g. Ultrahaptics (Toppan and Chiesa 2015)) into the vehicle. These constraints will also impact how MR (with an emphasis on VR) content is viewed: physical limitations on neck and head movement would necessitate either that content be restricted to a narrower field of view than full 360\(^{\circ }\) experiences, or that some means of scrolling/changing orientation be provided (e.g. rotational gain (Hong and Kim 2016)), to prevent users from becoming frustrated at being unable to fully attend to the entirety of the virtual 360\(^{\circ }\) space.
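Rotational gain, mentioned above as one means of coping with restricted head movement, can be sketched as amplifying physical yaw onto virtual yaw. The following is an illustrative fragment; the function name and gain value are ours:

```python
def apply_rotational_gain(physical_yaw_deg, gain=2.0):
    """Map a constrained physical head rotation onto a wider virtual one
    (cf. rotational gain, Hong and Kim 2016). With gain=2, a +/-90 degree
    physical range covers the full 360 degree virtual scene."""
    virtual = physical_yaw_deg * gain
    return (virtual + 180.0) % 360.0 - 180.0  # normalize to [-180, 180)
```

A passenger able to turn only 45\(^{\circ }\) to the side would thus see the virtual view rotate 90\(^{\circ }\), at the cost of a visual-vestibular discrepancy whose sickness implications would themselves warrant study.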

Where available, touchscreens and touch-sensitive surfaces on centre consoles could provide an additional, richer input modality for users during MR interactions, and give designers the opportunity to develop new input techniques for passengers that are perhaps suitable for MR experiences in the future. However, this will require further advancement regarding how we incorporate necessary elements of reality into MR (and particularly VR) experiences (McGill et al. 2015), and necessitates the interactive environment of the vehicle be tailored or made accessible to the MR headset. Related work has investigated how well drivers and passengers point (Ahmad et al. 2015) and perform common gestures such as swiping (Burnett et al. 2013) on touchscreens. The use of pressure input is becoming more popular with touchscreen smartphones, so researchers have also begun to explore in-car touchscreens and centre console surfaces with force-sensing capabilities to look for alternative input modalities that could be more effective and safer to use in vehicles (Ng and Brewster 2016; Ng et al. 2017). New technologies such as printing sensors and actuators on surfaces such as the dashboard and centre console could provide users with multiple touch surfaces with haptic feedback to interact with AR and VR applications (Frisson et al. 2017).

We suggest there are open research questions regarding firstly how to design new MR peripherals suited to use in constrained spaces, from new positionally tracked peripherals that can function in-motion, to appropriating existing peripherals such as smart watches/phones/rings. And secondly, how to bring tangibility to virtual displays and UI interactions. For example, direct hand-based mid-air interactions with virtual displays or UI elements could be made tangible by appropriating existing surfaces (including existing interactive elements such as touchscreens) and appropriating existing and new feedback modalities (e.g. mid-air ultrasonic haptics has been repeatedly suggested for non-MR passenger use). This latter point in particular is one with much wider applicability to VR/AR usage across a variety of contexts. However, the passenger MR use case may suggest that appropriating existing surfaces is of particular worth to explore, given the existing physical surfaces (e.g. seatbacks, tray tables, doors, arm rests, etc.) typically within reach.

Challenge 6: supporting shared experiences

Most affected Transportation that features a degree of privacy to allow for speech (e.g. autonomous cars)

Social VR has been suggested as a significant future driver of MR headset adoption, underlined by Facebook’s $3 billion purchase of Oculus in 2014 [30]. MR headsets have the capability to change how we communicate at-a-distance. Prior to such headsets, communication was limited to voice and video. However, MR headsets can render virtual content with depth and at real-life scale, and thus support embodied telepresence, where those the user is communicating with at-a-distance are seen to share the same virtual (e.g. McGill et al. 2016; VRChat 2018) or physical (e.g. Orts-Escolano et al. 2016; Fanello et al. 2016) space. This application could be suited to transportation where social acceptability is less of a concern, for example use in private autonomous cars.

Resolving this challenge

Firstly, there is the question of how the passenger should be captured and portrayed at-a-distance. Currently, shared at-a-distance VR applications typically display an avatar conveying head movements and voice [30]. However, advances in depth camera technology allow for body tracking and embodied telepresence (e.g. McGill et al. 2016; Orts-Escolano et al. 2016; Fanello et al. 2016, see Fig. 5), where the remote user is captured and rendered in 3D. Capturing passengers in-car in this way would require the integration of depth camera technology, so it is likely that more cost-effective approaches will need to be investigated, e.g. portraying avatars based only on the sensing available on the headset (Hoffman 2016). Secondly, it is also unclear how best to render others at-a-distance in such environments, especially for AR headsets, which must in some way incorporate reality into the presentation. The telepresence literature has often concentrated on room-scale spaces, where a mapping between two disparate physical places could be constructed, essentially allowing direct interaction with virtual avatars (Pots 2016; Orts-Escolano et al. 2016; McGill et al. 2016; Fanello et al. 2016). However, when using an AR headset in the constrained environment of a car or plane, such interactions are no longer feasible, as the existing physical space needs to be taken into consideration. In such cases, should those at-a-distance be incorporated into the car or plane as virtual passengers occupying empty seats, rendered world-in-miniature for viewing by all passengers, or otherwise? It may even be that existing video communications enacted over AR (Kun et al. 2017) are the most appropriate for such constrained spaces.
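The seat-placement option could be prototyped trivially: telepresent participants are assigned to empty cabin seats, with any overflow falling back to an alternative rendering such as world-in-miniature. The following is a hypothetical sketch (all names are ours):

```python
def assign_remote_passengers(free_seats, remote_users):
    """Place telepresent participants into empty seats; users beyond the
    available seats fall back to a world-in-miniature rendering."""
    placements = dict(zip(remote_users, free_seats))  # pair users with seats
    overflow = remote_users[len(free_seats):]         # unseated participants
    return placements, overflow
```

Even such a trivial policy raises the open questions above: which seats are plausible anchors, and how overflow participants should be portrayed acceptably in a shared cabin.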

Fig. 5
figure5

Examples of embodied virtual social experiences. Left: VRChat, a VR social platform with voice chat and customized avatars for lower fidelity, but more broadly accessible, embodied telepresence (VRChat 2018). Middle: a VR-based mixed reality experience where users could see each other captured in real-time and view a synchronized \(360^{\circ }\) experience (McGill et al. 2016); Right: an AR-based mixed reality experience, “Holoportation”, where users could see each other captured in real time (Fanello et al. 2016). Each brings with it different logistical challenges in terms of capture of users and presentation whilst in-transit

We suggest there are open research questions regarding:

  • How to capture and convey passengers? Different forms of transportation will likely be suited towards different sensing technologies for capture.

  • How to render at-a-distance participants? This is a particular challenge for AR headsets, as any telepresent portrayal would be constrained by the physical environment of the passenger.

Fundamentally, mobile, in-transit MR should strive to maintain parity with stationary MR, given the predicted importance of social interaction at-a-distance for MR headsets. Doing so will require an understanding of how best to capture and convey telepresent users, such that as many of the benefits of face-to-face interaction as possible are retained regardless of the mode of transport.

Challenge 7: exploiting vehicle motion and context for presence and immersive experiences

Most affected All forms of transportation. The motion, location and context of the vehicle itself can be instrumented and utilized to create more novel, engaging, immersive experiences. This point notably extends the discussion on motion sickness: where previously motion was integrated to alleviate sensory mismatch for generic MR content, here the intention is to alleviate the sensory mismatch whilst more deeply integrating the experience of motion into the virtual experience, taking advantage of sensory alignment (Marshall et al. 2019). The outcome could, for example, be turning an ordinary car journey into an exciting 100 km/h space battle, providing a more affective experience by using the vehicle as a motion platform.

Kodama et al. (2017) categorized the use of cars as motion platforms for VR content as being either (a) an active virtual drive system, meaning that the VR user controlled the car, with the experience in VR reflecting that of reality (effectively substitutional reality (Simeone et al. 2015)); (b) a passive virtual drive system, meaning that the motion of the car was integrated into the VR experience, with content limited by the driving route, but the VR user exhibited no control over this (e.g. in autonomous cars/the passenger use case); or (c) a content player system, whereby the motion of the car was synchronized with the VR environment.

They examined a content player system, where users had limited control over the car (controlling acceleration over a 7 m track) and the content was in synchrony with motion along one axis (forward acceleration). The VR content was effectively a rollercoaster, and it was noted that, due to the congruent visual and physical cues regarding forward motion, the additional visual cues of the rollercoaster going downhill induced a significant sensation of falling, despite no physical vertical motion being experienced. Goedicke et al. extended this concept of VR users controlling their vehicle, adopting a “fused reality” approach in which passenger driving actions could be simulated by a real driver in a Wizard-of-Oz prototype (Goedicke et al. 2018), as a means of creating VR driving simulations that retained the “immediacy and rich sensations of on-road driving”. These papers help to illustrate the breadth of ways in which the relationship between the MR-wearing passenger and the vehicle can be exploited.
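In the passive and content player cases, the core coupling is integrating the vehicle's measured acceleration into the virtual ride, optionally combined with purely virtual acceleration (such as the rollercoaster's downhill drop) that has no physical counterpart. The following is an illustrative semi-implicit Euler sketch; the names and parameters are our assumptions:

```python
def integrate_ride(pos, vel, measured_accel, virtual_accel, dt, gain=1.0):
    """Advance the virtual vehicle: measured cabin acceleration (m/s^2) is
    scaled by `gain` and summed with virtual-only acceleration, e.g. a
    downhill drop that is seen but never physically felt."""
    # semi-implicit Euler: update velocity first, then position
    new_vel = tuple(v + (gain * m + a) * dt
                    for v, m, a in zip(vel, measured_accel, virtual_accel))
    new_pos = tuple(p + nv * dt for p, nv in zip(pos, new_vel))
    return new_pos, new_vel
```

Setting `gain` above or below 1 would correspond to exaggerating or damping the felt motion in the virtual experience, one of the open design parameters noted earlier.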

Resolving this challenge

In reviewing vehicular MR papers, we identified eight factors that could be described as using the vehicle to positively affect immersion:

  • Control over motion Ranging from active (full MR user control) to passive (no control, MR user is passenger). As noted by Kodama et al. (2017), this is perhaps the most problematic factor, as there are clear safety concerns regarding how much control the user is given over the vehicle. However, in a fully autonomous car, giving the virtual experience or user some measure of control over acceleration (e.g. varying between 80 km/h and 120 km/h on a motorway) could be feasible in certain circumstances (e.g. a clear road), as demonstrated by Goedicke et al. (2018).

  • Foreknowledge/synchrony Knowledge both of the route and of impending vehicle actions, building on the previously discussed benefits of anticipatory movements and actions (Sawabe et al. 2017) towards potentially increased presence, e.g. seeing and feeling your virtual experience follow the same path (Goedicke et al. 2018). This was used notably by Paredes et al. (2018) to create a calming, mindful VR experience in-car.

  • Context Knowledge/sensing of the specific real-world location/context, e.g. rendering overlays on other cars in AR experiences (Dent 2017), integrating elements of the real world, such as landmarks into the passenger experience (Baldwin et al. 2017), or reacting to detected events (Rober et al. 2018) such as a red light being conveyed as a temporary virtual wall.

  • Motion profile Different vehicles have very different motion profiles: compare a cross-country train, with long periods of relatively constant velocity punctuated by long, steady changes in acceleration, against cruise liners or yachts, where accelerations are modest but oscillations and orientation changes are far more prevalent (and at very different frequencies).

  • Conveyance of motion Movements could be conveyed in terms of changes in acceleration or absolute velocity. Each has implications for the kind of MR experience being presented, e.g. on-rails virtual journeys versus seemingly stationary experiences with additional visual cues of motion (e.g. particles moving around the user, or displays moving back and forth based on accelerations (Hanau and Popescu 2017)).

  • Magnitude of motion The transfer function between real-world motion and MR-rendered motion can be varied significantly in roomscale VR without impacting simulator sickness (Wilson et al. 2018a). Translational and rotational gain could be manipulated to enable more exciting virtual motion experiences (e.g. conveying 30 km/h in reality as 100 km/h in VR, or a modest acceleration as a significant one) or calming ones (e.g. conveying accelerations as the gentle movements of a rocking chair).

  • Environmental control Consider concept cars where temperature, air flow and even odour (Dent 2017) can be controlled by the virtual experience to increase the user’s sense of presence in a virtual experience.

  • Anchors/rest frames Rest frames have been noted to be helpful in preventing motion sickness onset (Hock et al. 2017; LaViola 2000), and such visual anchors could be exploited to convey very different environments, e.g. the cockpit of a spaceship, exploiting substitutional reality to render virtual elements that are physically congruent with the vehicle interior (Simeone et al. 2015).
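As a concrete illustration of the magnitude-of-motion factor above, a transfer function between sensed and rendered motion might apply a gain and clamp the result to a comfortable band. This is a minimal sketch; the function name, gain values and clamp limit are illustrative assumptions, not empirically validated thresholds:

```python
def render_motion(real_accel: float, gain: float,
                  max_virtual_accel: float) -> float:
    """Map a sensed vehicle acceleration (m/s^2) to a rendered one.

    gain > 1 exaggerates motion (conveying a modest acceleration as a
    dramatic one); gain < 1 attenuates it (a calming, rocking-chair-like
    experience). The symmetric clamp bounds how far rendered motion may
    depart from what the vestibular system actually feels.
    """
    virtual = real_accel * gain
    return max(-max_virtual_accel, min(max_virtual_accel, virtual))


# Exaggeration: a gentle 0.5 m/s^2 acceleration rendered as 1.5 m/s^2.
exciting = render_motion(0.5, gain=3.0, max_virtual_accel=5.0)

# Attenuation: a 2.0 m/s^2 braking event softened to roughly 0.6 m/s^2.
calming = render_motion(-2.0, gain=0.3, max_virtual_accel=5.0)
```

How large a mismatch between felt and rendered motion users tolerate before sickness onset is precisely the open design question this factor raises.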

There have been few other concrete examples of exploiting motion for immersion thus far. With respect to VR headsets, Soyka et al. (2015) simulated a flight experience on a virtual magic carpet ride, with the intention that airline passengers would experience their journey across a virtual landscape with unrestricted views, whilst Hock et al. (2017) presented the movements of a car in the virtual cockpit of a helicopter flying over a pre-generated virtual landscape based on a predetermined route for the car (Fig. 6). In effect, the car journey was gamified, turned into a first-person virtual helicopter shooter, with passengers flying around a new and different landscape during their journey. Hock et al. found that the kinaesthetic forces perceived by users increased enjoyment and immersion whilst reducing simulator sickness. Similarly, a number of consumer rollercoaster rides exist where the virtual experience is tightly linked to the physical motion perceived, with varying degrees of success (Ion 2016). This congruence of visual and physical perception of motion delays or prevents the onset of motion sickness. Moreover, Hock et al. noted that the virtual portrayal of car motion resulted in some participants completely losing awareness of where they were, as well as distorting their awareness of the passage of time. This suggests that occluding reality can in part aid the passing of time in transit, a notable potential benefit for long-haul flights, for example. Commercially, Apple have submitted a patent regarding congruent in-car experiences (Rober et al. 2018), whilst Renault and Ubisoft have demonstrated a VR concept car (“Symbioz”) which exploited foreknowledge and context to render motion:

A minute ago I was on a real road, but now I’m rolling down a fake forested highway in a simulation created by Ubisoft. Meanwhile, Renault’s Level 4 autonomous system has taken the piloting chores...It’s a bizarre experience, but I don’t feel sick, because the Symbioz is transmitting real road motion to the headset...I even see simulated versions of the cars and trucks on the road fed in by LiDAR and other sensors. (Dent 2017)

With respect to AR, whilst there are a number of discussions regarding AR use in-car for aiding driving and navigation, there are currently few examples of AR headset usage by passengers. However, discussions of AR windshields and windows (Rao et al. 2014a; Häkkilä et al. 2014; Haeuslschmid et al. 2016; Rao et al. 2014b) hint at potential applications, for example augmented annotations of locations and landmarks (Baldwin et al. 2017; Large et al. 2017) and identifying points of interest for tourism.

Fig. 6

From left to right: (1) COMS-VR system, where VR users had control over vehicle acceleration (Kodama et al. 2017); (2) VR-OOM driving simulator in-car (Goedicke et al. 2018); (3) turbulence from a simulated flight experienced on a virtual magic carpet ride, from the VR HYPERSPACE project (Soyka et al. 2015); (4) a virtual helicopter game where movements in-game match the forces experienced during a real-world car journey, from the CarVR project (Hock et al. 2017)

Broadly, what these prototypes demonstrate is that the motion and location of a vehicle can be integrated into MR experiences in a variety of potentially engaging and affective ways. Virtual gamification of vehicle motion and using AR as a virtual tour guide are two examples with the potential to fuel adoption of MR headset usage in vehicles whilst retaining a direct link between what is visually and physically perceived; given motion sickness, this makes them the most immediately accessible use cases of MR headsets in transit. Moreover, such experiences are unique to transit because they require some congruent element of reality to be incorporated, providing additional value to MR headset use beyond standard entertainment and productivity. However, a breadth of further research will be required to understand the design parameters for each of the factors identified.

Discussion

MR headsets have the potential to significantly improve the passenger experience. However, there are a number of impediments to the adoption and usage of MR headsets when in transit. We have discussed three functional impediments, identified through a combination of our initial research into this domain and a review of the emerging literature; these are challenges that would prevent or limit normal usage of MR headsets in transit:

  1. Maintaining a forward bearing in a moving world

  2. Crash safety

  3. Motion sickness resulting from incongruent or conflicting visual and physical motion cues

These challenges need to be addressed to provide a foundation upon which VR and AR experiences can be built that function correctly, have access to the necessary vehicle telemetry and do not endanger the passenger. We suggest that these challenges will require cooperation between academia and industrial partners across the domains of transportation, augmented reality and virtual reality.

We have also discussed four significant impediments to the adoption and acceptance of MR headsets. By this, we mean “softer” problems that need to be understood to provide compelling experiences that would justify the adoption of MR headsets in transit:

  4. Enabling socially acceptable use in shared transit

  5. Facilitating interaction in constrained spaces

  6. Supporting social at-a-distance mixed reality experiences

  7. Exploiting vehicle motion/context for more engaging and affective passenger experiences

These challenges are not critical to the basic function of headsets in-motion. However, they should be investigated to arrive at MR headsets that can feasibly support adoption in a wide range of use cases and provide parity with non-mobile usage. Indeed, parity is perhaps the most important point: adoption of MR headsets in the home and office will lead to user expectations that such usage will be able to continue unabated in transit. Accordingly, users in transit should be provided with capabilities as close as possible to those available elsewhere, such that they might perform the same actions, view the same content and communicate in the same way at-a-distance.

However, it is important to note that the need to address many of these challenges is predicated on MR headsets falling into favour with consumers. VR headset adoption has recently stumbled, with suggestions that cost, limited fidelity and a lack of compelling experiences and use cases have led to some consumer apathy, placing VR in the “disappointment phase” of the consumer hype cycle (Skarredghost 2018). Microsoft, for example, suggested that immersive VR headsets “did not meet, in general, the high expectations that were set for them” (Feltham 2019). AR headset adoption is not yet even a feasible possibility, with no compelling consumer-level headsets available at the time of writing, although this will inevitably change. Over a long enough timeline, it would seem reasonable to suggest that passenger adoption of VR headsets would at least be likely in long-duration journeys in comparatively uncomfortable environments, e.g. economy seating in long-haul flights. The likelihood of AR headset adoption would appear stronger over the coming years, particularly if AR headsets reach a point where they become standard, everyday consumer devices. For both AR and VR, compelling use cases such as productivity and entertainment will help to drive headset adoption only once headsets reach an inflection point across cost, fidelity, sensing capability, interaction, social acceptability and fashion, amongst others. If such a point is reached, it could be expected that demand for passenger usage would soon follow.

As recognized by car manufacturers such as Renault (Dent 2017), technology companies such as Apple (Rober et al. 2018) and commercial airlines (Air France 2017; Qantas 2015), the reward for facilitating such usage could be significant. MR headsets have the potential to provide new and varied ways by which travellers can make use of their time in transit, and could provide an additional motivator towards the adoption of transportation such as autonomous cars. Moreover, their adoption could provide a new market of MR headset users and bring with it significant economic benefits for society. However, arriving at this point will require a multidisciplinary research effort involving members of the HCI community, vehicle and headset manufacturers, and others in fields such as systems engineering, sensing and health. This paper serves to outline the challenges involved in facilitating MR headset usage in transit, with the aim of realizing their potential to transform the passenger experience.

Conclusions

This paper has discussed key challenges in supporting passenger use of augmented and virtual reality headsets in transit. MR headsets have a number of advantages over current seatback and mobile displays, e.g. in terms of privacy, field of view and immersion. However, passengers are not yet in a position to fully realize the benefits of using headsets in transit, as there are significant barriers to their usage. These barriers range from impediments that would entirely prevent their usage (e.g. unknown crashworthiness, motion sickness) to those that might impair their adoption (e.g. social acceptability and a lack of parity between everyday usage and the capabilities of headsets in transit). We identified and discussed seven key challenges preventing safe, fully functional usage of MR headsets in transit, with the aim of facilitating more research into this nascent application area, assisting passengers to make better use of their time by making travel more productive and enjoyable.

Notes

  1. www.microsoft.com/en-gb/hololens.

  2. www.oculus.com/quest/.

  3. https://www.vive.com.

  4. https://enterprise.vive.com/uk/product/vive-focus/.

  5. For an example of this, see youtube.com/watch?v=eBs8biTWuEs

  6. www.tesla.com/model3.

References

  1. Ahmad BI, Langdon PM, Godsill SJ, Hardy R, Skrypchuk L, Donkor R (2015) Touchscreen usability and input performance in vehicles under different road conditions: an evaluative study. In: Proceedings of the 7th international conference on automotive user interfaces and interactive vehicular applications, AutomotiveUI ’15, pp 47–54. ACM, New York, NY, USA. https://doi.org/10.1145/2799250.2799284

  2. Ahmadpour N, Kühne M, Robert JM, Vink P (2016) Attitudes towards personal and shared space during the flight. IOS Press, pp 981–987. https://doi.org/10.3233/WOR-162346. http://www.medra.org/servlet/aliasResolver?alias=iospress&doi=10.3233/WOR-162346

  3. Ahmadpour N, Lindgaard G, Robert JM, Pownall B (2014) The thematic structure of passenger comfort experience and its relationship to the context features in the aircraft cabin. Taylor & Francis, pp 801–815. https://doi.org/10.1080/00140139.2014.899632. http://www.tandfonline.com/doi/abs/10.1080/00140139.2014.899632

  4. Air France (2017) Immersive headsets on board Air France flights. http://corporate.airfrance.com/en/news/immersive-headsets-board-air-france-flights

  5. Antonov M (2015) Asynchronous Timewarp examined. https://developer.oculus.com/blog/asynchronous-timewarp-examined/

  6. Arshad Q, Cerchiai N, Goga U, Nigmatullina Y, Roberts RE, Casani AP, Golding JF, Gresty MA, Bronstein AM (2015) Electrocortical therapy for motion sickness. American academy of neurology, pp 1257–1259. https://doi.org/10.1212/WNL.0000000000001989. http://www.ncbi.nlm.nih.gov/pubmed/26341870

  7. Baldwin A, Eriksson J, Olsson CM (2017) Bus runner: using contextual cues for procedural generation of game content on public transport. In: International conference on human–computer interaction. Springer, pp 21–34

  8. Baseel C (2014) Japanese people least likely to talk to strangers or offer help on airplanes, survey finds. https://japantoday.com/category/features/lifestyle/japanese-people-least-likely-to-talk-to-strangers-or-offer-help-on-airplanes-survey-finds

  9. Bertolini G, Straumann D (2016) Moving in a moving world: a review on vestibular motion sickness, p 14. https://doi.org/10.3389/fneur.2016.00014. http://journal.frontiersin.org/article/10.3389/fneur.2016.00014

  10. Bles W, Bos JE, de Graaf B, Groen E, Wertheim AH (1998) Motion sickness: only one provocative conflict? Brain Res Bull 47:481–487. https://doi.org/10.1016/S0361-9230(98)00115-4

  11. Boland D, McGill M (2015) Lost in the rift. ACM, pp 40–45. https://doi.org/10.1145/2810046. http://dl.acm.org/ft_gateway.cfm?id=2810046&type=html

  12. Bose: wearables by Bose—AR audio sunglasses (2019). https://www.bose.co.uk/en_gb/products/frames.html

  13. Buckley S (2015) This is how valve’s amazing lighthouse tracking technology works. http://gizmodo.com/this-is-how-valve-s-amazing-lighthouse-tracking-technol-1705356768

  14. Burnett G, Crundall E, Large D, Lawson G, Skrypchuk L (2013) A study of unidirectional swipe gestures on in-vehicle touch screens. In: Proceedings of the 5th international conference on automotive user interfaces and interactive vehicular applications, AutomotiveUI ’13. ACM, New York, NY, pp 22–29. https://doi.org/10.1145/2516540.2516545

  15. Cappitelli M, Group A, D’cruz M (2014) Final advisory board annual report. http://www.vr-hyperspace.eu

  16. Carter L, Paroz AWL, Potter LE (2018) Observations and opportunities for deploying virtual reality for passenger boats. In: Extended abstracts of the 2018 CHI conference on human factors in computing systems—CHI ’18, pp 1–6. ACM Press, New York, New York, USA. https://doi.org/10.1145/3170427.3188615. http://dl.acm.org/citation.cfm?doid=3170427.3188615

  17. Carter L, Paroz AWL, Potter LE (2018) Observations and opportunities for deploying virtual reality for passenger boats. In: Extended abstracts of the 2018 CHI conference on human factors in computing systems, CHI EA ’18, pp LBW118:1–LBW118:6. ACM, New York, NY, USA. https://doi.org/10.1145/3170427.3188615

  18. Cevette MJ, Stepanek J, Cocco D, Galea AM, Pradhan GN, Wagner LS, Oakley SR, Smith BE, Zapala DA, Brookler KH (2012) Oculo-vestibular recoupling using galvanic vestibular stimulation to mitigate simulator sickness. pp 549–555. https://doi.org/10.3357/ASEM.3239.2012. http://www.ingentaconnect.com/content/asma/asem/2012/00000083/00000006/art00002

  19. Chan LW, Kao HS, Chen MY, Lee MS, Hsu J, Hung YP (2010) Touching the void: direct-touch interaction for intangible displays. In: Proceedings of the SIGCHI conference on human factors in computing systems, CHI ’10, pp 2625–2634. ACM, New York, NY, USA. https://doi.org/10.1145/1753326.1753725

  20. Chittaro L, Corbett CL, McLean G, Zangrando N (2018) Safety knowledge transfer through mobile virtual reality: a study of aviation life preserver donning. Elsevier, pp 159–168. https://doi.org/10.1016/J.SSCI.2017.10.012

  21. Cummings JJ, Bailenson JN, Fidler MJ (2015) How immersive is enough? A meta-analysis of the effect of immersive technology on user presence. Routledge, pp 1–38. https://doi.org/10.1.1.363.6971. http://www.tandfonline.com/doi/abs/10.1080/15213269.2015.1015740

  22. Davis S, Nesbitt K, Nalivaiko E (2014) A systematic review of cybersickness. In: Proceedings of the 2014 conference on interactive entertainment—IE2014. ACM Press, New York, New York, USA, pp 1–9. https://doi.org/10.1145/2677758.2677780. http://dl.acm.org/citation.cfm?id=2677758.2677780

  23. Dent S (2017) Renault’s concept EV drove me at 80MPH while I wore a VR headset. https://www.engadget.com/2017/12/13/renault-symbioz-concept-ev-vr-impressions/

  24. Department for Transport: National Travel Survey (2016). https://www.gov.uk/government/statistics/national-travel-survey-2015

  25. Diels C (2008) Visually induced motion sickness. Ph.D. thesis. https://dspace.lboro.ac.uk/2134/13442

  26. Diels C (2014) Will autonomous vehicles make us sick. In: Contemporary ergonomics and human factors. Boca Raton, FL: CRC Press, pp 301–307

  27. Diels C, Bos JE (2015) User interface considerations to prevent self-driving carsickness. In: Adjunct proceedings of the 7th international conference on automotive user interfaces and interactive vehicular applications, AutomotiveUI ’15. ACM, New York, NY, USA, pp 14–19. https://doi.org/10.1145/2809730.2809754

  28. Diels C, Bos JE (2016) Self-driving carsickness, pp 374–382. https://doi.org/10.1016/j.apergo.2015.09.009. Transport in the 21st century: the application of human factors to future user needs

  29. Diels C, Bos JE, Hottelart K, Reilhac P (2016) Motion sickness in automated vehicles: the elephant in the room. Springer International Publishing, Cham, pp 121–129. https://doi.org/10.1007/978-3-319-40503-2_10

  30. Durbin J The oculus acquisition may cost Facebook $3 billion, not $2.3 billion. https://uploadvr.com/oculus-acquisition-3-billion/

  31. Durrant-Whyte H, Bailey T (2006) Simultaneous localization and mapping: part I:99–110. https://doi.org/10.1109/MRA.2006.1638022. http://ieeexplore.ieee.org/document/1638022/

  32. Elbamby MS, Perfecto C, Bennis M, Doppler K (2018) Toward low-latency and ultra-reliable virtual reality, pp 78–84. https://doi.org/10.1109/MNET.2018.1700268. http://ieeexplore.ieee.org/document/8329628/

  33. Elbanhawi M, Simic M, Jazar R (2015) In the passenger seat: investigating ride comfort measures in autonomous cars, pp 4–17. https://doi.org/10.1109/MITS.2015.2405571

  34. Fanello S, Rhemann SOeC, Dou M, Tankovich V, Loop C, Chou P (2016) Holoportation: virtual 3D teleportation in real-time. In: Proceedings of the 29th annual symposium on user interface software and technology (UIST ’16), pp 741–754. https://doi.org/10.1145/2984511.2984517

  35. Farias Zuniga AM, Côté JN (2017) Effects of dual monitor computer work versus laptop work on cervical muscular and proprioceptive characteristics of males and females. SAGE Publications, pp 546–563. https://doi.org/10.1177/0018720816684690. http://journals.sagepub.com/doi/10.1177/0018720816684690

  36. Feltham J (2015) Palmer Luckey Explains Oculus Rift’s constellation tracking and fabric. https://www.vrfocus.com/2015/06/palmer-luckey-explains-oculus-rifts-constellation-tracking-and-fabric/

  37. Feltham J (2019) Microsoft: VR headsets ’Didn’t Meet High Expectations’. https://uploadvr.com/windows-vr-expectations/

  38. Frangakis N, Karaseitanidis G, D’Cruz M, Patel H, Mohler B, Bues M, Helin K (2014) Research Roadmap. http://www.vr-hyperspace.eu

  39. Frisson C, Julien D, Pietrzak T, Ng A, Poncet P, Casset F, Latour A, Brewster S (2017) Designing vibrotactile widgets with printed actuators and sensors. In: Adjunct proceedings of the 2017 ACM symposium on user interface software and technology (UIST), UIST ’17. ACM

  40. Gabbard JL, Fitch GM, Kim H (2014) Behind the glass: driver challenges and opportunities for ar automotive applications, pp 124–136. IEEE

  41. Gardner B, Abraham C (2007) What drives car use? a grounded theory analysis of commuters’ reasons for driving, pp 187–200. https://doi.org/10.1016/j.trf.2006.09.004

  42. Gekhman D (2006) Mass of a Human Head. http://hypertextbook.com/facts/2006/DmitriyGekhman.shtml

  43. Goedicke D, Li J, Evers V, Ju W (2018) VR-OOM: virtual reality on-road driving simulation. In: Proceedings of the 2018 CHI conference on human factors in computing systems—CHI ’18, pp 1–11. ACM Press, New York, New York, USA. https://doi.org/10.1145/3173574.3173739. http://dl.acm.org/citation.cfm?doid=3173574.3173739

  44. Golding JF, Gresty MA (2015) Pathophysiology and treatment of motion sickness, pp 83–8. https://doi.org/10.1097/WCO.0000000000000163

  45. Groening S (2013) Aerial screens. Routledge, pp 284–303. https://doi.org/10.1080/07341512.2013.858523. http://www.tandfonline.com/doi/abs/10.1080/07341512.2013.858523

  46. Groening S (2016) ‘No One Likes to Be a Captive Audience’: Headphones and In-Flight Cinema. https://muse.jhu.edu/article/640056/summary

  47. Guedry FE, Benson AJ (1978) Coriolis cross-coupling effects: disorienting and nauseogenic or not? pp 29–35. http://www.ncbi.nlm.nih.gov/pubmed/304719

  48. Gulliver: virtual-reality headsets on planes mean we can isolate ourselves from irritating cabin-mates (2017). https://www.economist.com/blogs/gulliver/2017/01/flying-solo-together

  49. Haeuslschmid R, Pfleging B, Alt F (2016) A design space to support the development of windshield applications for the car. In: Proceedings of the 2016 CHI conference on human factors in computing systems, CHI ’16, pp 5076–5091. ACM, New York, NY, USA. https://doi.org/10.1145/2858036.2858336

  50. Häkkilä J, Colley A, Rantakari J (2014) Exploring mixed reality window concept for car passengers. In: Adjunct proceedings of the 6th international conference on automotive user interfaces and interactive vehicular applications, AutomotiveUI ’14, pp 1–4. ACM, New York, NY, USA. https://doi.org/10.1145/2667239.2667288

  51. Hanau E, Popescu V (2017) Motionreader: visual acceleration cues for alleviating passenger e-reader motion sickness. In: Proceedings of the 9th international conference on automotive user interfaces and interactive vehicular applications adjunct, AutomotiveUI ’17, pp 72–76. ACM, New York, NY, USA. https://doi.org/10.1145/3131726.3131741

  52. Häuslschmid R, von Bülow M, Pfleging B, Butz A (2017) Supporting trust in autonomous driving. In: Proceedings of the 22Nd international conference on intelligent user interfaces, IUI ’17, pp 319–329. ACM, New York, NY, USA. https://doi.org/10.1145/3025171.3025198

  53. Häuslschmid R, Osterwald S, Lang M, Butz A (2015) Augmenting the driver’s view with peripheral information on a windshield display. In: Proceedings of the 20th international conference on intelligent user interfaces—IUI ’15, pp 311–321. ACM Press, New York, New York, USA. https://doi.org/10.1145/2678025.2701393. http://dl.acm.org/citation.cfm?doid=2678025.2701393

  54. Heathrow: facts and figures (2017). https://www.heathrow.com/company/company-news-and-information/company-information/facts-and-figures

  55. Hecht T, Feldhütter A, Draeger K, Bengler K (2020) What do you do? An analysis of non-driving related activities during a 60 minutes conditionally automated highway drive. In: Springer, pp 28–34. https://doi.org/10.1007/978-3-030-25629-6_5. https://link.springer.com/chapter/10.1007/978-3-030-25629-6_5

  56. Hock P, Benedikter S, Gugenheimer J, Rukzio E (2017) Carvr: enabling in-car virtual reality entertainment. In: Proceedings of the 2017 CHI conference on human factors in computing systems, CHI ’17, pp 4034–4044. ACM, New York, NY, USA. https://doi.org/10.1145/3025453.3025665

  57. Hoffman M (2016) Remote collaboration with multiple avatars. In: Microsoft build developer conference. https://vimeo.com/160704056

  58. Holly R (2017) Using VR on an airplane is surprisingly enjoyable with the right apps! https://www.vrheads.com/using-vr-airplane-surprisingly-enjoyable-right-apps

  59. Hong S, Kim GJ (2016) Accelerated viewpoint panning with rotational gain in 360 degree videos. In: Proceedings of the 22nd ACM conference on virtual reality software and technology—VRST ’16, pp 303–304. ACM Press, New York, New York, USA. https://doi.org/10.1145/2993369.2996309. http://dl.acm.org/citation.cfm?doid=2993369.2996309

  60. Hosseini M, Farahani S (2015) Vestibular findings in motion sickness. http://avr.tums.ac.ir/index.php/avr/article/view/10

  61. SAE International (2016) Taxonomy and definitions for terms related to driving automation systems for on-road motor vehicles

  62. Ion F (2016) Too sick to stand: What it’s like to ride the first VR video game roller coaster. https://www.vrheads.com/too-sick-stand-ride-first-ever-vr-video-game-roller-coaster

  63. Karjanto J, Md. Yusof N, Wang C, Terken J, Delbressine F, Rauterberg M (2018) The effect of peripheral visual feedforward system in enhancing situation awareness and mitigating motion sickness in fully automated driving, pp 678–692. Pergamon. https://doi.org/10.1016/J.TRF.2018.06.046

  64. Kodama R, Koge M, Taguchi S, Kajimoto H (2017) COMS-VR: mobile virtual reality entertainment system using electric car and head-mounted display. In: 2017 IEEE symposium on 3D user interfaces (3DUI), pp 130–133. IEEE. https://doi.org/10.1109/3DUI.2017.7893329. http://ieeexplore.ieee.org/document/7893329/

  65. Koisaari T, Leivo T, Sahraravand A, Haavisto AK, Sulander P, Tervo TMT (2017) Airbag deployment—related eye injuries. pp 1–7. Taylor & Francis. https://doi.org/10.1080/15389588.2016.1271945. https://www.tandfonline.com/doi/full/10.1080/15389588.2016.1271945

  66. Kuchera B (2015) I’m the creepy guy wearing a VR headset on your plane (and it’s great). https://www.polygon.com/2015/3/27/8302453/im-the-creepy-guy-wearing-a-vr-headset-on-your-plane-and-its-great

  67. Kuiper OX, Bos JE, Diels C (2018) Looking forward: in-vehicle auxiliary display positioning affects carsickness. Elsevier, pp 169–175. https://doi.org/10.1016/J.APERGO.2017.11.002

  68. Kun AL, Boll S, Schmidt A (2016) Shifting gears: user interfaces in the age of autonomous driving, pp 32–38. IEEE

  69. Kun AL, van der Meulen H, Janssen CP (2017) Calling while driving: an initial experiment with hololens. In: Proceedings of the 9th international driving symposium on human factors in driver assessment, training and vehicle design

  70. Large DR, Burnett G, Bolton A (2017) Augmenting landmarks during the head-up provision of in-vehicle navigation advice, pp 18–38. IGI Global. https://doi.org/10.4018/IJMHCI.2017040102. http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/IJMHCI.2017040102

  71. LaValle SM, Yershova A, Katsev M, Antonov M (2014) Head tracking for the oculus rift. In: 2014 IEEE international conference on robotics and automation (ICRA), pp 187–194. https://doi.org/10.1109/ICRA.2014.6906608

  72. LaViola Jr, JJ (2000) A discussion of cybersickness in virtual environments. ACM, New York, NY, pp 47–56. https://doi.org/10.1145/333329.333344

  73. Lewis L, Patel H, Cobb S, D’Cruz M, Bues M, Stefani O, Grobler T (2016) Distracting people from sources of discomfort in a simulated aircraft environment, pp 963–979. https://doi.org/10.3233/WOR-162356. http://eprints.nottingham.ac.uk/36254/1/VEsto distract people from sources of discomfort_v2_13_07_15.pdf

  74. Lewis L, Patel H, D’Cruz M, Cobb S (2017) What makes a space invader? Passenger perceptions of personal space invasion in aircraft travel. Taylor & Francis, pp 1–10. https://doi.org/10.1080/00140139.2017.1313456. https://www.tandfonline.com/doi/full/10.1080/00140139.2017.1313456

  75. Lucero A, Vetek A (2014) Notifeye: using interactive glasses to deal with notifications while walking in public. In: Proceedings of the 11th conference on advances in computer entertainment technology, ACE ’14, pp 17:1–17:10. ACM, New York, NY, USA. https://doi.org/10.1145/2663806.2663824

  76. Mangiante S, Klas G, Navon A, GuanHua Z, Ran J, Silva MD (2017) VR is on the Edge: how to deliver 360 videos in mobile networks, pp 30–35. https://doi.org/10.1145/3097895.3097901. https://dl.acm.org/citation.cfm?id=3097901

  77. Marshall J, Benford S, Byrne R, Tennent P (2019) Sensory alignment in immersive entertainment. In: Proceedings of the 2019 CHI conference on human factors in computing systems—CHI ’19, pp 1–13. ACM Press, New York, New York, USA. https://doi.org/10.1145/3290605.3300930. http://dl.acm.org/citation.cfm?doid=3290605.3300930

  78. Marshall J, Dancu A, Mueller FF (2016) Interaction in motion: designing truly mobile interaction. In: Proceedings of the 2016 ACM conference on designing interactive systems, DIS ’16. ACM, New York, NY, USA, pp 215–228. https://doi.org/10.1145/2901790.2901844

  79. Matt Kamen: Ford patents windshield movie screen for driverless cars (2016). http://www.wired.co.uk/article/ford-patents-movie-window-for-driverless-cars

  80. McGill M, Boland D, Murray-Smith R, Brewster S (2015) A dose of reality: overcoming usability challenges in VR head-mounted displays. In: Proceedings of the 33rd annual ACM conference on human factors in computing systems—CHI ’15, pp 2143–2152. ACM Press, New York, New York, USA. https://doi.org/10.1145/2702123.2702382. http://dl.acm.org/citation.cfm?id=2702123.2702382

  81. McGill M, Ng A, Brewster S (2017) I am the passenger: how visual motion cues can influence sickness for in-car VR. In: Proceedings of the 2017 CHI conference on human factors in computing systems, CHI ’17. ACM, New York, NY, USA, pp 5655–5668. https://doi.org/10.1145/3025453.3026046

  82. McGill M, Williamson JH, Brewster S (2016) Examining the role of smart TVs and VR HMDs in synchronous at-a-distance media consumption, pp 1–57. ACM. https://doi.org/10.1145/2983530. http://dl.acm.org/citation.cfm?doid=3007191.2983530

  83. Mercedes-Benz: F015 Autonomous Concept Car (2016). https://www.mercedes-benz.com/en/mercedes-benz/innovation/research-vehicle-f-015-luxury-in-motion/

  84. Ng A, Brewster S, Beruscha F, Krautter W (2017) An evaluation of input controls for in-car interactions. In: Proceedings of the 2017 CHI conference on human factors in computing systems, CHI ’17. ACM. http://eprints.gla.ac.uk/134441/

  85. Ng A, Brewster SA (2016) Investigating pressure input and haptic feedback for in-car touchscreens and touch surfaces. In: Proceedings of the 8th international conference on automotive user interfaces and interactive vehicular applications—Automotive’UI 16, pp 121–128. ACM Press, New York, New York, USA. https://doi.org/10.1145/3003715.3005420. http://dl.acm.org/citation.cfm?doid=3003715.3005420

  86. Ng A, Brewster SA, Beruscha F, Krautter W (2017) An evaluation of input controls for in-car interactions. In: Proceedings of the 2017 CHI conference on human factors in computing systems, CHI ’17. ACM, New York, NY, USA, pp 2845–2852. https://doi.org/10.1145/3025453.3025736

  87. Office of Rail and Road (2018) Passenger rail usage 2017–18 Q4 statistical release. Tech. rep., Office of Rail and Road. http://orr.gov.uk/__data/assets/pdf_file/0014/28013/passenger-rail-usage-2017-18-q4.pdf

  88. Orlosky J, Kiyokawa K, Takemura H (2017) Virtual and augmented reality on the 5G highway, pp 133–141. https://doi.org/10.2197/ipsjjip.25.133. https://www.jstage.jst.go.jp/article/ipsjjip/25/0/25_133/_article

  89. Orts-Escolano S, Rhemann C, Fanello S, Chang W, Kowdle A, Degtyarev Y, Kim D, Davidson PL, Khamis S, Dou M, Tankovich V, Loop C, Cai Q, Chou PA, Mennicken S, Valentin J, Pradeep V, Wang S, Kang SB, Kohli P, Lutchyn Y, Keskin C, Izadi S (2016) Holoportation: virtual 3d teleportation in real-time. In: Proceedings of the 29th annual symposium on user interface software and technology, UIST ’16, pp 741–754. ACM, New York, NY, USA. https://doi.org/10.1145/2984511.2984517

  90. Owen N, Leadbetter AG, Yardley L (1998) Relationship between postural control and motion sickness in healthy subjects, pp 471–474. https://doi.org/10.1016/S0361-9230(98)00101-4

  91. Paredes PE, Balters S, Qian K, Murnane EL, Ordóñez F, Ju W, Landay JA (2018) Driving with the fishes: towards calming and mindful virtual reality experiences for the car, pp 1–21. ACM. https://doi.org/10.1145/3287062. http://dl.acm.org/citation.cfm?doid=3301777.3287062

  92. Patel H, D’Cruz M (2017) Passenger-centric factors influencing the experience of aircraft comfort. Routledge, pp 1–18. https://doi.org/10.1080/01441647.2017.1307877. https://www.tandfonline.com/doi/full/10.1080/01441647.2017.1307877

  93. Pauzie A (2015) Head up display in automotive: a new reality for the driver, pp 505–516. Springer International Publishing, Cham. https://doi.org/10.1007/978-3-319-20889-3_47

  94. Pots J (2016) Collaborating with holograms: could ‘mixed reality’ be the future of telecommuting? https://www.digitaltrends.com/virtual-reality/hololens-mixed-reality-work-tool-object-theory/

  95. Press Association (2015) Millions of people spend two or more hours commuting a day. https://www.theguardian.com/money/2015/nov/09/million-people-two-hours-commuting-tuc-study

  96. Prince Ayiez (2014) In-flight entertainment: does the airlines’ selection of IFE impact passengers’ preference or mere investment? https://www.slideshare.net/eyielurvedye/airlines-inflight-entertainment

  97. Qantas (2015) Qantas & Samsung unveil industry-first virtual reality experience for travellers. https://www.qantasnewsroom.com.au/media-releases/qantas-samsung-unveil-industry-first-virtual-reality-experience-for-travellers/

  98. Rao Q, Grünler C, Hammori M, Chakraborty S (2014) Design methods for augmented reality in-vehicle infotainment systems. In: Proceedings of the 51st annual design automation conference, DAC ’14, pp. 72:1–72:6. ACM, New York, NY, USA. https://doi.org/10.1145/2593069.2602973

  99. Rao Q, Tropper T, Grünler C, Hammori M, Chakraborty S (2014) AR-IVI: implementation of in-vehicle augmented reality. In: 2014 IEEE international symposium on mixed and augmented reality (ISMAR), pp 3–8. https://doi.org/10.1109/ISMAR.2014.6948402

  100. Reason JT, Brand JJ (1975) Motion sickness. Academic Press, London

  101. Redlick FP, Jenkin M, Harris LR (2001) Humans can use optic flow to estimate distance of travel, pp 213–219. Pergamon. https://doi.org/10.1016/S0042-6989(00)00243-1

  102. Riccio G, Stoffregen T (1991) An ecological theory of motion sickness and postural instability. Routledge, pp 195–240. https://doi.org/10.1207/s15326969eco0303_2

  103. Rico J, Brewster S (2010) Usable gestures for mobile interfaces: evaluating social acceptability. In: Proceedings of the SIGCHI conference on human factors in computing systems, CHI ’10. ACM, New York, NY, USA, pp 887–896. https://doi.org/10.1145/1753326.1753458

  104. Rober M et al (2018) Immersive virtual display. US Patent Application 2018/0089901. http://pdfaiw.uspto.gov/.aiw?PageNum=0&docid=20180089901

  105. Rolnick A, Lubow RE (1991) Why is the driver rarely motion sick? The role of controllability in motion sickness. Taylor & Francis Group, pp 867–879. https://doi.org/10.1080/00140139108964831. http://www.tandfonline.com/doi/abs/10.1080/00140139108964831

  106. Russell M, Price R, Signal L, Stanley J, Gerring Z, Cumming J (2011) What do passengers do during travel time? Structured observations on buses and trains, pp 123–146. https://doi.org/10.5038/2375-0901.14.3.7. https://scholarcommons.usf.edu/jpt/vol14/iss3/7/

  107. Sawabe T, Kanbara M, Hagita N (2017) Diminished reality for acceleration stimulus: Motion sickness reduction with vection for autonomous driving. In: 2017 IEEE virtual reality (VR). IEEE, pp 277–278. https://doi.org/10.1109/VR.2017.7892284. http://ieeexplore.ieee.org/document/7892284/

  108. Shakeri G, Ng A, Williamson JH, Brewster SA (2016) Evaluation of haptic patterns on a steering wheel. In: Proceedings of the 8th international conference on automotive user interfaces and interactive vehicular applications, Automotive’UI 16. ACM, New York, NY, USA, pp 129–136. https://doi.org/10.1145/3003715.3005417

  109. Simeone AL, Velloso E, Gellersen H (2015) Substitutional reality: Using the physical environment to design virtual reality experiences. In: Proceedings of the 33rd annual ACM conference on human factors in computing systems, CHI ’15. ACM, New York, NY, USA, pp 3307–3316. https://doi.org/10.1145/2702123.2702389

  110. Sivak M, Schoettle B (2015) Motion sickness in self-driving vehicles. http://deepblue.lib.umich.edu/handle/2027.42/111747

  111. Skarredghost: all you need to know about SteamVR tracking 2.0 (2017). https://skarredghost.com/2017/06/07/need-know-steamvr-tracking-2-0-will-foundation-vive-2/

  112. Skarredghost: Virtual Reality is reaching a mature state according to gartner—the ghost howls (2018). https://skarredghost.com/2018/08/27/virtual-reality-is-reaching-a-mature-state-according-to-gartner/

  113. Slater M (2009) Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments, pp. 3549–3557. The Royal Society. https://doi.org/10.1098/rstb.2009.0138. http://rstb.royalsocietypublishing.org/content/364/1535/3549

  114. Smith M (2016) Londoners are the most embarrassed by talking to strangers. https://yougov.co.uk/topics/politics/articles-reports/2016/10/03/londoners-are-least-pleased-prospect-talking-stran

  115. Soyka F, Kokkinara E, Leyrer M, Buelthoff H, Slater M, Mohler B (2015) Turbulent motions cannot shake VR. In: 2015 IEEE virtual reality (VR), pp 33–40. https://doi.org/10.1109/VR.2015.7223321

  116. Stevens AH, Butkiewicz T (2019) Reducing seasickness in onboard marine VR use through visual compensation of vessel motion. In: 2019 IEEE conference on virtual reality and 3D user interfaces (VR), pp 1872–1873. IEEE. https://doi.org/10.1109/VR.2019.8797800. https://ieeexplore.ieee.org/document/8797800/

  117. Studarus L (2018) How the Finnish survive without small talk. http://www.bbc.com/travel/story/20181016-how-the-finnish-survive-without-small-talk

  118. Tervon T, Sulander P (2014) Spectacle wear, airbag deployment and eye trauma. http://iovs.arvojournals.org/article.aspx?articleid=2271072

  119. Thales: InFlyt experience seatback displays (2018). https://www.thalesgroup.com/en/global/activities/aerospace/thales-inflyt-experience

  120. The 360 guy: the ultimate VR headset comparison table: every VR headset compared (2019). https://www.threesixtycameras.com/vr-headset-comparison-table/

  121. Toppan R, Chiesa M (2015) Integrating a Touchless UI in the automotive environment. In: Adjunct proceedings of the 7th international conference on automotive user interfaces and interactive vehicular applications, AutomotiveUI ’15. http://www.auto-ui.org/15/p/workshops/5/toppan.pdf

  122. Toyota Belgium: Window to the world multimedia system (2011). https://www.youtube.com/watch?v=dl9eqdZpvJU

  123. Trades Union Congress (2015) Number of commuters spending more than two hours travelling to and from work up by 72% in last decade, says TUC. https://www.tuc.org.uk/workplace-issues/work-life-balance/number-commuters-spending-more-two-hours-travelling-and-work-72

  124. Tsuda Y, Wakiyama H, Amemiya T (1999) Ocular injury caused by an air bag for a driver wearing eyeglasses, pp 239–240. http://www.ncbi.nlm.nih.gov/pubmed/10413260

  125. TUC: Long commutes up by a third (2016). https://www.tuc.org.uk/news/long-commutes-third-finds-tuc

  126. U.S. Department of Transportation (2009) Summary of travel trends

  127. VIVE Blog (2018) Introducing the Logitech BRIDGE SDK. https://blog.vive.com/us/2017/11/02/introducing-the-logitech-bridge-sdk/

  128. VRChat: VRChat social VR application (2018). https://www.vrchat.net/

  129. Wang S, Song J, Lien J, Poupyrev I, Hilliges O (2016) Interacting with soli: exploring fine-grained dynamic gesture recognition in the radio-frequency spectrum. In: Proceedings of the 29th annual symposium on user interface software and technology, UIST ’16. ACM, New York, NY, USA, pp 851–860. https://doi.org/10.1145/2984511.2984565

  130. Watts L, Urry J (2008) Moving methods, travelling times, pp 860–874. SAGE Publications. https://doi.org/10.1068/d6707. http://epd.sagepub.com/lookup/doi/10.1068/d6707

  131. Wienrich C, Zachoszcz M, Schlippe Mv, Packhäuser R (2017) Pilotstudie: Einsatz von mobilen VR-Anwendungen in gleichmäßig und ruhig bewegten Transportsystemen [Pilot study: use of mobile VR applications in smoothly and steadily moving transport systems]. Gesellschaft für Informatik e.V.

  132. Wilfinger D, Meschtscherjakov A, Murer M, Osswald S, Tscheligi M (2011) Are we there yet? A probing study to inform design for the rear seat of family cars. Springer, Berlin, pp 657–674. https://doi.org/10.1007/978-3-642-23771-3_48. http://link.springer.com/10.1007/978-3-642-23771-3_48

  133. Williamson JR, Crossan A, Brewster S (2011) Multimodal mobile interactions: usability studies in real world settings. In: Proceedings of the 13th international conference on multimodal interfaces, ICMI ’11. ACM, New York, NY, USA, pp 361–368. https://doi.org/10.1145/2070481.2070551

  134. Williamson JR, McGill M, Outram K (2019) Planevr: social acceptability of virtual reality for aeroplane passengers. In: Proceedings of the 2019 CHI conference on human factors in computing systems, CHI ’19. ACM, New York, NY, USA, pp 80:1–80:14. https://doi.org/10.1145/3290605.3300310

  135. Wilson G, McGill M, Jamieson M, Williamson JR, Brewster SA (2018) Object manipulation in virtual reality under increasing levels of translational gain. In: Proceedings of the 2018 CHI conference on human factors in computing systems, CHI ’18. ACM, New York, NY, USA, pp 99:1–99:13. https://doi.org/10.1145/3173574.3173673

  136. Wilson G, McGill M, Jamieson M, Williamson JRR, Brewster SA (2018) Object manipulation in virtual reality under increasing levels of translational gain. In: Proceedings of CHI ’18. ACM Press, New York, New York, USA. https://doi.org/10.1145/3173574.3173673. http://dl.acm.org/citation.cfm?doid=3173574.3173673

  137. Zhang LL, Wang JQ, Qi RR, Pan LL, Li M, Cai YL (2016) Motion sickness: current knowledge and recent advance, pp 15–24. https://doi.org/10.1111/cns.12468


Funding

This research was funded in part by the EPSRC IAA (303740) and ESRC IAA (77563/1) joint project “CarVR: Immersion in the Journey”. This project also received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (Grant Agreement No. 835197 - ViAjeRo).

Author information

Correspondence to Mark McGill.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

McGill, M., Williamson, J., Ng, A. et al. Challenges in passenger use of mixed reality headsets in cars and other transportation. Virtual Reality (2019). https://doi.org/10.1007/s10055-019-00420-x


Keywords

  • Virtual reality
  • Augmented reality
  • Mixed reality
  • Transportation
  • Passenger
  • In-car
  • In-flight
  • Travel