Computer Supported Cooperative Work (CSCW), Volume 24, Issue 6, pp. 515–525

Collaboration in Augmented Reality

  • Stephan Lukosch
  • Mark Billinghurst
  • Leila Alem
  • Kiyoshi Kiyokawa
Open Access


Augmented Reality (AR) is a technology that allows users to view and interact in real time with virtual images seamlessly superimposed over the real world. AR systems can be used to create unique collaborative experiences. For example, co-located users can see and interact with shared 3D virtual objects, or a user can annotate the live video view of a remote worker, enabling them to collaborate at a distance. The overall goal is to augment the face-to-face collaborative experience, or to enable remote people to feel that they are virtually co-located. In this special issue on collaboration in augmented reality, we begin with the visions of science fiction authors of future technologies that might significantly improve collaboration, then introduce research articles that describe progress towards these visions, and finally outline a research agenda for the work still to be done.

Key words

Collaboration · Augmented Reality

1 Introduction

In the Otherland saga, Tad Williams (1996, 1998, 1999, 2001) describes a future world with widespread availability of full-immersion Virtual Reality (VR) installations which allow people to access an online world, called simply ‘the Net’. Within the Net, a group of people aim to achieve immortality. In his novel Rainbows End (Vinge 2007), Vernor Vinge describes how the main character, Robert Gu, slowly recovers from Alzheimer’s disease and adapts to a changed world in which almost every object is networked and the use of Augmented Reality (AR) is normal. Humans experience AR by wearing smart clothes and contact lenses that can overlay the physical environment with computer graphics. In this future world, AR technology is used for various purposes, such as large-scale commercial gaming, supporting maintenance workers with blueprints of machines or buildings, communication with virtual avatars, and medical diagnosis.

AR allows users to see the real world, with virtual objects superimposed upon or composited with their real environment (Azuma 1997). Here the virtual objects are computer graphics that exist in essence or effect, but not formally or actually (Milgram and Kishino 1994). AR systems are not limited to head-mounted displays (HMDs); rather, AR technology has to combine real and virtual objects, be interactive in real time, and register virtual objects within the real 3D environment (Azuma 1997).
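The registration requirement can be made concrete with a small sketch. The following Python fragment (all names hypothetical; it assumes a calibrated pinhole camera whose world-to-camera pose is supplied by some tracking system, which is not how any particular system in this issue is implemented) projects a world-anchored virtual point into the live camera image. This is the core computation that keeps a virtual object visually fixed in the real 3D environment:

```python
import numpy as np

def project_virtual_point(p_world, R_cw, t_cw, K):
    """Project a world-anchored virtual point into the camera image.

    p_world: 3-vector, position of the virtual object in world coordinates.
    R_cw, t_cw: rotation (3x3) and translation (3-vector), world -> camera.
    K: 3x3 camera intrinsic matrix (from calibration).
    Returns (u, v) pixel coordinates where the virtual point is drawn.
    """
    p_cam = R_cw @ p_world + t_cw            # world -> camera coordinates
    uvw = K @ p_cam                          # camera -> homogeneous image coords
    return uvw[0] / uvw[2], uvw[1] / uvw[2]  # perspective divide

# Illustrative values: identity pose, simple pinhole camera
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.zeros(3)
u, v = project_virtual_point(np.array([0.0, 0.0, 2.0]), R, t, K)
# A point 2 m straight ahead lands at the image centre (320, 240)
```

As the tracked pose (R, t) changes frame by frame, re-running this projection keeps the overlay locked to its physical anchor point rather than to the screen, which is what distinguishes registered AR content from a simple head-up overlay.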

Williams and Vinge forecast a vision for the future that current research on collaboration in AR is addressing. Several years from now, technology will provide an infrastructure for physical and virtual connectivity just as described in Rainbows End. Everyday objects will be connected and be able to provide and exchange information. Instead of only overlaying the physical environment with computer graphics, future AR systems will move closer to the ultimate display that Ivan Sutherland envisioned as “a room within which the computer can control the existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be confining, and a bullet displayed in such a room would be fatal.” (Sutherland 1965). Such a display would allow for holistic embodied experiences addressing all of the human senses, such as sound, smell, taste and touch, as envisioned in Otherland. Early science fiction motion pictures, e.g., Star Wars (Lucas 1977), depicted such holistic experiences supporting the interaction and collaboration of geographically distributed persons. When this fiction comes true, the use of and interaction within AR environments will be as natural as in the real world.

First steps towards the combined vision of Williams and Vinge have already been taken. There has been research on introducing smell (Kim et al. 2011) and touch (Samur 2012) into movie theaters and television. However, one of the most difficult aspects to reproduce is realistic interaction with other (real or virtual) humans. Olson and Olson (2000, 2014) analysed the efforts of geographically distributed teams to use technology to create a sense of being in one place and working together. They came to the conclusion that distance matters, that current technology is not mature enough to enable virtual co-presence, and that even future technology will struggle to enable this. In their opinion, providing awareness among co-workers and enabling co-reference as well as spatial referencing will remain a challenge. Gaver (1991) stresses the importance of supporting awareness information to help actors shift from working alone to working together.

Considering current groupware technology, this forecast is still correct. Complex problem solving still requires a team of experts to physically meet and interact with each other. The identification of the problem and the creation of a shared understanding are major challenges for efficiently solving a problem (Piirainen et al. 2012). Unfortunately, due to experts’ availability, critical timing issues or the accessibility of a location, it is not always possible to bring a team together to handle a complex situation. While in the novel Rainbows End such situations are supported with AR technology, current AR experiences are not there yet.

This special issue includes four articles that focus on remote collaboration using different types of AR systems. They advance the state of the art and provide insights about how AR can be used for remote guidance, which design factors are critical, and what the impact on awareness is. Most importantly, they show how close current AR systems are to the future science fiction visions described above. After reviewing the state of the art of collaboration using AR systems, this introduction builds upon the articles in this special issue and outlines a research agenda for future work on remote collaboration supported by AR systems.

2 Background

Several studies have explored the effectiveness of using AR for complex tasks. For individual users, Baird and Barfield (1999) showed that AR can improve the effectiveness of assembly tasks. Similarly, Tang et al. (2003) showed that the use of AR improves performance and reduces mental effort in assembly tasks. Bauer et al. (1999) showed that in a distributed setting a remote user can effectively guide and direct a local user’s activities using an AR telepointer. Wang and Dunston (2011) showed that AR systems can improve performance time and mental effort in collaborative design tasks. Dong et al. (2013) found that AR facilitates communication and discussion of engineering processes. Alem and Li (2011) showed that AR can improve satisfaction with remote collaboration on physical tasks. Recent results on using AR for collaboration among crime scene investigators indicate that it supports mutual understanding, leads to consensus and supports hypothesis testing (Poelman et al. 2012).

There are several examples of using AR technology to support face-to-face collaboration. The Transvision system (Rekimoto 1996) allows multiple users to share computer-generated graphics on a table. The Collaborative Web Space (Billinghurst and Kato 1999, 2002) allows users in the same location to collaboratively browse the web while seeing the real world and communicating naturally about the visited pages. The Studierstube system (Schmalstieg et al. 2002; Szalavári et al. 1998) targets presentations and education and allows users to walk around shared virtual 3D scientific data superimposed over the real world. EMMIE (Butz et al. 1999) uses AR to connect people and devices by visualizing cross-device interactions, search, and privacy status during a meeting. VITA (Benko et al. 2004) supports multiple users in exploring scaled and full-size representations of an archaeological dig. More recent systems consider the use of AR in dynamic emergency response tasks (Nilsson et al. 2009) or within a board game, in which users play social games with physical game pieces using handheld devices on a tabletop (Huynh et al. 2009). Similarly, ARVita (Dong et al. 2013) enables multiple users around a table wearing HMDs to observe and interact with dynamic visual simulations of engineering processes. Some of the key lessons learned from these systems are that users can interact with shared AR content as naturally as with physical objects, that AR reduces the separation between task space and communication space, and that AR enhances natural face-to-face communication cues.

Examples of AR systems that support remote collaboration can be found in Billinghurst and Kato (1999), Höllerer et al. (1999), Stafford et al. (2006), Minatani et al. (2007), Poelman et al. (2012), Datcu et al. (2014a), or Billinghurst and Thomas (2011). For example, WearCom (Billinghurst and Kato 1999) enables users to see remote collaborators as virtual avatars in real space. Höllerer et al. (1999) allow indoor AR users to visualize the locations and paths of outdoor AR users, and to create shared annotations. Stafford et al. (2006) introduce a new interaction metaphor called “god-like interaction”, which supports multi-scale interaction between outdoor wearable AR users and indoor users using a tabletop projector, and aims at improving the sharing of situational and navigational information between remote users. Minatani et al. (2007) describe a system that enables two distributed users to play a tabletop game in an AR conferencing scenario. Poelman et al. (2012) introduce and evaluate an AR system that allows remote experts to collaborate with local investigators on a crime scene investigation in order to secure evidence. Along the same lines, Datcu et al. (2014a) present a system that supports situational awareness of cross-organisational teams in the security domain. Some of the key lessons learned from these systems are that AR technology can reproduce some of the spatial cues that are used in face-to-face collaboration but normally lost in remote conferencing systems, that AR can increase social presence compared to other technologies, and that AR allows remote collaborators to interact naturally in the local user’s real environment (Billinghurst and Thomas 2011).

In addition to the above systems, collaborative AR has also been used in several industry scenarios such as product design, maintenance or factory planning. The ReMote system (Alem et al. 2011; Huang et al. 2013) is a remote assistance system that allows a remote expert to assist and guide a local maintenance operator at a mine site in real time. The remote expert uses their own hands (hand gestures over a live video of the workspace) as well as annotations on still images to explain how the repair/maintenance task needs to be performed. The MagicMeeting system (Regenbrecht et al. 2002) is a multi-user AR system that allows up to four co-located users to hold a design review meeting. Fata Morgana (Klinker et al. 2002) supports five different scenarios for car designers, who can rotate a car, perform an overview evaluation, focus on details, discuss with colleagues, and compare designs. Roivis (Pentenrieder et al. 2007) is an AR application that supports factory design and planning between co-located users who share a view. DARCC (Hammad et al. 2009) is a distributed AR application that allows multiple users to simulate collaborative construction tasks, e.g., coordinating crane operations. Gauglitz et al. (2014) introduce a tablet-based system that incorporates a touch screen interface through which a remote user can navigate a physical environment and create world-aligned annotations to support maintenance. Oda et al. (2013) present a system for remote equipment maintenance. These systems show that collaborative AR technology could have significant impact in design and industrial applications.

In summary, AR technology is becoming mature enough to support a variety of complex collaboration scenarios (Carmigniani et al. 2011; Piekarski and Thomas 2009). However, there are still a number of open issues that need further research. One major issue in AR collaboration concerns the presence of remote users (Billinghurst and Thomas 2011). Local users may feel remote-controlled by the remote experts, while the experts may feel that they miss something when not physically present at the scene (Poelman et al. 2012). It is unknown whether the proven concepts of awareness support from traditional desktop collaboration (Schümmer and Lukosch 2007) can be transferred to AR interaction spaces, or whether completely new approaches are necessary. Billinghurst and Thomas (2011) also point out that further research is necessary on how users can interact within collaborative AR systems, for example, whether bare-hand interaction or interaction with a physical object is more effective in AR (Datcu et al. 2015).

Recent research has shown that virtual co-location is in fact possible. The system of Poelman et al. (2012) allows experts to spatially collaborate with others at any other place in the world without traveling, thereby creating the experience of being virtually co-located. In the field of crime scene investigation, a remote expert guides and collaborates with a colleague at a crime scene to collect evidence. In this setting, the local investigator wears a head-mounted display with a camera and shares their view with the remote colleague, while both can annotate what they see with virtual objects. Datcu et al. (2014a, b) use AR to establish virtual co-location and improve the telepresence of a remote colleague for police investigations, fire fighting or reconnaissance. In 3DReMoTe (Huang and Alem 2013; Huang et al. 2013), the expert sees a real-time 3D reconstruction of the remote workspace via an HMD. The hands of the expert are captured in 3D and co-located in the 3D remote workspace, allowing the expert to assist in spatial tasks. These projects show that AR can enable virtual co-location and allow experts at a distance to interact with local users to perform collaborative tasks.

3 Content of the special issue

The articles in this special issue present results that continue the line of research as outlined in the previous section. They provide novel insights on remote collaboration in augmented reality and bring us closer towards the combined vision of Tad Williams and Vernor Vinge.

In the first article, Pavel Gurevich, Joel Lanir and Benjamin Cohen describe the design and implementation of TeleAdvisor, a projection-based AR system for remote collaboration. Using a small projector and two cameras mounted on top of a tele-operated robotic arm at the local user’s side, TeleAdvisor allows a remote user to guide the local user in physical tasks that need to be performed. They show that their solution supports flexibility of movements, representational gestures, mixed perspectives, and control of both worker and helper. Based on their findings, they discuss the general implications for design of future AR remote collaboration systems.

Next, Matt Adcock and Chris Gunn present a similar approach for providing remote guidance to a local worker. Their system allows a remote expert to see the workplace from a local worker’s point of view. The remote expert can then draw AR annotations to guide the local worker. The annotations are projected onto the real world and can be made to stick to one physical location. Compared to the previous article, however, Adcock and Gunn do not use a robotic arm, but developed a system that can be worn by the local worker. In their user study, they evaluate user appreciation of a wearable system compared to a fixed AR system, as well as user appreciation of sticky annotations (i.e., digital annotations that stick to physical objects) compared to digital annotations that move with the user’s movement. Their evaluation suggests that users prefer the sticky annotations; however, it also shows that user performance is better with a fixed setup than with a wearable setup.

The two previous articles describe solutions to provide remote guidance to a local user. In remote guidance scenarios, the remote user often views the local scene from the point of view of the local user. In the third paper, Matthew Tait and Mark Billinghurst discuss the effect of view independence for the remote user in such a scenario. For that purpose, they present a collaborative AR system that allows a remote user to assist a local user in an object placement task. Unlike most other systems, their AR system allows the remote user to have an independent view into the shared task space. In a user study on remote assistance for a two-dimensional object placement task, they show that increased view independence leads to faster task completion and a decrease in time spent on communication during the task.

The final paper of the special issue by Stephan Lukosch, Heide Lukosch, Dragoş Datcu and Marina Cidota discusses the use of AR for improving the situational awareness of teams in the security domain. Lukosch et al. present a collaborative AR system that allows a local and a remote user to jointly annotate a local scene using AR. The remote user can provide additional information on the spot. The presented system has been evaluated in two rounds with experts from different operational units in the security domain. The evaluation shows that the AR system can successfully support information exchange in teams operating in the security domain and that it can improve team situational awareness. The evaluation also shows that the participating experts especially valued the guidance of the remote expert, as well as the possibility to better exchange information among teams from different organizations in the security domain.

4 Research agenda

The four articles in this special issue focus on remote collaboration scenarios and the possibility to provide remote assistance, guidance or additional information to local users. All the articles also highlight the necessity for future research. Gurevich et al. stress that it is necessary to further research the support for remote users and how virtual content is visualized. Adcock and Gunn point out that further research is necessary to understand which tasks are suitable for remote guidance and how the local site is presented to the remote user with regard to issues around presence and efficiency of support. Tait and Billinghurst conclude that further research is needed to improve the usability of the 3D user interface used by the remote expert. Lukosch et al. argue that further research is necessary to understand the impact of AR on team situational awareness. Looking further, they consider user scenarios beyond the remote guidance scenario and ask to what extent AR can be used in training situations and how realistic such environments have to be in order to achieve good training outcomes.

Future research on collaboration in AR will thus have to focus on exploring situations that are suitable for virtual co-location. Further experiments need to be conducted to explore how remote users can interact with local users and how their presence and awareness can be improved. Typically, a remote user perceives the local site via a standard desktop user interface. It is an open question whether more immersive 3D visualisations for remote users, e.g., using an HMD, will impact interaction, presence and awareness in virtual co-location scenarios.

Dubois et al. (1999) discuss the evolution of interaction paradigms from graphical user interfaces to tangible user interfaces and Augmented Reality. For future collaborative AR systems, there is a need to research which interaction paradigms will be most effective. Starting from proven concepts for 3D user interfaces (Bowman et al. 2004) and remote collaboration (Schümmer and Lukosch 2007), research will have to go beyond current AR systems and create effective 3D user interfaces for groups of multiple local as well as remote users. To enhance the virtual co-location experience and to facilitate remote guiding tasks, more research is needed on how to further increase the communication bandwidth between users. Current research efforts around multimodal interaction combining gestures, audio and annotations (Huang and Alem 2013), around using physical objects as a means of interaction (Datcu et al. 2015), or around providing remote users with full control of their view, enabling them to zoom in and out and change their point of view as they see fit (as pointed out by Tait and Billinghurst in this special issue), are early steps in this direction.

To enhance the perception of presence of remote users, the effect of stimulating other senses (e.g., olfactory) on the perception of presence and situational awareness needs to be researched. First steps in this direction have been taken by Narumi et al. (2011), who created an AR system that changes the perceived taste of a cookie by overlaying visual and olfactory information onto a real cookie.

Finally, more intuitive interaction possibilities among virtual and real objects need to be developed. Schraffenberger and van der Heide (2013) explored how real objects can affect virtual objects and vice versa. Using several examples, they argue that virtual and real objects can simulate influences that exist between real entities. Going even further, they give examples of how virtual and real objects can influence each other in new and imaginary ways. Based on these insights, future research needs to explore how distributed users can be empowered to interact with the environment and each other.

In summary, to create an AR system that supports remote collaboration and establishes virtual co-location, research on suitable scenarios, interaction paradigms, presence and situational awareness needs to be conducted. This special issue on collaboration in Augmented Reality can be seen as a stepping-stone toward the future vision outlined in the novels by Williams and Vinge. It is hoped that this special issue will encourage other researchers to become part of this fascinating research and help shape this future.


References

  1. Alem, Leila, and Jane Li (2011). A Study of Gestures in a Video-Mediated Collaborative Assembly Task. Advances in Human-Computer Interaction, vol. 2011, Article ID 987830, 7 pages.
  2. Azuma, Ronald T. (1997). A Survey of Augmented Reality. Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, pp. 355–385.
  3. Azuma, Ronald T., Yohan Baillot, Reinhold Behringer, Steven Feiner, Simon Julier, and Blair MacIntyre (2001). Recent advances in augmented reality. Computer Graphics and Applications, vol. 21, no. 6, pp. 34–47.
  4. Baird, K. M., and Woodrow Barfield (1999). Evaluating the effectiveness of augmented reality displays for a manual assembly task. Virtual Reality, vol. 4, no. 4, pp. 250–259.
  5. Bauer, Martin, Gerd Kortuem, and Zary Segall (1999). “Where Are You Pointing At?” A Study of Remote Collaboration in a Wearable Videoconference System. In Proceedings of the 3rd IEEE International Symposium on Wearable Computers, San Francisco, CA, USA, 18–19 October 1999, Washington, DC, USA: IEEE Computer Society, pp. 151–158.
  6. Benko, Hrvoje, Edward W. Ishak, and Steven Feiner (2004). Collaborative Mixed Reality Visualization of an Archaeological Excavation. In Proceedings of the 3rd IEEE/ACM International Symposium on Mixed and Augmented Reality, Arlington, VA, USA, 2–5 November 2004, Washington, DC, USA: IEEE Computer Society, pp. 132–140.
  7. Billinghurst, Mark and Hirokazu Kato (1999). Collaborative Mixed Reality. In Proceedings of the First International Symposium on Mixed Reality (ISMR’99). Mixed Reality – Merging Real and Virtual Worlds, Yokohama, Japan, 19–21 March 1999, Berlin, Germany: Springer Verlag, pp. 261–284.
  8. Billinghurst, Mark and Hirokazu Kato (2002). Collaborative augmented reality. Communications of the ACM, vol. 45, no. 7, pp. 64–70.
  9. Billinghurst, Mark and Bruce H. Thomas (2011). Mobile Collaborative Augmented Reality. In W. Huang and L. Alem (Eds.), Recent Trends of Mobile Collaborative Augmented Reality Systems, Dordrecht etc.: Springer, pp. 1–19.
  10. Bowman, Doug A., Ernst Kruijff, Joseph J. LaViola Jr, and Ivan Poupyrev (2004). 3D User Interfaces: Theory and Practice. Boston etc.: Addison-Wesley.
  11. Butz, Andreas, Tobias Höllerer, Steven Feiner, Blair MacIntyre, and Clifford Beshers (1999). Enveloping users and computers in a collaborative 3D augmented reality. In Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR’99), San Francisco, CA, USA, 20–21 October 1999, pp. 35–44.
  12. Carmigniani, Julie, Borko Furht, Marco Anisetti, Paolo Ceravolo, Ernesto Damiani, and Misa Ivkovic (2011). Augmented reality technologies, systems and applications. Multimedia Tools and Applications, vol. 51, no. 1, pp. 341–377.
  13. Datcu, Dragos, Marina Cidota, Heide Lukosch, and Stephan Lukosch (2014a). On the Usability of Augmented Reality for Information Exchange in Teams from the Security Domain. In IEEE Joint Intelligence and Security Informatics Conference (JISIC), The Hague, Netherlands, 24–26 September 2014, Washington, DC, USA: IEEE Computer Society, pp. 160–167.
  14. Datcu, Dragos, Marina Cidota, Heide Lukosch, and Stephan Lukosch (2014b). [Poster] Using Augmented Reality to Support Information Exchange of Teams in the Security Domain. In IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Munich, Germany, 10–12 September 2014, Washington, DC, USA: IEEE Computer Society, pp. 263–264.
  15. Datcu, Dragos, Stephan Lukosch, and Frances Brazier (2015). On the Usability and Effectiveness of Different Interaction Types in Augmented Reality. International Journal of Human-Computer Interaction, vol. 31, no. 3, pp. 193–209.
  16. Dong, Suyang, Amir H. Behzadan, Feng Chen, and Vineet R. Kamat (2013). Collaborative Visualization of Engineering Processes Using Tabletop Augmented Reality. Advances in Engineering Software, vol. 55, pp. 45–55.
  17. Dubois, Emmanuel, Laurence Nigay, Jocelyne Troccaz, Olivier Chavanon, and Lionel Carrat (1999). Classification Space for Augmented Surgery, an Augmented Reality Case Study. In A. Sasse & C. Johnson (Eds.), Proceedings of Interact’99, Edinburgh, UK, 30 August – 3 September 1999, IOS Press, pp. 353–359.
  18. Gauglitz, Steffen, Benjamin Nuernberger, Matthew Turk, and Tobias Höllerer (2014). In Touch with the Remote World: Remote Collaboration with Augmented Reality Drawings and Virtual Navigation. In Proceedings of the 20th ACM Symposium on Virtual Reality Software and Technology, Edinburgh, UK, 11–13 November 2014, New York, NY, USA: ACM, pp. 197–205.
  19. Gaver, William W. (1991). Sound Support for Collaboration. In Proceedings of the Second European Conference on Computer-Supported Cooperative Work, Norwell, MA, USA: Kluwer Academic Publishers, pp. 293–308.
  20. Hammad, Amin, Hui Wang, and Sudhir P. Mudur (2009). Distributed Augmented Reality for Visualizing Collaborative Construction Tasks. Journal of Computing in Civil Engineering, vol. 23, no. 6, pp. 418–427.
  21. Höllerer, Tobias, Steven Feiner, Tachio Terauchi, Gus Rashid, and Drexel Hallaway (1999). Exploring MARS: developing indoor and outdoor user interfaces to a mobile augmented reality system. Computers & Graphics, vol. 23, no. 6, pp. 779–785.
  22. Huang, Weidong and Leila Alem (2013). Gesturing in the Air: Supporting Full Mobility in Remote Collaboration on Physical Tasks. Journal of Universal Computer Science (J.UCS), vol. 19, no. 8, pp. 1158–1174.
  23. Huang, Weidong, Leila Alem, and Franco Tecchia (2013). HandsIn3D: Supporting Remote Guidance with Immersive Virtual Environments. In P. Kotzé, G. Marsden, G. Lindgaard, J. Wesson, & M. Winckler (Eds.), Human-Computer Interaction – INTERACT 2013, Cape Town, South Africa, 2–6 September 2013, Heidelberg etc.: Springer, pp. 70–77.
  24. Huynh, Duy-Nguyen Ta, Karthik Raveendran, Yan Xu, Kimberly Spreen, and Blair MacIntyre (2009). Art of defense: a collaborative handheld augmented reality board game. In Proceedings of the 2009 ACM SIGGRAPH Symposium on Video Games, New Orleans, LA, USA, 3–7 August 2009, New York, NY, USA: ACM, pp. 135–142.
  25. Kim, Hyunsu, Jongjin Park, Kunbae Noh, Calvin J. Gardner, Seong Deok Kong, Jongmin Kim, and Sungho Jin (2011). An X-Y Addressable Matrix Odor-Releasing System Using an On-Off Switchable Device. Angewandte Chemie, vol. 123, no. 30, pp. 6903–6907.
  26. Klinker, Gudrun, Allen H. Dutoit, Martin Bauer, Johannes Bayer, Vinko Novak, and Dietmar Matzke (2002). Fata Morgana – A Presentation System for Product Design. In Proceedings of the 1st International Symposium on Mixed and Augmented Reality, Darmstadt, Germany, 30 September – 1 October 2002, Washington, DC, USA: IEEE Computer Society, pp. 1–10.
  27. Lucas, George (1977). Star Wars [Motion Picture]. United States: 20th Century Fox.
  28. Milgram, Paul and Fumio Kishino (1994). A taxonomy of mixed reality visual displays. IEICE Transactions on Information Systems, vol. E77-D, no. 12, pp. 1321–1329.
  29. Minatani, Shinya, Itaru Kitahara, Yoshinari Kameda, and Yuichi Ohta (2007). Face-to-Face Tabletop Remote Collaboration in Mixed Reality. In Proceedings of the 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan, 13–16 November 2007, Washington, DC, USA: IEEE Computer Society, pp. 1–4.
  30. Narumi, Takuji, Shinya Nishizaka, Takashi Kajinami, Tomohiro Tanikawa, and Michitaka Hirose (2011). Meta Cookie+: An Illusion-Based Gustatory Display. In R. Shumaker (Ed.), Virtual and Mixed Reality – New Trends, Berlin, Heidelberg: Springer, pp. 260–269.
  31. Nilsson, Susanna, Björn Johansson, and Arne Jönsson (2009). Using AR to support cross-organisational collaboration in dynamic tasks. In 8th IEEE International Symposium on Mixed and Augmented Reality, Orlando, FL, USA, 19–22 October 2009, Washington, DC, USA: IEEE Computer Society, pp. 3–12.
  32. Oda, Ohan, Mengu Sukan, Steven Feiner, and Barbara Tversky (2013). Poster: 3D referencing for remote task assistance in augmented reality. In IEEE Symposium on 3D User Interfaces (3DUI 2013), Orlando, FL, USA, 16–17 March 2013, Washington, DC, USA: IEEE Computer Society, pp. 179–180.
  33. Olson, Gary M. and Judith S. Olson (2000). Distance Matters. Human-Computer Interaction, vol. 15, nos. 2–3, pp. 139–178.
  34. Olson, Judith S. and Gary M. Olson (2014). How to Make Distance Work Work. Interactions, vol. 21, no. 2, pp. 28–35.
  35. Pentenrieder, Katharina, Christian Bade, Fabian Doil, Peter Meier (2007). Augmented Reality-based factory planning - an application tailored to industrial needs. In Proceedings of the 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan, 13–16 November 2007, Washington, DC, USA: IEEE Computer Society, pp. 31–42.Google Scholar
  36. Piekarski, Wayne and Bruce H. Thomas (2009). Through-Walls Collaboration. IEEE Pervasive Computing, vol. 8, no. 3, pp. 42–49.CrossRefGoogle Scholar
  37. Piirainen, Kalle, Gwendolyn Kolfschoten, and Stephan Lukosch (2012). The Joint Struggle of Complex Engineering: A Study of the Challenges of Collaborative Design. International Journal of Information Technology & Decision Making (IJITDM), vol. 11, no. 6, pp. 1–39.Google Scholar
  38. Poelman, Ronald, Oytun Akman, Stephan Lukosch, and Pieter Jonker (2012). As if Being There: Mediated Reality for Crime Scene Investigation. In CSCW’12: Proceedings of the 2012 ACM Conference on Computer Supported Cooperative Work, Seattle, WA, USA, 11–15 February 2012, New York, NY, USA: ACM, pp. 1267–1276.
  39. Regenbrecht, Holger T., Michael T. Wagner, and Gregory Baratoff (2002). MagicMeeting: A Collaborative Tangible Augmented Reality System. Virtual Reality, vol. 6, no. 3, pp. 151–166.
  40. Rekimoto, Jun (1996). Transvision: a hand-held augmented reality system for collaborative design. In Proceedings of Virtual Systems and Multimedia (VSMM’96), Gifu, Japan, 18–20 September 1996.
  41. Samur, Evren (2012). State of the Art. In Performance Metrics for Haptic Interfaces. London, UK: Springer, pp. 9–26.
  42. Schmalstieg, Dieter, Anton Fuhrmann, Gerd Hesina, Zsolt Szalavári, L. Miguel Encarnacao, Michael Gervautz, and Werner Purgathofer (2002). The Studierstube Augmented Reality Project. Presence: Teleoperators and Virtual Environments, vol. 11, no. 1, pp. 33–54.
  43. Schraffenberger, Hanna and Edwin van der Heide (2013). From coexistence to interaction: influences between the virtual and the real in augmented reality. In Proceedings of the 19th International Symposium on Electronic Art (ISEA2013), Sydney, Australia, 7–16 June 2013.
  44. Schümmer, Till and Stephan Lukosch (2007). Patterns for Computer-Mediated Interaction. Chichester, West Sussex, England: John Wiley & Sons, Ltd.
  45. Stafford, Aaron, Wayne Piekarski, and Bruce H. Thomas (2006). Implementation of god-like interaction techniques for supporting collaboration between outdoor AR and indoor tabletop users. In Proceedings of the 5th IEEE and ACM International Symposium on Mixed and Augmented Reality, Santa Barbara, CA, USA, 22–25 October 2006, Washington, DC, USA: IEEE Computer Society, pp. 165–172.
  46. Sutherland, Ivan E. (1965). The Ultimate Display. In Proceedings of the Congress of the International Federation for Information Processing (IFIP), New York, USA, 24–29 May 1965, Washington, DC, USA: Spartan Books, vol. 2, pp. 506–508.
  47. Szalavári, Zsolt, Dieter Schmalstieg, Anton Fuhrmann, and Michael Gervautz (1998). “Studierstube”: An Environment for collaboration in augmented reality. Virtual Reality, vol. 3, no. 1, pp. 37–48.
  48. Tang, Arthur, Charles Owen, Frank Biocca, and Weimin Mou (2003). Comparative effectiveness of augmented reality in object assembly. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Fort Lauderdale, FL, USA, 5–10 April 2003, New York, NY, USA: ACM, pp. 73–80.
  49. Vinge, Vernor (2007). Rainbows End. New York City, USA: Tor Books.
  50. Wang, Xiangyu and Phillip S. Dunston (2011). Comparative Effectiveness of Mixed Reality-Based Virtual Environments in Collaborative Design. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 41, no. 3, pp. 284–296.
  51. Williams, Tad (1996). Otherland - City of Golden Shadow. London, UK: Legend Books.
  52. Williams, Tad (1998). Otherland - River of Blue Fire. London, UK: Legend Books.
  53. Williams, Tad (1999). Otherland - Mountain of Black Glass. London, UK: Legend Books.
  54. Williams, Tad (2001). Otherland - Sea of Silver Light. London, UK: Legend Books.

Copyright information

© The Author(s) 2015

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  • Stephan Lukosch (1)
  • Mark Billinghurst (2)
  • Leila Alem (3)
  • Kiyoshi Kiyokawa (4)

  1. Faculty of Technology, Policy and Management, Delft University of Technology, Delft, The Netherlands
  2. School of Information Technology and Mathematical Sciences, University of South Australia, Adelaide, Australia
  3. Thoughtworks, Sydney, Australia
  4. Osaka University, Suita, Japan