Interface of mixed reality: from the past to the future

  • Review Paper
  • Published in CCF Transactions on Pervasive Computing and Interaction

Abstract

Mixed reality (MR) is an emerging technology that could shape the future of our everyday lives through its unique approach to presenting information. Technology is changing rapidly: information can be presented on traditional computer screens following the WIMP (Windows, Icons, Menus, and Pointing) interface model, through a head-mounted display that presents virtual reality, or through MR, which presents information by combining virtual and physical elements. This paper classifies MR interfaces by applying a text mining method to a database of 4296 relevant research papers published over the last two decades. The classification reveals the trends within each topic and the relations between topics. The paper reviews earlier studies, discusses recent developments in each topic area, and summarizes the advantages and disadvantages of MR interfaces. Our objective is to help researchers understand the trend for each topic and focus on the research challenges where technological advancements in MR interfaces are most needed.
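
The abstract refers to a text mining method for classifying the corpus but does not detail the pipeline on this page. The sketch below is only an illustration of how such a topic classification could be run, assuming a topic model (here Latent Dirichlet Allocation via scikit-learn) applied to paper abstracts; the input file mr_papers.csv, its column names, and the choice of eight topics are hypothetical and not taken from the paper.

```python
# Illustrative sketch: classify MR papers into topics with LDA.
# The corpus file, column names, and topic count are hypothetical.
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

papers = pd.read_csv("mr_papers.csv")              # hypothetical corpus, one row per paper
texts = papers["abstract"].fillna("")

# Bag-of-words features, dropping very rare and very common terms.
vectorizer = CountVectorizer(stop_words="english", min_df=5, max_df=0.5)
X = vectorizer.fit_transform(texts)

# Fit the topic model and assign each paper to its dominant topic.
lda = LatentDirichletAllocation(n_components=8, random_state=0)
doc_topics = lda.fit_transform(X)                  # shape: (n_papers, n_topics)
papers["topic"] = doc_topics.argmax(axis=1)

# Inspect the top terms per topic to label the interface categories by hand.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[-10:][::-1]]
    print(f"Topic {k}: {', '.join(top_terms)}")
```

Grouping the assigned topics by publication year would then yield the per-topic trend curves that the review discusses.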

Author information

Corresponding author

Correspondence to Steven Szu-Chi Chen.

About this article

Cite this article

Chen, S.S.-C., Duh, H. Interface of mixed reality: from the past to the future. CCF Trans. Pervasive Comp. Interact. 1, 69–87 (2019). https://doi.org/10.1007/s42486-018-0002-8
