Emerging Human-Toy Interaction Techniques with Augmented and Mixed Reality

Part of the International Series on Computer Entertainment and Media Technology book series (ISCEMT)


In this book chapter, we review emerging technologies that promote rich interactions between toys and their players. Cutting-edge display technologies have taken a major role in human-toy interaction. In particular, reality-virtuality technologies such as Augmented Reality (AR) and Mixed Reality (MR) have been adopted in digital entertainment and physical toys. With advances in portable and wearable devices, reality-virtuality technologies have become more popular and more deeply embedded in our daily lives. Various interaction techniques, such as depth sensing and haptic feedback, are identified. We introduce example technologies, devices, and products for each. However, displayed or projected virtual objects cannot by themselves give the user a sense of touch; additional apparatus, such as haptic styli, is necessary to interact with them. We foresee that in the near future, more kinds of virtual senses (such as taste) could be simulated and become part of the toy.
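The AR/MR terminology above follows Milgram and Kishino's reality-virtuality continuum, which orders display technologies between a fully real and a fully virtual environment, with Mixed Reality spanning everything in between. The sketch below illustrates that ordering; the numeric positions are hypothetical placeholders chosen for illustration, not values from the chapter.

```python
# Illustrative sketch of Milgram's reality-virtuality continuum.
# Positions are hypothetical values in [0, 1]:
#   0.0 = purely real environment, 1.0 = purely virtual environment.
CONTINUUM = {
    "real environment": 0.0,
    "augmented reality (AR)": 0.25,   # virtual content overlaid on the real world
    "augmented virtuality (AV)": 0.75,  # real content embedded in a virtual world
    "virtual reality (VR)": 1.0,
}

def is_mixed_reality(position: float) -> bool:
    """Mixed Reality covers everything strictly between the two extremes."""
    return 0.0 < position < 1.0

for name, pos in sorted(CONTINUUM.items(), key=lambda kv: kv[1]):
    print(f"{name:30s} position={pos:.2f}  MR: {is_mixed_reality(pos)}")
```

Under this framing, both AR toys (virtual overlays on physical playthings) and AV systems fall inside the Mixed Reality band, while the two endpoints do not.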


Keywords: Augmented Reality · Interaction Techniques · Feedback · Virtual Senses · Toy Computing



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. School of Computing and Information Sciences, Caritas Institute of Higher Education, Hong Kong, China
  2. School of Mathematics, Computer Science & Engineering, City University London, United Kingdom
