
Emerging Human-Toy Interaction Techniques with Augmented and Mixed Reality

Part of the International Series on Computer Entertainment and Media Technology book series (ISCEMT)


In this book chapter, we review emerging technologies that promote richer interaction between toys and their players. Cutting-edge display technologies play a major role in human-toy interaction; in particular, reality-virtuality technologies such as Augmented Reality (AR) and Mixed Reality (MR) have been adopted in both digital entertainment and physical toys. With advances in portable and wearable devices, reality-virtuality technologies have become more popular and more deeply woven into daily life. We identify a variety of interaction techniques, such as depth sensing and haptic devices, and introduce example technologies, devices, and products. However, displayed or projected virtual objects cannot give the user a sense of touch; additional apparatus, such as a haptic stylus, is necessary to interact with them. We foresee that in the near future more kinds of virtual senses (such as taste) could be simulated and become part of the toy.
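The distinction between AR and MR drawn above is commonly framed in terms of Milgram and Kishino's reality-virtuality continuum, which spans from a purely real environment to a fully virtual one. As an illustrative sketch only (the numeric "virtuality" scores and example devices are assumptions for illustration, not values defined in the chapter), placing a display on that continuum might look like:

```python
# Illustrative sketch of Milgram & Kishino's reality-virtuality continuum.
# The virtuality scores and example devices below are assumed for
# illustration; the continuum itself defines no numeric scale.

from dataclasses import dataclass


@dataclass
class Display:
    name: str
    virtuality: float  # 0.0 = purely real environment, 1.0 = fully virtual


def classify(display: Display) -> str:
    """Place a display on the continuum: real, AR, AV, or virtual."""
    v = display.virtuality
    if v == 0.0:
        return "Real Environment"
    if v < 0.5:
        return "Augmented Reality (AR)"    # mostly real, virtual overlay
    if v < 1.0:
        return "Augmented Virtuality (AV)"  # mostly virtual, some real input
    return "Virtual Environment (VR)"


def is_mixed_reality(display: Display) -> bool:
    # Everything strictly between the two extremes counts as Mixed Reality.
    return 0.0 < display.virtuality < 1.0


examples = [
    Display("physical toy", 0.0),
    Display("AR toy blaster", 0.3),
    Display("3D videoconferencing", 0.7),
    Display("VR headset", 1.0),
]
for d in examples:
    print(f"{d.name}: {classify(d)} (MR: {is_mixed_reality(d)})")
```

Under this framing, both AR and augmented virtuality fall inside MR, while a purely physical toy and a fully immersive VR headset sit at the two endpoints.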


  • Augmented Reality
  • Interaction Techniques
  • Feedback
  • Virtual Senses
  • Toy Computing

  • DOI: 10.1007/978-3-319-21323-1_5
  • Chapter length: 29 pages




Corresponding author

Correspondence to Jeff K. T. Tang.


Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Tang, J., Tewell, J. (2015). Emerging Human-Toy Interaction Techniques with Augmented and Mixed Reality. In: Hung, P. (eds) Mobile Services for Toy Computing. International Series on Computer Entertainment and Media Technology. Springer, Cham.


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-21322-4

  • Online ISBN: 978-3-319-21323-1

  • eBook Packages: Computer Science (R0)