Abstract
In this paper we present a framework for attaching information to physical objects in a way that can be interactively browsed and searched in a hands-free, multi-modal, and personalized manner that leverages users’ natural looking, pointing and reaching behaviors. The system uses small infrared transponders on objects in the environment and worn by the user to achieve dense, on-object visual feedback usually possible only in augmented reality systems, while improving on interaction style and requirements for wearable gear. We discuss two applications that have been implemented, a tutorial about the parts of an automobile engine and a personalized supermarket assistant. The paper continues with a user study investigating browsing and searching behaviors in the supermarket scenario, and concludes with a discussion of findings and future work.
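The abstract describes objects carrying small infrared transponders that report an identifier when the user looks at, points at, or reaches for them, after which the system delivers the information attached to that object, optionally filtered by a search query. The following Python sketch illustrates that core lookup step under stated assumptions: the tag IDs, annotation strings, and function names below are hypothetical illustrations, not details taken from the paper.

```python
# Hypothetical sketch of the selection-and-feedback step described in
# the abstract: a detected transponder ID is mapped to the annotation
# attached to that physical object, optionally filtered by an active
# spoken search query. All IDs, strings, and names are illustrative.

ANNOTATIONS = {
    0x01: "Alternator: converts mechanical energy into electricity.",
    0x02: "Radiator: dissipates heat from the engine coolant.",
}

def on_tag_detected(tag_id, query=None):
    """Return the annotation for the selected object, or None if the
    transponder is unknown or the object does not match the query."""
    text = ANNOTATIONS.get(tag_id)
    if text is None:
        return None  # unknown transponder: no feedback
    if query and query.lower() not in text.lower():
        return None  # object does not match the active search
    return text
```

In the browsing case the query is absent and every gazed-at or pointed-at object responds; in the searching case only objects matching the query would give feedback, which is the distinction the user study in the paper investigates.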
© 2007 Springer Berlin Heidelberg
Cite this paper
Merrill, D., Maes, P. (2007). Augmenting Looking, Pointing and Reaching Gestures to Enhance the Searching and Browsing of Physical Objects. In: LaMarca, A., Langheinrich, M., Truong, K.N. (eds) Pervasive Computing. Pervasive 2007. Lecture Notes in Computer Science, vol 4480. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72037-9_1
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-72036-2
Online ISBN: 978-3-540-72037-9