AmbientBrowser: Web Browser in Everyday Life

  • Satoshi Nakamura
  • Mitsuru Minakuchi
  • Katsumi Tanaka
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3864)

Abstract

Owing to remarkable advances in technology, the ubiquitous computing environment is becoming a reality, and people can obtain information anytime from ubiquitous computers. However, the conventional computing style based on a keyboard and a mouse is not suitable for everyday use. We proposed and developed a Web browser, the AmbientBrowser system, that supports people in their daily acquisition of knowledge. It continuously searches Web pages using both system-defined and user-defined keywords and displays them, while sensors detect user and environmental conditions and control the system's behavior, such as knowledge selection and presentation style. Users can thus encounter a wide variety of knowledge without any active operation. The system monitors environmental context, such as lighting conditions and temperature, and displays Web pages incrementally in proportion to that context. This paper describes the implementation of the AmbientBrowser system and discusses its effects.
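As a rough sketch of how such a context-driven, incremental display might work (an illustration only, not the authors' implementation; the keyword list and the functions fetch_page_text, read_light_level, and reveal are hypothetical names), the following Python snippet shows a page fetched for a keyword being revealed in proportion to a sensed context value such as ambient light:

    # Hypothetical sketch of the AmbientBrowser idea: reveal a page's text
    # incrementally in proportion to a sensed context value (e.g. ambient light).
    # All names here are illustrative and do not come from the paper.

    import random
    import textwrap
    import time

    KEYWORDS = ["ubiquitous computing", "ambient display"]  # user-defined keywords

    def fetch_page_text(keyword: str) -> str:
        """Stand-in for a Web search and page fetch for the given keyword."""
        return (f"Placeholder article about {keyword}. " * 20).strip()

    def read_light_level() -> float:
        """Stand-in for a light sensor; returns a value in [0.0, 1.0]."""
        return random.random()

    def reveal(text: str, fraction: float) -> str:
        """Return the leading portion of the text, proportional to the context."""
        cutoff = int(len(text) * max(0.0, min(1.0, fraction)))
        return textwrap.shorten(text[:cutoff] or " ", width=120, placeholder=" ...")

    if __name__ == "__main__":
        page = fetch_page_text(random.choice(KEYWORDS))
        for _ in range(3):                      # periodic, passive update loop
            light = read_light_level()          # brighter room -> more text shown
            print(f"[light={light:.2f}] {reveal(page, light)}")
            time.sleep(1)

Each loop iteration re-reads the sensor, so the amount of text shown passively tracks the environment rather than any explicit user operation.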

Keywords

Target Image, Weight Attribute, Banner Advertisement, Ubiquitous Computing Environment, Keyword List
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.


Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Satoshi Nakamura (1)
  • Mitsuru Minakuchi (1)
  • Katsumi Tanaka (1, 2)
  1. National Institute of Information and Communications Technology, Kyoto, Japan
  2. Graduate School of Informatics, Kyoto University, Kyoto, Japan
