Design and Development of Eyes- and Hands-Free Voice Interface for Mobile Phone

  • Kengo Fujita
  • Tsuneo Kato
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6776)


This paper describes the design and development of our new eyes- and hands-free interface, which provides the fundamental functions of a mobile phone through voice interaction over a Bluetooth headset. We first identify four conditions that must be met for the interface to be acceptable to Japanese users. Next, we define design guidelines that address each of these conditions, and we propose and implement the interface system in accordance with them. To assess the effectiveness of the proposed interface, we had participants operate a mobile phone while walking and simultaneously monitoring a switching signal that either permitted or forbade them to walk. The experimental results showed that the proposed interface was more effective than conventional interfaces for operating a mobile phone while concurrently performing other tasks. Finally, we address the problems that participants pointed out during interviews.


Keywords: Mobile Phone · Voice Interface · Eyes-free Operation · Hands-free Operation · Bluetooth Headset · Design Process · Japanese Users



Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Kengo Fujita (1)
  • Tsuneo Kato (1)
  1. KDDI R&D Laboratories Inc., Fujimino, Japan