A method of character input for the user interface with a low degree of freedom

  • Shogo Matsuno
  • Susumu Chida
  • Naoaki Itakura
  • Tota Mizuno
  • Kazuyuki Mito
Original Article

Abstract

Smart devices equipped with small touch panels have become very popular, and most of them rely on a software keyboard for character input. Software keyboards have an inherent limitation: because each button corresponds to exactly one input value, the number of buttons equals the number of input degrees of freedom. Consequently, if the screen is made smaller while the button size is kept constant, the number of buttons must decrease; if the number of buttons is kept constant instead, each button becomes smaller and input becomes increasingly difficult. In this study, we investigate a new character input method designed specifically for small screens. The proposed input interface provides 4 × 2 operational degrees of freedom, consisting of four buttons and two actions. By treating two consecutive operations as a single input, 64 input options are obtained. We experimentally evaluate the proposed character input user interface on a smart device. The proposed method enables input of approximately 25 characters per minute and maintains robust input performance on a small screen, making it better suited to small screens than previous software keyboard methods.
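
The combinatorial scheme behind the 64 input options can be illustrated with a short sketch. The Python snippet below is a minimal illustration under stated assumptions, not the authors' implementation: the button labels, the two gesture names ("tap" and "flick"), and the 64-entry character table are hypothetical placeholders.

    # Minimal sketch of the two-operation input scheme described in the abstract.
    # Assumptions (not from the paper): each operation selects one of four buttons
    # combined with one of two gestures, giving 4 x 2 = 8 values per operation;
    # a pair of consecutive operations is decoded as a single character, so
    # 8 x 8 = 64 characters can be distinguished.
    from itertools import product

    BUTTONS = ["B1", "B2", "B3", "B4"]   # four on-screen buttons
    ACTIONS = ["tap", "flick"]           # two actions per button

    # Eight primitive operations (button, action) available in one step.
    OPERATIONS = list(product(BUTTONS, ACTIONS))   # length 8

    # A 64-entry character table used only for illustration.
    CHARSET = (
        [chr(c) for c in range(ord("a"), ord("a") + 26)]    # 26 lowercase letters
        + [chr(c) for c in range(ord("A"), ord("A") + 26)]  # 26 uppercase letters
        + [str(d) for d in range(10)]                        # 10 digits
        + [" ", "."]                                         # space and period
    )
    assert len(CHARSET) == 64

    def decode(first, second):
        """Map two consecutive operations to one character (one input = two operations)."""
        index = OPERATIONS.index(first) * len(OPERATIONS) + OPERATIONS.index(second)
        return CHARSET[index]

    # Example: the pair (B1, tap) followed by (B3, flick) yields one character.
    print(decode(("B1", "tap"), ("B3", "flick")))

In this encoding, each of the 8 × 8 ordered operation pairs indexes one entry of a 64-character table, which is how a layout with only four buttons can still cover a full alphanumeric character set.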

Keywords

Character input · Low input degree of freedom · User interface layout


Copyright information

© International Society of Artificial Life and Robotics (ISAROB) 2018

Authors and Affiliations

  • Shogo Matsuno (1, 3, 4)
  • Susumu Chida (2)
  • Naoaki Itakura (2)
  • Tota Mizuno (2)
  • Kazuyuki Mito (2)
  1. Toyohashi University of Technology, Toyohashi, Japan
  2. The University of Electro-Communications, Chofu, Japan
  3. Hottolink, Inc., Chiyoda, Japan
  4. Tokyo Denki University, Adachi, Japan