Usability Design of a Scanning Interface for a Robot Used by Disabled Users

  • Anthony S. White
  • Stephen Prior
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4554)


We present the results of examining a scanning user interface for a rehabilitation robot, with command inputs in the form of head gestures, using variations of Fitts’ law, and compare these with a servo eye-tracking model. Calculations show that movement time is predicted more accurately in this case by the servo eye model. The response of the linearised eye model predicts a minimum usable scanning distance and a minimum spacing between displayed commands.


Keywords: scanning user interface · servo eye model · Fitts’ law · rehabilitation robotics · gestures
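The movement-time comparison in the abstract rests on Fitts’ law. As a point of reference, the standard (Shannon) formulation can be sketched as follows; the coefficient values below are illustrative only, since the paper’s fitted coefficients and its specific law variations are not given here.

```python
import math

def fitts_mt(a, b, distance, width):
    """Movement time under the Shannon formulation of Fitts' law:
    MT = a + b * log2(D/W + 1), where D is the movement amplitude,
    W is the target width, and a, b are empirically fitted constants."""
    return a + b * math.log2(distance / width + 1)

# Illustrative values only (not the paper's fitted coefficients):
mt = fitts_mt(a=0.1, b=0.15, distance=200, width=20)
```

A servo eye model instead predicts movement time from the dynamics of a linearised tracking loop, which is why it can impose hard limits such as a minimum scanning distance that a purely information-theoretic law does not.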





Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Anthony S. White (1)
  • Stephen Prior (2)
  1. School of Computing Science, Middlesex University, The Burroughs, Hendon, London, NW4 4BT
  2. Product Design and Engineering, Middlesex University, Bramley Rd, Trent Park, Enfield, London, N14 4YZ
