
Modeling Drone Crossing Movement with Fitts’ Law

  • Kaito Yamada
  • Hiroki Usuba
  • Homei Miyashita
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11786)

Abstract

Drones are increasingly used in commercial, scientific, recreational, agricultural, and military applications. Drone maneuvers involve numerous pointing and crossing operations. Previous studies have shown that drone pointing operations can be modeled by the two-part model. In this study, we conduct a crossing experiment in which participants fly a drone through a frame of a given target width, and we verify the applicability of Fitts’ law and the two-part model to drone crossing operations. Both models fit the crossing data well (\(R^2 > 0.940\)). Comparing the AIC values of the two models, we find that Fitts’ law, which has fewer parameters, is the better model for the crossing operation. Our results indicate that the drone operation time in crossing operations can be predicted accurately. In addition, these models allow drones to be compared and interfaces to be evaluated for drone crossing operations.
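The model comparison described above can be sketched as follows. This is a minimal illustration, not the authors' analysis code: the condition data (amplitudes, frame widths, movement times) are made up for demonstration, and the AIC formula used is the standard least-squares form \(AIC = n \ln(RSS/n) + 2k\), with Fitts’ law as \(MT = a + b \log_2(A/W + 1)\) and the two-part model as \(MT = a + b_1 \log_2 A + b_2 \log_2(1/W)\).

```python
import numpy as np

def fit_linear(X, y):
    """Ordinary least squares with an intercept; returns coefficients and RSS."""
    X1 = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    rss = float(np.sum((y - X1 @ coef) ** 2))
    return coef, rss

def aic(rss, n, k):
    """AIC for a least-squares fit with k estimated parameters."""
    return n * np.log(rss / n) + 2 * k

# Illustrative (made-up) condition means: amplitude A, frame width W, time MT.
A  = np.array([1.0, 1.0, 2.0, 2.0, 4.0, 4.0])   # movement amplitude (m)
W  = np.array([0.4, 0.8, 0.4, 0.8, 0.4, 0.8])   # frame width (m)
MT = np.array([2.1, 1.7, 2.8, 2.3, 3.6, 2.9])   # movement time (s)

# Fitts' law: MT = a + b * log2(A/W + 1)  -> 2 parameters
id_fitts = np.log2(A / W + 1)
coef_f, rss_f = fit_linear(id_fitts[:, None], MT)

# Two-part model: MT = a + b1*log2(A) + b2*log2(1/W)  -> 3 parameters
X_two = np.column_stack([np.log2(A), np.log2(1.0 / W)])
coef_t, rss_t = fit_linear(X_two, MT)

n = len(MT)
print("Fitts' law AIC:", aic(rss_f, n, 2))
print("Two-part   AIC:", aic(rss_t, n, 3))
```

Because the two-part model nests Fitts’ law as a special case, its residual error is never larger; AIC penalizes the extra parameter, which is how the paper can prefer the simpler Fitts’ law despite comparable fits.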

Keywords

Drone · Pointing · Crossing · User performance model · Fitts’ law · Human-drone interaction


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Meiji University, Tokyo, Japan
