Mobile robot interception using human navigational principles: Comparison of active versus passive tracking algorithms

Published in Autonomous Robots.

Abstract

We examined human navigational principles for intercepting a projected object and tested their application in the design of navigational algorithms for mobile robots. These perceptual principles utilize a viewer-based geometry that allows the robot to approach the target without the need for time-consuming calculations to determine the world coordinates of either itself or the target. Human research supports the use of an Optical Acceleration Cancellation (OAC) strategy to achieve interception. Here, the fielder selects a running path that nulls out the acceleration of the retinal image of an approaching ball, maintaining an image that rises at a constant rate throughout the task. We compare two robotic control algorithms for implementing the OAC strategy in cases in which the target remains in the sagittal plane, headed directly toward the robot (which only moves forward or backward). In the "passive" algorithm, the robot keeps the orientation of the camera constant, and the image of the ball rises at a constant rate. In the "active" algorithm, the robot maintains a camera fixation that is centered on the image of the ball and keeps the tangent of the camera angle rising at a constant rate. Performance was superior with the active algorithm in both computer simulations and trials with actual mobile robots. The performance advantage is principally due to the higher gain and effectively wider viewing angle when the camera remains centered on the ball image. The findings confirm the viability and robustness of human perceptual principles in the design of mobile robot algorithms for tasks like interception.
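The OAC strategy described in the abstract rests on a geometric invariant of drag-free projectile motion: for an observer standing exactly at the landing point, the tangent of the ball's elevation angle rises at a constant rate, whereas for an observer standing short of the landing point it accelerates upward. A minimal sketch of this invariant (hypothetical launch parameters; air drag ignored, as in the idealized analysis):

```python
G = 9.81  # gravitational acceleration (m/s^2)

def tan_elevation(t, vx, vz, observer_x):
    """Tangent of the ball's elevation angle as seen from observer_x at time t."""
    x = vx * t                     # horizontal position of the ball
    z = vz * t - 0.5 * G * t * t   # height of the ball
    return z / (observer_x - x)

# Hypothetical launch: 8 m/s horizontal, 12 m/s vertical, in the observer's sagittal plane.
vx, vz = 8.0, 12.0
T = 2.0 * vz / G          # time of flight
R = vx * T                # landing distance

ts = [T * i / 100.0 for i in range(1, 90)]

# Observer exactly at the landing point: the optical image rises at a constant
# rate, so optical acceleration is already nulled and no running is required.
tans = [tan_elevation(t, vx, vz, R) for t in ts]
diffs = [b - a for a, b in zip(tans, tans[1:])]
assert max(diffs) - min(diffs) < 1e-9   # first differences are constant

# Observer standing short of the landing point (sampled before the fly-over):
# tan(theta) accelerates upward, which the OAC rule reads as "move backward".
ts_short = [T * i / 100.0 for i in range(1, 60)]
tans_short = [tan_elevation(t, vx, vz, 0.7 * R) for t in ts_short]
dd = [c - 2.0 * b + a for a, b, c in zip(tans_short, tans_short[1:], tans_short[2:])]
assert min(dd) > 0.0   # positive optical acceleration: ball will land behind
```

The same invariant underlies both algorithms compared in the paper: the passive version reads the rising image height off a fixed camera (roughly proportional to this tangent), while the active version recovers the tangent from the pan angle of a camera kept centered on the ball.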



Author information


Thomas Sugar works in the areas of mobile robot navigation and wearable robotics that assist the gait of stroke survivors. In mobile robot navigation, he is interested in combining human perceptual principles with mobile robotics. He majored in business and mechanical engineering for his Bachelor's degrees and in mechanical engineering for his Doctoral degree, all from the University of Pennsylvania. In industry, he worked as a project engineer for W. L. Gore and Associates. He has been a faculty member in the Department of Mechanical and Aerospace Engineering and the Department of Engineering at Arizona State University. His research is currently funded by three grants from the National Science Foundation and the National Institutes of Health, and focuses on perception and action, and on wearable robots using tunable springs.

Michael McBeath works at the intersection of psychology and engineering. He majored in both fields for his Bachelor's degree from Brown University and again for his Doctoral degree from Stanford University. Parallel to his academic career, he worked as a research scientist at NASA Ames Research Center and at the Interval Corporation, a technology think tank funded by Microsoft co-founder Paul Allen. He has been a faculty member in the Department of Psychology at Kent State University and at Arizona State University, where he is Program Director for the Cognition and Behavior area and serves on the Executive Committee for the interdisciplinary Arts, Media, and Engineering program. His research is currently funded by three grants from the National Science Foundation, and focuses on perception and action, particularly in sports. He is best known for his research on navigational strategies used by baseball players, animals, and robots.


About this article

Cite this article

Sugar, T.G., McBeath, M.K., Suluh, A. et al. Mobile robot interception using human navigational principles: Comparison of active versus passive tracking algorithms. Auton Robot 21, 43–54 (2006). https://doi.org/10.1007/s10514-006-8487-8


Keywords

Navigation