Performance Evaluation and Metrics for Perception in Intelligent Manufacturing

  • Roger Eastman
  • Tsai Hong
  • Jane Shi
  • Tobias Hanning
  • Bala Muralikrishnan
  • S. Susan Young
  • Tommy Chang


An unsolved but important problem in intelligent manufacturing is dynamic pose estimation under complex environmental conditions—tracking an object's position and orientation as it moves through an environment with uncontrolled lighting and background. This is a central task in robotic perception, and a robust, highly accurate solution would be of use in a number of manufacturing applications. To be commercially feasible, a solution must also be benchmarked against performance standards so manufacturers fully understand its nature and capabilities. The PerMIS 2008 Special Session on “Performance Metrics for Perception in Intelligent Manufacturing,” held August 20, 2008, brought together academic, industrial, and governmental researchers interested in calibrating and benchmarking vision and metrology systems. The special session featured a series of speakers, each addressing a component of the general problem of benchmarking complex perception tasks, including dynamic pose estimation. The components included assembly-line motion analysis, camera calibration, laser tracker calibration, super-resolution range data enhancement and evaluation, and evaluation of 6DOF pose estimation for visual servoing. This chapter combines and summarizes the results of the special session, giving a framework for benchmarking perception systems and relating the individual components to the general framework.
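To make the benchmarking idea concrete, a 6DOF pose-estimation evaluation typically compares each estimated pose against a ground-truth pose from an independent reference instrument (e.g., a laser tracker) and reports separate translation and rotation errors. The sketch below is not taken from the chapter; it is a minimal illustration of such a metric, with the helper names `pose_error` and `rmse` chosen here for illustration.

```python
import numpy as np

def pose_error(T_est, T_ref):
    """Compare two 4x4 homogeneous poses.

    Returns (translation error in the poses' length units,
    rotation error in radians)."""
    # Translation error: Euclidean distance between the two origins.
    dt = float(np.linalg.norm(T_est[:3, 3] - T_ref[:3, 3]))
    # Rotation error: angle of the relative rotation R_est @ R_ref^T,
    # recovered from its trace; clip guards against round-off.
    R_rel = T_est[:3, :3] @ T_ref[:3, :3].T
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return dt, float(np.arccos(cos_theta))

def rmse(errors):
    """Root-mean-square of a sequence of scalar errors,
    a common summary statistic over a tracked trajectory."""
    e = np.asarray(errors, dtype=float)
    return float(np.sqrt(np.mean(e * e)))
```

For a dynamic benchmark, `pose_error` would be applied frame by frame along the trajectory and the per-frame errors summarized with `rmse` (or maximum error), keeping translation and rotation statistics separate since they have different units.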


Keywords: Visual Servoing · Laser Tracker · Target Discrimination · Range Camera · Performance Evaluation Test



Copyright information

© Springer Science+Business Media, LLC 2009

Authors and Affiliations

  • Roger Eastman (1)
  • Tsai Hong (2)
  • Jane Shi (3)
  • Tobias Hanning (4)
  • Bala Muralikrishnan (2)
  • S. Susan Young (5)
  • Tommy Chang (2)

  1. Loyola University Maryland, Maryland, USA
  2. National Institute of Standards and Technology, Gaithersburg, USA
  3. General Motors Research and Development, Detroit, USA
  4. University of Passau, Passau, Germany
  5. Army Research Laboratory, Adelphi, USA
