Robot Surveillance and Security

Abstract

This chapter introduces the foundations of surveillance and security robots for a range of military and civilian applications. The key environmental domains are mobile robots for ground, aerial, surface-water, and underwater operation. Surveillance literally means to watch from above; surveillance robots are used to monitor the behavior, activities, and other changing information gathered for the general purpose of managing, directing, or protecting one’s assets or position. In a practical sense, the term surveillance is taken to mean the act of observation from a distance, whereas security robots are commonly used to protect and safeguard a location, valuable assets, or personnel against danger, damage, loss, and crime. Surveillance is a proactive operation, while security is a defensive one. The construction of each type of robot is similar in nature, with a mobility platform, a sensor payload, a communication system, and an operator control station.

After introducing the major robot components, this chapter focuses on the various applications. More specifically, Sect. 61.3 discusses the enabling technologies of mobile robot navigation, the various payload sensors used for surveillance or security applications, target detection and tracking algorithms, and the operator’s robot control console for the human–machine interface (HMI). Section 61.4 presents selected research activities relevant to surveillance and security, including automatic data processing of the payload sensors, automatic monitoring of human activities, facial recognition, and collaborative automatic target recognition (ATR). Finally, Sect. 61.5 discusses future directions in robot surveillance and security and offers some conclusions, followed by the references.

Abbreviations

2-D: two-dimensional
3-D: three-dimensional
AI: artificial intelligence
AR: autoregressive
ATR: automatic target recognition
AUV: autonomous underwater vehicle
CC&D: camouflage, concealment, and deception
CMU: Carnegie Mellon University
COMINT: communications intelligence
DCT: discrete cosine transform
EO: electrooptical
ESM: electronic support measures
FIRRE: family of integrated rapid response equipment
FLIR: forward-looking infrared
FOPEN: foliage penetration
GCS: ground control station
GIS: geographic information system
GMTI: ground moving target indicator
GPS: global positioning system
HMI: human–machine interface
HRR: high resolution radar
IFSAR: interferometric SAR
IIR: infinite impulse response
IR: infrared
ISAR: inverse SAR
ISR: intelligence, surveillance, and reconnaissance
KAIST: Korea Advanced Institute of Science and Technology
L/D: lift-to-drag
LAGR: learning applied to ground robots
LIDAR: light detection and ranging
LOS: line-of-sight
MAP: maximum a posteriori
MASINT: measurement and signatures intelligence
MDARS: mobile detection assessment and response system
MRHA: multiple resource host architecture
MTI: moving target indicator
OTH: over-the-horizon
PerceptOR: perception for off-road robotics
RF: radio frequency
RHIB: rigid hull inflatable boat
RSTA: reconnaissance, surveillance, and target acquisition
SAR: synthetic aperture radar
SBSS: space-based space surveillance
SIGINT: signals intelligence
SNR: signal-to-noise ratio
SPAWAR: Space and Naval Warfare Systems Center
UAV: unmanned aerial vehicle
UGV: unmanned ground vehicle
UHF: ultrahigh frequency
USV: unmanned surface vehicle
UUV: unmanned underwater vehicle
VHF: very high frequency


Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  1. Department of Electrical and Computer Engineering, University of Denver, Denver, USA
  2. Department of Computer Science and Engineering, University of Minnesota, Minneapolis, USA