Discrete Event Dynamic Systems

Volume 19, Issue 3, pp 377–422

Partially Observable Markov Decision Process Approximations for Adaptive Sensing

Authors

  • Edwin K. P. Chong
    • Colorado State University
  • Christopher M. Kreucher
    • Integrity Applications Incorporated
  • Alfred O. Hero III
    • University of Michigan

DOI: 10.1007/s10626-009-0071-x

Cite this article as:
Chong, E.K.P., Kreucher, C.M. & Hero, A.O. Discrete Event Dyn Syst (2009) 19: 377. doi:10.1007/s10626-009-0071-x

Abstract

Adaptive sensing involves actively managing sensor resources to achieve a sensing task, such as object detection, classification, and tracking, and represents a promising direction for new applications of discrete event system methods. We describe an approach to adaptive sensing based on approximately solving a partially observable Markov decision process (POMDP) formulation of the problem. Such approximations are necessary because of the very large state space involved in practical adaptive sensing problems, precluding exact computation of optimal solutions. We review the theory of POMDPs and show how the theory applies to adaptive sensing problems. We then describe a variety of approximation methods, with examples to illustrate their application in adaptive sensing. The examples also demonstrate the gains that are possible from nonmyopic methods relative to myopic methods, and highlight some insights into the dependence of such gains on the sensing resources and environment.
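As context for the POMDP formulation mentioned in the abstract, the following is a minimal sketch of the belief-state (Bayes filter) update that underlies POMDP planning in adaptive sensing. All states, actions, and probabilities are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Toy POMDP with 2 hidden states (e.g., "target present" / "target absent"),
# 2 sensing actions, and 2 observations. All numbers are illustrative.
# T[a][s, s'] : transition probability to s' given action a in state s
# O[a][s', o] : observation likelihood of o after action a lands in state s'
T = [np.array([[0.9, 0.1], [0.2, 0.8]]),
     np.array([[0.7, 0.3], [0.3, 0.7]])]
O = [np.array([[0.8, 0.2], [0.3, 0.7]]),
     np.array([[0.6, 0.4], [0.1, 0.9]])]

def belief_update(b, a, o):
    """Bayes-filter update of the belief state b after action a and observation o."""
    predicted = b @ T[a]             # prediction: b'(s') = sum_s b(s) T[a][s, s']
    unnorm = predicted * O[a][:, o]  # correction: weight by observation likelihood
    return unnorm / unnorm.sum()     # normalize to a probability distribution

b0 = np.array([0.5, 0.5])            # uniform prior over the hidden states
b1 = belief_update(b0, a=0, o=1)     # posterior after one sense-observe cycle
```

In a POMDP, this belief vector is itself the (continuous) state on which a policy acts; the large state spaces noted in the abstract make exact planning over beliefs intractable, motivating the approximation methods the paper surveys.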

Keywords

Markov decision process · POMDP · Sensing · Tracking · Scheduling

Copyright information

© Springer Science+Business Media, LLC 2009