Discrete Event Dynamic Systems, Volume 19, Issue 3, pp 377–422

Partially Observable Markov Decision Process Approximations for Adaptive Sensing

  • Edwin K. P. Chong
  • Christopher M. Kreucher
  • Alfred O. Hero III

DOI: 10.1007/s10626-009-0071-x

Cite this article as:
Chong, E.K.P., Kreucher, C.M. & Hero, A.O. Discrete Event Dyn Syst (2009) 19: 377. doi:10.1007/s10626-009-0071-x


Abstract

Adaptive sensing involves actively managing sensor resources to achieve a sensing task, such as object detection, classification, and tracking, and represents a promising direction for new applications of discrete event system methods. We describe an approach to adaptive sensing based on approximately solving a partially observable Markov decision process (POMDP) formulation of the problem. Such approximations are necessary because of the very large state space involved in practical adaptive sensing problems, precluding exact computation of optimal solutions. We review the theory of POMDPs and show how the theory applies to adaptive sensing problems. We then describe a variety of approximation methods, with examples to illustrate their application in adaptive sensing. The examples also demonstrate the gains that are possible from nonmyopic methods relative to myopic methods, and highlight some insights into the dependence of such gains on the sensing resources and environment.
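To make the POMDP machinery mentioned in the abstract concrete, the following is a minimal sketch of a discrete-state belief (Bayes filter) update and a myopic greedy sensor-selection policy of the kind the abstract contrasts with nonmyopic methods. The two-cell target-localization model, the function names (`belief_update`, `myopic_action`), and all probabilities are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def belief_update(b, T, O, a, o):
    """One Bayes-filter step: predict through transition T[a] (S x S),
    then correct by the observation likelihood O[a][:, o]."""
    pred = T[a].T @ b              # predicted state distribution
    post = O[a][:, o] * pred       # weight each state by P(o | s', a)
    return post / post.sum()       # normalize to a probability vector

def myopic_action(b, R):
    """Greedy (myopic) policy: maximize expected immediate reward b . R[a]."""
    return int(np.argmax([b @ R[a] for a in range(R.shape[0])]))

# Illustrative model: a static target in one of 2 cells; action a = "look
# at cell a"; observation o = 1 (detect) with prob 0.9 if the target is in
# the observed cell, false-alarm prob 0.1 otherwise.
T = np.stack([np.eye(2), np.eye(2)])                 # target does not move
O = np.array([[[0.1, 0.9], [0.9, 0.1]],              # O[a][s, o]
              [[0.9, 0.1], [0.1, 0.9]]])
R = np.eye(2)                                        # reward 1 for looking at the target's cell

b = np.array([0.5, 0.5])                             # uniform prior
b = belief_update(b, T, O, a=0, o=1)                 # looked at cell 0, got a detection
print(b)                                             # posterior ≈ [0.9, 0.1]
print(myopic_action(b, R))                           # greedy policy now looks at cell 0
```

A nonmyopic method would instead roll the belief forward over several future actions and observations before committing, which is where the approximation methods surveyed in the paper come in.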


Keywords

Markov decision process · POMDP · Sensing · Tracking · Scheduling

Copyright information

© Springer Science+Business Media, LLC 2009

Authors and Affiliations

  • Edwin K. P. Chong (1)
  • Christopher M. Kreucher (2)
  • Alfred O. Hero III (3)
  1. Colorado State University, Fort Collins, USA
  2. Integrity Applications Incorporated, Ann Arbor, USA
  3. University of Michigan, Ann Arbor, USA
