Discrete Event Dynamic Systems, Volume 19, Issue 3, pp 377–422
Partially Observable Markov Decision Process Approximations for Adaptive Sensing

  • Edwin K. P. Chong, Colorado State University
  • Christopher M. Kreucher, Integrity Applications Incorporated
  • Alfred O. Hero III, University of Michigan



Adaptive sensing involves actively managing sensor resources to achieve a sensing task, such as object detection, classification, and tracking, and represents a promising direction for new applications of discrete event system methods. We describe an approach to adaptive sensing based on approximately solving a partially observable Markov decision process (POMDP) formulation of the problem. Such approximations are necessary because of the very large state space involved in practical adaptive sensing problems, precluding exact computation of optimal solutions. We review the theory of POMDPs and show how the theory applies to adaptive sensing problems. We then describe a variety of approximation methods, with examples to illustrate their application in adaptive sensing. The examples also demonstrate the gains that are possible from nonmyopic methods relative to myopic methods, and highlight some insights into the dependence of such gains on the sensing resources and environment.
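As a rough illustration of the POMDP viewpoint the abstract describes, the sketch below (hypothetical, not the authors' implementation) sets up a tiny adaptive-sensing problem: a target hidden in one of a few cells, a sensor that can interrogate one cell per step, a Bayesian belief-state update, and a myopic policy that picks the observation minimizing expected posterior entropy. The detection and false-alarm probabilities are assumed values chosen only for the example.

```python
# Hypothetical minimal sketch of adaptive sensing as a POMDP (assumed
# parameters; not the paper's implementation). State: the cell containing
# the target. Action: which cell to observe. The belief state is a
# probability vector over cells, updated by Bayes' rule.
import math
import random

P_DETECT = 0.9  # assumed P(detection | sensor pointed at target's cell)
P_FALSE = 0.1   # assumed false-alarm probability for an empty cell


def bayes_update(belief, cell, detected):
    """Posterior belief after observing `cell` with outcome `detected`."""
    likes = []
    for i, b in enumerate(belief):
        p_hit = P_DETECT if i == cell else P_FALSE
        likes.append(b * (p_hit if detected else 1.0 - p_hit))
    z = sum(likes)
    return [l / z for l in likes]


def entropy(belief):
    """Shannon entropy (nats) of the belief state."""
    return -sum(b * math.log(b) for b in belief if b > 0)


def myopic_action(belief):
    """Greedy (myopic) policy: observe the cell that minimizes the
    expected entropy of the posterior belief after one measurement."""
    best, best_h = 0, float("inf")
    for cell in range(len(belief)):
        # Probability of a detection, marginalized over the current belief.
        p_det = sum(b * (P_DETECT if i == cell else P_FALSE)
                    for i, b in enumerate(belief))
        h = (p_det * entropy(bayes_update(belief, cell, True))
             + (1 - p_det) * entropy(bayes_update(belief, cell, False)))
        if h < best_h:
            best, best_h = cell, h
    return best


if __name__ == "__main__":
    random.seed(0)
    target = 1                # true hidden state (unknown to the sensor)
    belief = [0.5, 0.5]       # uniform prior over two cells
    for _ in range(20):
        a = myopic_action(belief)
        detected = random.random() < (P_DETECT if a == target else P_FALSE)
        belief = bayes_update(belief, a, detected)
    print(belief)             # belief should concentrate on the target cell
```

A nonmyopic method of the kind the abstract compares against would replace `myopic_action` with a lookahead (e.g., rollout of a base policy over several future measurements), trading computation for better long-run information gain.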


Keywords: Markov decision process, POMDP, sensing, tracking, scheduling