
Relative Entropy and Error Bounds for Filtering of Markov Processes

  • Published in: Mathematics of Control, Signals and Systems

Abstract.

This paper considers the relative entropy between the conditional distribution and an incorrectly initialized filter for the estimation of one component of a Markov process given observations of the other component. Using the Markov property, we first establish a decomposition of the relative entropy between the measures on observation path space associated with different initial conditions. Using this decomposition, it is shown that the relative entropy of the optimal filter relative to an incorrectly initialized filter is a positive supermartingale. By applying the decomposition to signals observed in additive white noise, a relative entropy bound is obtained on the integrated expected mean-square difference between the optimal and incorrectly initialized estimates of the observation function.
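The supermartingale property described above can be illustrated numerically in the simplest setting: a finite-state hidden Markov chain, where the optimal filter (started from the true initial law) and a misinitialized filter (started from a wrong prior) are run on the same observation sequence, and the relative entropy between their conditional distributions is averaged over many realizations. The sketch below is a minimal Monte Carlo check under purely illustrative assumptions; the two-state transition matrix, emission probabilities, and priors are invented for the demonstration and do not come from the paper, which treats continuous-time signals observed in additive white noise.

```python
import math
import random

random.seed(0)

P = [[0.9, 0.1], [0.1, 0.9]]   # illustrative transition matrix P[i][j]
B = [[0.8, 0.2], [0.2, 0.8]]   # illustrative emission probabilities B[state][obs]
mu0 = [0.8, 0.2]               # true initial law of the signal
nu0 = [0.5, 0.5]               # incorrect prior used by the second filter


def step_filter(pi, obs):
    """One step of the discrete Bayes filter: predict with P, correct with B."""
    pred = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
    post = [pred[j] * B[j][obs] for j in range(2)]
    z = sum(post)
    return [p / z for p in post]


def kl(p, q):
    """Relative entropy D(p || q) for distributions on {0, 1}."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)


T, N = 30, 1000
avg_kl = [0.0] * (T + 1)       # avg_kl[t] approximates E[D(pi_t || pi_tilde_t)]
for _ in range(N):
    x = 0 if random.random() < mu0[0] else 1        # sample the hidden state
    pi, pit = mu0[:], nu0[:]                        # optimal / misinitialized filters
    avg_kl[0] += kl(pi, pit) / N
    for t in range(1, T + 1):
        x = 0 if random.random() < P[x][0] else 1   # advance the chain
        y = 0 if random.random() < B[x][0] else 1   # sample a noisy observation
        pi, pit = step_filter(pi, y), step_filter(pit, y)
        avg_kl[t] += kl(pi, pit) / N
```

Averaged over the trials, `avg_kl` stays nonnegative and decays from its deterministic initial value `D(mu0 || nu0)`, consistent with the expected relative entropy being nonincreasing in time; any single realization need not be monotone, since the supermartingale property holds only in conditional expectation.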


Additional information

Date received: October 6, 1997. Date revised: April 9, 1999.


Cite this article

Clark, J., Ocone, D. & Coumarbatch, C. Relative Entropy and Error Bounds for Filtering of Markov Processes. Math. Control Signals Systems 12, 346–360 (1999). https://doi.org/10.1007/PL00009856
