Applied Mathematics and Optimization

Volume 10, Issue 1, pp 367–377

On a discrete approximation of the Hamilton-Jacobi equation of dynamic programming

  • I. Capuzzo Dolcetta
Article

Abstract

An approximation of the Hamilton-Jacobi-Bellman equation associated with the infinite horizon discounted optimal control problem is proposed. The approximate solutions are shown to converge uniformly to the viscosity solution, in the sense of Crandall-Lions, of the original problem. Moreover, the approximate solutions are interpreted as value functions of a discrete time control problem. This makes it possible to construct, by dynamic programming, a minimizing sequence of piecewise constant controls.
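
The sketch below illustrates the kind of scheme the abstract describes, assuming the standard time-discretization u_h(x) = min_a { (1 − λh) u_h(x + h f(x,a)) + h ℓ(x,a) }, whose fixed point is the value function of a discrete time problem with step h. This is a sketch under that assumption, not necessarily the exact scheme of the paper; the dynamics f, running cost ℓ, grids, and parameter values are illustrative placeholders.

```python
# Illustrative sketch (not taken from the paper): value iteration for the
# time-discretized Bellman equation
#     u_h(x) = min_a [ (1 - lam*h) * u_h(x + h*f(x,a)) + h * l(x,a) ]
# on a bounded 1-D grid, with linear interpolation for off-grid points.
# Dynamics f, running cost l, and all parameters below are made-up examples.
import numpy as np

lam, h = 1.0, 0.05                  # discount rate lambda and time step h
xs = np.linspace(-1.0, 1.0, 201)    # state grid
acts = np.linspace(-1.0, 1.0, 21)   # discretized control set

def f(x, a):                        # example dynamics (placeholder)
    return a

def running_cost(x, a):             # example running cost (placeholder)
    return x**2 + 0.1 * a**2

u = np.zeros_like(xs)               # initial guess for u_h
for _ in range(2000):               # fixed-point iteration
    u_new = np.full_like(u, np.inf)
    for a in acts:
        x_next = np.clip(xs + h * f(xs, a), xs[0], xs[-1])   # Euler step, kept in the grid
        cont = np.interp(x_next, xs, u)                      # u_h at the next state
        u_new = np.minimum(u_new, (1 - lam * h) * cont + h * running_cost(xs, a))
    if np.max(np.abs(u_new - u)) < 1e-10:
        break
    u = u_new

print("approximate value at x = 0:", np.interp(0.0, xs, u))
```

In this discrete formulation the Bellman operator is a sup-norm contraction with factor 1 − λh, which is what makes the iteration converge; reading off, at each state and step, an action attaining the minimum gives a piecewise constant control of the kind mentioned in the abstract.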

Copyright information

© Springer-Verlag New York Inc 1983

Authors and Affiliations

  • I. Capuzzo Dolcetta
    1. Istituto Matematico, Università di Roma, Rome, Italy
    2. Department of Mathematics, University of Maryland, College Park, USA
