Applied Mathematics and Optimization, Volume 10, Issue 1, pp 367–377

On a discrete approximation of the Hamilton-Jacobi equation of dynamic programming

  • I. Capuzzo Dolcetta

DOI: 10.1007/BF01448394

Cite this article as:
Dolcetta, I.C. Appl Math Optim (1983) 10: 367. doi:10.1007/BF01448394


An approximation of the Hamilton-Jacobi-Bellman equation connected with the infinite horizon optimal control problem with discount is proposed. The approximate solutions are shown to converge uniformly to the viscosity solution, in the sense of Crandall-Lions, of the original problem. Moreover, the approximate solutions are interpreted as value functions of a discrete time control problem. This makes it possible to construct, by dynamic programming, a minimizing sequence of piecewise constant controls.
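The discrete-time scheme described in the abstract can be illustrated by value iteration: since the discounted Bellman operator is a contraction, iterating it converges to the approximate value function, and the minimizing control at each state yields a piecewise constant feedback. The following is a minimal one-dimensional sketch; the dynamics, running cost, grid, and parameters are all hypothetical choices for illustration, not the example treated in the paper.

```python
import numpy as np

lam = 1.0               # discount rate (assumed)
h = 0.1                 # time step of the discretization (assumed)
beta = 1.0 - lam * h    # one-step discount factor, 0 < beta < 1

xs = np.linspace(-1.0, 1.0, 201)        # state grid (assumed)
controls = np.array([-1.0, 0.0, 1.0])   # admissible controls (assumed)

def f(x, a):
    # running cost (assumed): quadratic in state and control
    return x**2 + 0.1 * a**2

def b(x, a):
    # controlled dynamics (assumed): pure drift by the control
    return a

v = np.zeros_like(xs)
for _ in range(1000):
    # Bellman update: v(x) = min_a [ h f(x,a) + beta v(x + h b(x,a)) ]
    candidates = []
    for a in controls:
        x_next = np.clip(xs + h * b(xs, a), xs[0], xs[-1])
        v_next = np.interp(x_next, xs, v)   # interpolate off-grid states
        candidates.append(h * f(xs, a) + beta * v_next)
    v_new = np.min(candidates, axis=0)
    if np.max(np.abs(v_new - v)) < 1e-10:   # contraction => convergence
        v = v_new
        break
    v = v_new

# the minimizing control at each grid point defines a piecewise
# constant feedback, as in the construction described in the abstract
policy = controls[np.argmin(candidates, axis=0)]
```

Because the operator contracts with factor `beta`, the error decays geometrically, so a tolerance of 1e-10 is reached in a few hundred iterations here.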

Copyright information

© Springer-Verlag New York Inc 1983

Authors and Affiliations

  • I. Capuzzo Dolcetta (1, 2)
  1. Istituto Matematico, Università di Roma, Rome, Italy
  2. Department of Mathematics, University of Maryland, College Park, USA