Acta Applicandae Mathematica, Volume 1, Issue 1, pp 17–41

On the Hamilton-Jacobi-Bellman equations

  • P. L. Lions

DOI: 10.1007/BF02433840

Cite this article as:
Lions, P.L. Acta Appl Math (1983) 1: 17. doi:10.1007/BF02433840


Abstract

We consider general problems of optimal stochastic control and the associated Hamilton-Jacobi-Bellman equations. We first recall the usual derivation of the Hamilton-Jacobi-Bellman equations from the Dynamic Programming Principle. We then present and explain various results, including (i) continuity results for the optimal cost function, (ii) characterizations of the optimal cost function as the maximum subsolution, (iii) regularity results, and (iv) uniqueness results. We also develop the recent notion of viscosity solutions of Hamilton-Jacobi-Bellman equations.
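For orientation, the equations the abstract refers to take the following standard form; this is a sketch under the usual assumptions of a controlled diffusion with discounted infinite-horizon cost, and the notation below is the common one, not necessarily that of the paper itself.

```latex
% Controlled diffusion (assumed setup):
%   dX_t = b(X_t,\alpha_t)\,dt + \sigma(X_t,\alpha_t)\,dW_t
% Optimal cost function:
%   u(x) = \inf_{\alpha} \mathbb{E}\Big[ \int_0^\infty f(X_t,\alpha_t)\, e^{-\lambda t}\,dt \;\Big|\; X_0 = x \Big]
% Hamilton-Jacobi-Bellman equation, with a_{ij} = \tfrac12 (\sigma\sigma^{T})_{ij}:
\sup_{\alpha \in A} \Big\{ -a_{ij}(x,\alpha)\,\partial_{ij} u(x)
  - b_i(x,\alpha)\,\partial_i u(x) + \lambda\, u(x) - f(x,\alpha) \Big\} = 0
  \quad \text{in } \mathbb{R}^N .
```

The supremum over controls is what makes the equation fully nonlinear and, in the degenerate case, motivates the weak notions (maximum subsolution, viscosity solution) discussed in the article.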


Key words

Optimal stochastic control; diffusion processes; Hamilton-Jacobi-Bellman equations; viscosity solutions; Dynamic Programming Principle

Copyright information

© D. Reidel Publishing Co. 1983

Authors and Affiliations

  • P. L. Lions
    1. Ceremade, Université Paris IX-Dauphine, Paris Cedex 16, France