
On Beneš' bang-bang control problem


Abstract

This paper treats Beneš' bang-bang control problem. We present a proof of the optimality of a specific control function that differs from the proofs known in the literature and appears to be useful for other problems as well. The idea is to discretize the problem, solve the discretized problem by dynamic programming, and then pass to the limit.
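As an illustration of the discretization idea described in the abstract, the following is a minimal numerical sketch, not the authors' construction. It assumes the standard formulation of the problem, dx_t = u_t dt + dW_t with |u_t| ≤ 1 and terminal cost E f(x_T), replaces the diffusion by a Markov chain on a grid, and solves the discretized problem backwards by dynamic programming. All parameter values and the terminal cost f are illustrative choices.

```python
import numpy as np

def solve_discretized(T=1.0, x_max=3.0, h=0.05, f=np.abs):
    """Markov-chain approximation of dx = u dt + dW, |u| <= 1, solved by
    backward dynamic programming for the terminal cost E[f(x_T)]."""
    dt = h * h                             # time step matched to the grid spacing
    n_steps = int(round(T / dt))
    x = np.arange(-x_max, x_max + h / 2, h)
    idx = np.arange(x.size)
    up = np.minimum(idx + 1, x.size - 1)   # clamp at the artificial boundary
    dn = np.maximum(idx - 1, 0)
    V = f(x)                               # terminal condition V(T, x) = f(x)
    policy = np.zeros((n_steps, x.size))
    for k in range(n_steps - 1, -1, -1):
        best = np.full(x.size, np.inf)
        # The one-step value is linear in u, so it suffices to try u = -1, +1.
        for u in (-1.0, 1.0):
            p_up = 0.5 * (1.0 + u * h)     # probability of an up-move under control u
            cand = p_up * V[up] + (1.0 - p_up) * V[dn]
            better = cand < best
            best[better] = cand[better]
            policy[k, better] = u
        V = best
    return x, V, policy

x, V, policy = solve_discretized()
# For the illustrative cost f(x) = |x| the computed policy is u = -sign(x)
# away from the origin, i.e. a bang-bang law of the type studied in the paper.
```

Because the one-step value is linear in the control, the minimum over |u| ≤ 1 is already attained at u = ±1 in the discretized problem, so the bang-bang structure appears at the discrete level before passing to the limit.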




Additional information

Communicated by A. V. Balakrishnan

This work was supported by the Sonderforschungsbereiche 21 and 72, University of Bonn, Bonn, West Germany.



Cite this article

Christopeit, N., Helmes, K. On Beneš' bang-bang control problem. Appl Math Optim 9, 163–176 (1982). https://doi.org/10.1007/BF01460123

