
Dynamic diagnostic and decision procedures under uncertainty

Cybernetics and Systems Analysis (Section: System Analysis)




Additional information

Translated from Kibernetika i Sistemnyi Analiz, No. 3, pp. 87–104, May–June, 1994.


About this article

Cite this article

Baranov, V.V. Dynamic diagnostic and decision procedures under uncertainty. Cybern Syst Anal 30, 387–399 (1994). https://doi.org/10.1007/BF02366473
