Abstract
Two adaptation algorithms based on the empirical Bayes approach are considered for informational (i.e., incompletely observed) non-Bayesian systems. Their convergence rates are calculated and compared with that of the optimal algorithm based on maximum-likelihood estimation. Conditions are established under which these rates coincide, so that the empirical algorithms considered are optimal. For one of the algorithms, the optimality condition can be satisfied if and only if the probability distribution of the observed random quantity admits efficient estimates of the unknown parameters.
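The contrast drawn in the abstract, between a maximum-likelihood estimator and an empirical Bayes one that infers the prior from the observations themselves, can be illustrated with a generic toy model (not the paper's algorithms): normal means with a normal prior of unknown variance. All names and parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: unknown parameters theta_i are drawn from a prior N(0, tau2);
# we observe x_i = theta_i + noise with known noise variance sigma2.
n = 1000
tau2 = 4.0     # true prior variance (unknown to the estimator)
sigma2 = 1.0   # known observation-noise variance
theta = rng.normal(0.0, np.sqrt(tau2), n)
x = theta + rng.normal(0.0, np.sqrt(sigma2), n)

# Maximum-likelihood estimate: each observation estimates its own parameter.
ml_est = x

# Empirical Bayes: estimate the prior variance from the pooled data,
# then apply the resulting (estimated) Bayes shrinkage factor.
tau2_hat = max(x.var() - sigma2, 0.0)     # moment estimate of the prior variance
shrink = tau2_hat / (tau2_hat + sigma2)   # estimated Bayes shrinkage
eb_est = shrink * x

# Compare mean-squared errors against the true parameters.
mse_ml = np.mean((ml_est - theta) ** 2)
mse_eb = np.mean((eb_est - theta) ** 2)
print(f"MSE (maximum likelihood): {mse_ml:.3f}")
print(f"MSE (empirical Bayes):    {mse_eb:.3f}")
```

In this toy setting the empirical Bayes estimator attains nearly the Bayes risk tau2*sigma2/(tau2+sigma2) = 0.8 while the ML risk is sigma2 = 1.0, illustrating the kind of rate comparison the paper carries out rigorously.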
Translated from Izvestiya Vysshikh Uchebnykh Zavedenii, Radiofizika, Vol. 15, No. 3, pp. 462–473, March, 1972.
Grishanin, B.A. Conditions governing the efficiency of adaptation algorithms which are based on the empirical Bayes approach to statistics. Radiophys Quantum Electron 15, 348–356 (1972). https://doi.org/10.1007/BF02210676