Criterion for Minimum of Mean Information Deviation for Distinguishing Random Signals with Similar Characteristics
The problem of distinguishing random signals with similar spectral and correlation characteristics is considered. To solve it, a criterion of the minimum mean divergence of the accepted hypotheses from the true distribution, measured in the Kullback–Leibler information metric, is proposed. Using this criterion, an optimal algorithm is synthesized that provides a guaranteed efficiency gain in discriminating random signals of similar structure. An example of its implementation in automatic speech recognition at the basic, phonetic level of signal processing is considered, and estimates of its effectiveness are obtained. The theoretical estimates are confirmed experimentally using the author's special-purpose information system. Based on these results, recommendations are given for the practical application of the proposed criterion in statistical signal processing problems that require testing close statistical hypotheses.
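As a rough illustration of the decision rule described above (a minimal sketch, not the paper's actual algorithm), the fragment below classifies a signal by choosing the reference hypothesis that minimizes the Kullback–Leibler divergence between power spectral densities. It assumes zero-mean stationary Gaussian models, for which the divergence rate between two processes with spectra S₁ and S₂ takes the standard spectral form D = (1/2N) Σ [S₁/S₂ − ln(S₁/S₂) − 1]; the function names and the Welch-type spectral estimate are illustrative choices, not taken from the article.

```python
import numpy as np

def spectrum(x, nfft=256):
    """Windowed periodogram averaged over half-overlapping segments
    (a simple Welch-style power spectral density estimate)."""
    w = np.hanning(nfft)
    segs = [x[i:i + nfft] * w for i in range(0, len(x) - nfft + 1, nfft // 2)]
    psd = np.mean([np.abs(np.fft.rfft(s)) ** 2 for s in segs], axis=0)
    return psd + 1e-12  # guard against division by zero / log(0)

def kl_divergence(s_obs, s_ref):
    """KL divergence rate between zero-mean stationary Gaussian processes
    with spectra s_obs and s_ref on a common frequency grid:
    D = (1/2N) * sum(S1/S2 - ln(S1/S2) - 1)."""
    r = s_obs / s_ref
    return 0.5 * np.mean(r - np.log(r) - 1.0)

def classify(x, references):
    """Accept the hypothesis whose reference spectrum minimizes the
    mean information divergence from the observed spectrum."""
    s = spectrum(x)
    return min(references, key=lambda k: kl_divergence(s, references[k]))
```

For signals whose spectra differ only slightly, the decision reduces to comparing small divergence values, which is exactly the regime of close statistical hypotheses the article addresses.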