Abstract
We propose a new machine learning concept called Randomized Machine Learning, in which model parameters are assumed random and data are assumed to contain random errors. This approach differs from "classical" machine learning in that optimal estimation deals with the probability density functions of the random parameters and with the "worst-case" probability density of the random data errors. As the optimality criterion of estimation, Randomized Machine Learning employs the generalized information entropy, maximized on a set described by a system of empirical balances. We apply this approach to text classification and dynamic regression problems; the results illustrate the capabilities of the approach.
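The entropy-maximization step described above follows the general maximum-entropy scheme (cf. Jaynes [9]). A schematic version of such a variational problem, where \(\theta\) denotes the random model parameters, \(\Phi\) is a hypothetical empirical-balance map, and \(y\) the observed data (the symbols here are illustrative, not the chapter's own notation), is:

```latex
\max_{p}\; H[p] \;=\; -\int p(\theta)\,\ln p(\theta)\,d\theta
\quad \text{s.t.} \quad
\int p(\theta)\,d\theta = 1,
\qquad
\int \Phi(\theta)\,p(\theta)\,d\theta = y ,
```

whose solution takes the Gibbs (exponential) form \(p^{*}(\theta) \propto \exp\!\bigl(-\lambda^{\top}\Phi(\theta)\bigr)\), with the Lagrange multipliers \(\lambda\) fixed by the empirical-balance constraints.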
This work was partially supported by the Russian Foundation for Basic Research (project no. 16-07-00743).
Notes
- 1.
In particular, option transactions employ the mean values of financial instruments having a power dependence on random parameters [1].
- 2.
This treatment differs from the classical definition of robustness given in [7].
- 3.
The distribution of objects among n classes is reduced to \(C_n^2\) pairwise distributions, each between 2 of the n classes.
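Note 3 describes the standard one-vs-one reduction: an n-class problem is split into \(C_n^2 = n(n-1)/2\) binary subproblems, one per unordered pair of classes. A minimal sketch (the class labels here are placeholders):

```python
from itertools import combinations

# Hypothetical class labels; any n-class problem decomposes the same way.
classes = ["c1", "c2", "c3", "c4"]      # n = 4

# One binary subproblem per unordered pair of classes.
pairs = list(combinations(classes, 2))

n = len(classes)
assert len(pairs) == n * (n - 1) // 2   # C(4, 2) = 6 binary subproblems
```

Each pair's binary classifier is trained only on objects of those two classes; a final label is then obtained, e.g., by majority vote over the pairwise decisions.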
References
Avellaneda, M.: Minimum-relative-entropy calibration of asset-pricing models. Int. J. Theor. Appl. Finance 1(04), 447–472 (1998)
Bishop, C.: Pattern Recognition and Machine Learning (Information Science and Statistics). Springer, New York (2006)
Boucheron, S., Bousquet, O., Lugosi, G.: Theory of classification: a survey of some recent advances. ESAIM: Probab. Stat. 9, 323–375 (2005)
Flach, P.: Machine Learning: The Art and Science of Algorithms that Make Sense of Data. Cambridge University Press (2012)
Friedman, J., Hastie, T., Tibshirani, R.: The Elements of Statistical Learning. Springer Series in Statistics, vol. 1. Springer, Berlin (2001)
Gonzalo, J.A., Muñoz, F.F., Santos, D.J.: Using a rate equations approach to model world population trends. Simulation 89(2), 192–198 (2013)
Huber, P.J.: Robust Statistics. Springer (2011)
Ioffe, A.D., Tikhomirov, V.M.: Teoriya ekstremal’nykh zadach (Theory of Extremal Problems). Nauka, Moscow (1974)
Jaynes, E.T.: Information theory and statistical mechanics. Phys. Rev. 106(4), 620–630 (1957)
Jaynes, E.T.: Probability Theory: The Logic of Science. Cambridge University Press (2003)
Kaashoek, M.A., et al.: Recent Advances in Operator Theory and its Applications: the Israel Gohberg Anniversary Volume, vol. 160. Springer Science & Business Media (2005)
Kapur, J.N.: Maximum-Entropy Models in Science and Engineering. Wiley (1989)
Kolmogorov, A.N., Fomin, S.V.: Elements of the Theory of Functions and Functional Analysis, vol. 1. Courier Corporation (1999)
Kullback, S., Leibler, R.A.: On information and sufficiency. Ann. Math. Stat. 22(1), 79–86 (1951)
Popkov, Y.S.: Macrosystems theory and its applications (equilibrium models). In: Lecture Notes in Control and Information Sciences. Springer (1995)
Popkov, Y.S., Dubnov, Y.A., Popkov, A.Y.: New method of randomized forecasting using entropy-robust estimation: application to the world population prediction. Mathematics 4, 1–16 (2016a)
Popkov, Y.S., Popkov, A.Y., Darkhovsky, B.S.: Parallel Monte Carlo for entropy robust estimation. Math. Models Comput. Simul. 8(1), 27–39 (2016b)
Racine, J.S., Maasoumi, E.: A versatile and robust metric entropy test of time-reversibility, and other hypotheses. J. Econom. 138(2), 547–567 (2007)
Rosenblatt, F.: The Perceptron: A Perceiving and Recognizing Automaton (Project PARA). Cornell Aeronautical Laboratory (1957)
Rubinstein, R.Y., Kroese, D.P.: Simulation and the Monte Carlo Method, vol. 707. Wiley (2011)
Tsypkin, Y.Z.: Osnovy teorii obuchayushchikhsya sistem (Foundations of Theory of Learning Systems). Nauka, Moscow (1970)
Tsypkin, Y.Z., Popkov, Y.S.: Teoriya nelineinykh impul’snykh sistem (Theory of Nonlinear Impulse Systems). Nauka, Moscow (1973)
Vapnik, V.N.: Vosstanovlenie zavisimostei po empiricheskim dannym (Restoration of Dependencies Using Empirical Data). Nauka, Moscow (1979)
Vapnik, V.N., Chervonenkis, A.Y.: Teoriya raspoznavaniya obrazov (Theory of Pattern Recognition). Nauka, Moscow (1974)
Volterra, V.: Theory of Functionals and of Integral and Integro–Differential Equations. Dover Publications (2005)
Witten, I.H., Frank, E.: Data Mining: Practical Machine Learning Tools and Techniques. Morgan Kaufmann (2005)
Aizerman, M.A., Braverman, E.M., Rozonoer, L.I.: Metod potentsial'nykh funktsii v teorii obucheniya mashin (Method of Potential Functions in the Theory of Machine Learning). Nauka, Moscow (1970)
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
Cite this chapter
Popkov, Y.S., Dubnov, Y.A., Popkov, A.Y. (2018). Introduction to the Theory of Randomized Machine Learning. In: Sgurev, V., Piuri, V., Jotsov, V. (eds) Learning Systems: From Theory to Practice. Studies in Computational Intelligence, vol 756. Springer, Cham. https://doi.org/10.1007/978-3-319-75181-8_10
Print ISBN: 978-3-319-75180-1
Online ISBN: 978-3-319-75181-8