Abstract
This paper presents a novel type of artificial neural network, called neural plasma, tailored for classification tasks involving few observations with a large number of variables. Neural plasma learns to adapt its classification confidence by generating artificial training data as a function of its confidence in previous decisions. In contrast to multilayer perceptrons and similar techniques, which are inspired by the topology and operation of biological neural networks, neural plasma is motivated by aspects of high-level behavior and reasoning in the presence of uncertainty. The basic principles of the proposed model apply to other supervised learning algorithms that provide explicit classification confidence values. The empirical evaluation of the technique is based on benchmarking experiments with data sets from biotechnology that are characterized by the small-n-large-p problem. The study presents a comprehensive methodology and is a first step in exploring its different aspects.
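The abstract does not spell out how confidence drives data generation; a minimal sketch of the general idea, under the assumption that low-confidence observations are oversampled with small random perturbations, might look as follows (all function and parameter names here are hypothetical illustrations, not the authors' implementation):

```python
import numpy as np

def generate_confidence_weighted_samples(X, y, confidences, n_new=10,
                                         noise_scale=0.1, rng=None):
    """Generate artificial training points near low-confidence observations.

    Hypothetical illustration: observations the classifier was least
    confident about are sampled more often and jittered with Gaussian
    noise, so subsequent training sees more data where uncertainty
    was highest.
    """
    rng = np.random.default_rng(rng)
    # Sampling probability inversely proportional to confidence.
    weights = 1.0 - np.asarray(confidences, dtype=float)
    weights /= weights.sum()
    idx = rng.choice(len(X), size=n_new, p=weights)
    # Jitter the selected observations; labels are inherited.
    X_new = X[idx] + rng.normal(scale=noise_scale, size=(n_new, X.shape[1]))
    y_new = y[idx]
    return X_new, y_new

# Tiny example: four observations, two variables.
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
y = np.array([0, 0, 1, 1])
conf = np.array([0.95, 0.90, 0.55, 0.60])  # least confident on class 1
X_new, y_new = generate_confidence_weighted_samples(X, y, conf, n_new=20, rng=0)
```

In this sketch most of the 20 artificial points cluster near the two low-confidence observations, which is one plausible reading of "generating artificial training data as a function of confidence in previous decisions"; related noise-injection and synthetic-oversampling techniques (e.g. training with jittered data, SMOTE) are cited by the paper.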
© 2006 International Federation for Information Processing
Berrar, D., Dubitzky, W. (2006). Neural Plasma. In: Bramer, M. (ed.) Artificial Intelligence in Theory and Practice, IFIP AI 2006. IFIP International Federation for Information Processing, vol. 217. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-34747-9_17
Print ISBN: 978-0-387-34654-0
Online ISBN: 978-0-387-34747-9