Regularization by adding redundant features

  • Marina Skurichina
  • Robert P. W. Duin
Feature Selection and Extraction
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1451)

Abstract

The Pseudo Fisher Linear Discriminant (PFLD), based on a pseudo-inverse technique, shows a peaking behaviour of the generalization error for training sample sizes close to the feature size: as the training sample size increases, the generalization error first decreases to a minimum, then increases to a maximum at the point where the training sample size equals the data dimensionality, and afterwards decreases again. A number of ways exist to solve this problem. In this paper it is shown that noise injection by adding redundant features to the data also helps to improve the generalization error of this classifier for critical training sample sizes.
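The idea can be illustrated with a minimal sketch, not the authors' implementation: the PFLD is taken here as the Fisher discriminant computed with the Moore-Penrose pseudo-inverse of the pooled within-class scatter matrix, and noise injection appends purely random, class-uninformative columns to the data before training. The function names, the Gaussian toy data, and the parameter settings (20 redundant features, unit noise scale) are illustrative assumptions, not taken from the paper.

```python
import numpy as np


def pfld_fit(X, y):
    """Pseudo Fisher Linear Discriminant: Fisher discriminant computed with the
    Moore-Penrose pseudo-inverse of the (possibly singular) within-class scatter."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter; singular when the training size is at or below the dimensionality.
    S = np.cov(X0, rowvar=False) * (len(X0) - 1) + np.cov(X1, rowvar=False) * (len(X1) - 1)
    w = np.linalg.pinv(S) @ (m1 - m0)      # pseudo-inverse in place of the ordinary inverse
    b = -w @ (m0 + m1) / 2.0               # threshold halfway between the projected class means
    return w, b


def add_redundant_features(X, n_extra, scale=1.0, seed=None):
    """Noise injection by adding redundant features: append Gaussian noise columns
    that carry no class information but regularize the pseudo-inverse solution."""
    rng = np.random.default_rng(seed)
    return np.hstack([X, scale * rng.standard_normal((X.shape[0], n_extra))])


# Illustrative Gaussian toy data (not the paper's data sets), with the training
# sample size equal to the dimensionality, i.e. at the critical point.
rng = np.random.default_rng(0)
d, n_per_class = 30, 15
X_train = np.vstack([rng.normal(0.0, 1.0, (n_per_class, d)),
                     rng.normal(0.5, 1.0, (n_per_class, d))])
y_train = np.repeat([0, 1], n_per_class)
X_test = np.vstack([rng.normal(0.0, 1.0, (500, d)),
                    rng.normal(0.5, 1.0, (500, d))])
y_test = np.repeat([0, 1], 500)

w, b = pfld_fit(X_train, y_train)
err_plain = np.mean((X_test @ w + b > 0) != y_test)

X_train_r = add_redundant_features(X_train, n_extra=20, seed=1)
X_test_r = add_redundant_features(X_test, n_extra=20, seed=2)
w_r, b_r = pfld_fit(X_train_r, y_train)
err_redundant = np.mean((X_test_r @ w_r + b_r > 0) != y_test)

print(f"PFLD error at the critical size: {err_plain:.3f}; "
      f"with redundant features: {err_redundant:.3f}")
```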

Keywords

Pseudo Fisher linear discriminant · critical sample size · generalization error · peaking behaviour · noise injection

Copyright information

© Springer-Verlag Berlin Heidelberg 1998

Authors and Affiliations

  • Marina Skurichina (1)
  • Robert P. W. Duin (1)
  1. Pattern Recognition Group, Department of Applied Physics, Faculty of Applied Sciences, Delft University of Technology, Delft, The Netherlands