
A New Multidimensional Feature Transformation for Linear Classifiers and Its Applications

  • Conference paper
Machine Learning and Data Mining in Pattern Recognition (MLDM 2005)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3587)

Abstract

In this paper, a new feature transformation method is introduced to decrease the misclassification rate. Linear classifiers generally cannot separate all feature vectors in a high-dimensional feature space, and when the feature vectors from different classes have severely overlapping underlying distributions, classifying them with acceptable performance is harder still. In this case, data reduction or feature transformation typically finds a feature subspace in which the vectors can be better separated; however, it cannot eliminate the misclassifications that result from the overlapping region itself. The proposed feature transformation first increases the dimension of each feature vector by combining it with other feature vectors from the same class, and then applies a typical data-reduction process. The experimental results show that this sequential process significantly improves separability for linear classifiers.
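The abstract describes a two-step procedure: expand each feature vector by combining it with other vectors from the same class, then apply a conventional linear reduction. The paper's exact combination rule is not given here, so the sketch below is only one plausible reading: each vector is concatenated with a randomly chosen same-class vector, and a two-class Fisher discriminant is then computed in the expanded space. The names `expand_features` and `fisher_direction`, the concatenation rule, and the toy Gaussian data are all illustrative assumptions, not the author's method.

```python
import numpy as np

def expand_features(X, y, rng):
    """Concatenate each vector with another randomly chosen vector of the
    same class -- one plausible reading of 'combining other feature
    vectors in the same class' (assumption, not the paper's exact rule)."""
    X_new = np.empty((len(X), 2 * X.shape[1]))
    for i in range(len(X)):
        same = np.flatnonzero(y == y[i])
        j = rng.choice(same[same != i])  # a different sample, same class
        X_new[i] = np.concatenate([X[i], X[j]])
    return X_new

def fisher_direction(X, y):
    """Two-class Fisher discriminant direction: w = Sw^{-1} (m1 - m0)."""
    m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)  # within-class scatter
    # Small ridge term keeps the solve stable in the expanded space.
    return np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), m1 - m0)

rng = np.random.default_rng(0)
# Two overlapping Gaussian classes in 3 dimensions (toy data).
X = np.vstack([rng.normal(0.0, 1.0, (50, 3)), rng.normal(1.0, 1.0, (50, 3))])
y = np.repeat([0, 1], 50)

Xe = expand_features(X, y, rng)        # dimension doubles: 3 -> 6
w = fisher_direction(Xe, y)            # reduce back to 1 dimension
scores = Xe @ w
acc = ((scores > scores.mean()).astype(int) == y).mean()
```

Projecting the expanded vectors onto `w` yields a one-dimensional representation in which the two classes are more cleanly thresholded than in the raw space, which is the kind of separability gain the abstract claims.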






Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Bak, E. (2005). A New Multidimensional Feature Transformation for Linear Classifiers and Its Applications. In: Perner, P., Imiya, A. (eds) Machine Learning and Data Mining in Pattern Recognition. MLDM 2005. Lecture Notes in Computer Science, vol 3587. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11510888_27


  • DOI: https://doi.org/10.1007/11510888_27

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-26923-6

  • Online ISBN: 978-3-540-31891-0

  • eBook Packages: Computer Science (R0)
