Abstract

High-resolution spectroscopy is a powerful industrial tool. The number of features (wavelengths) in these data sets varies from several hundred up to a thousand. Feature selection/extraction algorithms are necessary to handle data of such a large dimensionality. One possible solution is the SVM shaving technique, which was developed for applications to microarray data, which likewise have a very large number of features. The fact that neighboring features (wavelengths) are highly correlated allows one to propose the SVM band-shaving algorithm, which takes into account prior knowledge of the wavelength ordering. SVM band-shaving has lower computational demands than standard SVM shaving and selects features organized into bands. This is preferable due to possible noise reduction and a clearer physical interpretation.
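The paper's exact algorithm is not reproduced here, but the idea described above lends itself to a short sketch: train a linear SVM, score each contiguous band of wavelengths by its total squared weight, and shave off the weakest band, repeating until few enough features remain. The Python sketch below is illustrative, not the authors' implementation: it uses a tiny hinge-loss subgradient trainer as a stand-in for a proper SVM solver, and the names `band_shave`, `band_width`, and `n_keep` are assumptions, not taken from the paper.

```python
import numpy as np


def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1):
    """Tiny hinge-loss subgradient trainer; a stand-in for a real SVM solver."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1  # samples violating the margin
        if viol.any():
            grad_w = lam * w - (y[viol, None] * X[viol]).mean(axis=0)
            grad_b = -y[viol].mean()
        else:
            grad_w, grad_b = lam * w, 0.0
        w -= lr * grad_w
        b -= lr * grad_b
    return w


def band_shave(X, y, band_width=5, n_keep=10):
    """Greedy band-shaving: repeatedly drop the contiguous wavelength band
    whose SVM weights contribute least (smallest total squared weight)."""
    active = np.arange(X.shape[1])  # indices of surviving wavelengths, in order
    while active.size > n_keep:
        w = train_linear_svm(X[:, active], y)
        n_bands = int(np.ceil(active.size / band_width))
        scores = [np.sum(w[i * band_width:(i + 1) * band_width] ** 2)
                  for i in range(n_bands)]
        worst = int(np.argmin(scores))  # least relevant band
        mask = np.ones(active.size, dtype=bool)
        mask[worst * band_width:(worst + 1) * band_width] = False
        active = active[mask]
    return active
```

Because whole bands are removed at each step rather than single wavelengths, far fewer SVM retrainings are needed, which is the source of the lower computational demands mentioned in the abstract.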

Keywords

Feature Selection Method, Band Extraction, Lower Computational Demand, Total Classification Error, High Dimensional Microarray Data


Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Serguei Verzakov 1
  • Pavel Paclík 1
  • Robert P. W. Duin 1

  1. Information and Communication Theory Group, Faculty of Electrical Engineering, Mathematics and Computer Science, Delft University of Technology, Delft, The Netherlands
