
Journal of Intelligent and Robotic Systems, Volume 46, Issue 2, pp 163–180

Comparative Study of Improved Neurolinear Method to Two Other Novel Feature Extraction Techniques

Aistis Raudys
Article

Abstract

This paper compares the Neurolinear feature extraction technique with two other feature extraction techniques. The author developed all three methods recently; in this paper, the Neurolinear method is improved further. The study considers performance, accuracy, and suitability for visualisation. The experimental study is performed on a variety of datasets from several domains, including chemistry, biology, and finance. In total, 19 real and two artificial datasets were used. The comparison is performed in terms of computational efficiency and accuracy. Results show that the Neurolinear method achieves the best accuracy, while the Best Directions method is less accurate but often much faster.

Key words

best directions · binary correlation · comparative study · neurolinear



Copyright information

© Springer Science + Business Media B.V. 2006

Authors and Affiliations

Institute of Mathematics and Informatics, Vilnius, Lithuania
