
Sparse Correlation Kernel Analysis and Evolutionary Algorithm-Based Modeling of the Sensory Activity within the Rat’s Barrel Cortex

  • Mariofanna Milanova
  • Tomasz G. Smolinski
  • Grzegorz M. Boratyn
  • Jacek M. Zurada
  • Andrzej Wrobel
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2388)

Abstract

This paper presents a new paradigm for signal decomposition and reconstruction based on the selection of a sparse set of basis functions. Drawing on recently reported results, we note that this framework is equivalent to approximating the signal with Support Vector Machines. Two algorithms for modeling sensory activity within the barrel cortex of the rat are presented: first, a slightly modified Independent Component Analysis (ICA) algorithm applied to the investigation of Evoked Potentials (EP), and second, an Evolutionary Algorithm (EA) that learns an overcomplete basis of EP components by treating the basis as a probabilistic model of the observed data. The results of experiments conducted with both approaches, as well as a discussion of possible uses of those results, are also provided.
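To make the first of the two approaches concrete, the sketch below decomposes a set of single-trial evoked potentials into independent components and reconstructs a trial from a sparse subset of them. It is a minimal illustration only: the synthetic EP data, the use of scikit-learn's FastICA in place of the authors' modified ICA, and the choice of component counts and reconstruction thresholds are all assumptions, not the method described in the paper.

```python
# Hypothetical sketch: ICA decomposition of single-trial evoked potentials (EPs)
# followed by reconstruction from a sparse subset of components.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Assumed data: 100 single-trial EPs, 256 samples each (trials x samples).
n_trials, n_samples = 100, 256
t = np.linspace(0.0, 0.1, n_samples)                       # 100 ms epoch
early_wave = np.exp(-((t - 0.02) / 0.005) ** 2)            # fast early component
slow_wave = np.sin(2 * np.pi * 20 * t) * np.exp(-t / 0.03) # slower oscillatory component
trials = (rng.normal(1.0, 0.2, (n_trials, 1)) * early_wave
          + rng.normal(0.5, 0.2, (n_trials, 1)) * slow_wave
          + 0.05 * rng.standard_normal((n_trials, n_samples)))

# ICA across trials: each trial is treated as an observed mixture of a small
# number of underlying EP component waveforms.
n_components = 5
ica = FastICA(n_components=n_components, random_state=0)
sources = ica.fit_transform(trials.T)   # (samples, components): component waveforms
mixing = ica.mixing_                    # (trials, components): per-trial weights

def sparse_reconstruct(trial_idx, n_keep=2):
    """Rebuild one trial keeping only its n_keep strongest component weights."""
    weights = mixing[trial_idx]
    keep = np.argsort(np.abs(weights))[::-1][:n_keep]
    mask = np.zeros_like(weights)
    mask[keep] = weights[keep]
    return sources @ mask + ica.mean_[trial_idx]

recon = sparse_reconstruct(0)
err = np.linalg.norm(trials[0] - recon) / np.linalg.norm(trials[0])
print(f"relative error with 2 of {n_components} components: {err:.3f}")
```

In this toy setting two components already capture most of the trial's energy, which is the sense in which the decomposition is "sparse"; the paper's EA-based approach instead learns an overcomplete basis under a probabilistic model rather than relying on ICA's square or undercomplete mixing.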

Keywords

Support Vector Machine, Basis Function, Input Signal, Independent Component Analysis



Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Mariofanna Milanova (1)
  • Tomasz G. Smolinski (2)
  • Grzegorz M. Boratyn (2)
  • Jacek M. Zurada (2)
  • Andrzej Wrobel (3)
  1. Department of Computer Science, University of Arkansas at Little Rock, Little Rock, USA
  2. Computational Intelligence Laboratory, Department of Electrical and Computer Engineering, University of Louisville, Louisville, USA
  3. Laboratory of Visual System, Department of Neurophysiology, Nencki Institute of Experimental Biology, Warsaw, Poland
