Attractor Neural Network Combined with Likelihood Maximization Algorithm for Boolean Factor Analysis

  • Alexander A. Frolov
  • Dušan Húsek
  • Pavel Yu. Polyakov
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7367)


When large data sets are analyzed, it is common practice to seek an appropriate representation of them in a lower-dimensional space. For binary data, Boolean factor analysis is a powerful tool for this task. Here we give a short overview of a new approach to Boolean factor analysis that we have developed as an extension of our previously proposed method, a Hopfield-like attractor neural network with increasing activity. We have greatly enhanced its functionality by complementing the method with maximization of the data set likelihood function, which we define on the basis of the generative data model proposed previously. As a result, the full set of generative model parameters can be obtained. We demonstrate the efficiency of the new method on artificial signals, namely random mixtures of horizontal and vertical bars, a standard benchmark for Boolean factor analysis. We then show that the method can be applied to real tasks by analyzing data from the Kyoto Encyclopedia of Genes and Genomes.
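As a rough illustration (not the authors' code), the bars benchmark mentioned above can be generated directly from a Boolean generative model: each factor is one horizontal or vertical bar on a square grid, Bernoulli factor scores select which bars are active, and an observed pattern is the element-wise OR of the active bars. Grid size and bar probability below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 8          # grid side: patterns are n x n Boolean images
L = 2 * n      # number of factors: n horizontal + n vertical bars
p = 2.0 / L    # probability that a given bar is active in a pattern (assumption)

# Factor loadings: each factor is one bar, flattened to a Boolean vector of length n*n.
factors = np.zeros((L, n * n), dtype=bool)
for i in range(n):
    h = np.zeros((n, n), dtype=bool)
    h[i, :] = True                 # horizontal bar in row i
    v = np.zeros((n, n), dtype=bool)
    v[:, i] = True                 # vertical bar in column i
    factors[i] = h.ravel()
    factors[n + i] = v.ravel()

def generate(m):
    """Draw m Boolean patterns: OR of the bars whose scores are active."""
    scores = rng.random((m, L)) < p                       # Boolean factor scores
    return (scores.astype(int) @ factors.astype(int)) > 0  # integer sum > 0 == OR

X = generate(1000)   # 1000 Boolean patterns of dimension 64
```

Recovering the `factors` matrix and the per-pattern `scores` from `X` alone is exactly the Boolean factor analysis task the paper addresses.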


Keywords: Lyapunov function · factor score · information gain · likelihood maximization algorithm · likelihood maximization procedure





Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Alexander A. Frolov¹
  • Dušan Húsek²
  • Pavel Yu. Polyakov¹ ³
  1. Institute of Higher Nervous Activity and Neurophysiology, Russian Academy of Sciences, Moscow, Russia
  2. Institute of Computer Science, Academy of Sciences of the Czech Republic, Prague 8, Czech Republic
  3. VSB Technical University of Ostrava, Ostrava, Czech Republic
