Algorithms for Obtaining Parsimonious Higher Order Neurons

  • Can Eren Sezener
  • Erhan Oztop
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10613)


Abstract

Most neurons in the central nervous system exhibit all-or-none firing behavior. This makes Boolean Functions (BFs) tractable candidates for representing computations performed by neurons, especially at finer time scales, even though BFs may fail to capture some of the richness of neuronal computations, such as temporal dynamics. One biologically plausible way to realize BFs is to compute a weighted sum of products of inputs and pass it through a Heaviside step function. This representation is called a Higher Order Neuron (HON). A HON can trivially represent any n-variable BF with \(2^n\) product terms, and several algorithms have been proposed for obtaining representations with fewer product terms. In this work, we propose improvements over previous algorithms for obtaining parsimonious HON representations and present numerical comparisons. In particular, we improve the algorithm of Sezener and Oztop [1], drastically reducing its time complexity, and develop a novel hybrid algorithm that combines metaheuristic search with the deterministic algorithm of Oztop [2].
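The trivial \(2^n\)-term construction mentioned above can be sketched in a few lines of Python (an illustrative sketch only; the names and structure are ours, not the authors' code). Every n-variable BF over \(\{-1,1\}^n\) equals its \(2^n\)-term monomial (Fourier) expansion, so thresholding that weighted sum of input products with a step function realizes the function exactly as a HON:

```python
from itertools import chain, combinations, product
from math import prod

def monomial_coefficients(f, n):
    """Exact 2^n-term monomial (Fourier) expansion of a Boolean function
    f: {-1, 1}^n -> {-1, 1}. Returns {subset of variable indices: weight}."""
    points = list(product((-1, 1), repeat=n))
    subsets = chain.from_iterable(combinations(range(n), r) for r in range(n + 1))
    # Each weight is the correlation of f with the monomial prod_{i in S} x_i.
    return {S: sum(f(x) * prod(x[i] for i in S) for x in points) / 2 ** n
            for S in subsets}

def hon(weights):
    """Higher Order Neuron: weighted sum of input products through a step function."""
    def g(x):
        s = sum(w * prod(x[i] for i in S) for S, w in weights.items())
        return 1 if s >= 0 else -1   # Heaviside step, mapped to {-1, 1}
    return g

# 3-variable parity: its expansion has a single nonzero term, x0*x1*x2,
# so a parsimonious HON needs only one product term, not 2^3 = 8.
parity = lambda x: x[0] * x[1] * x[2]
g = hon(monomial_coefficients(parity, 3))
assert all(g(x) == parity(x) for x in product((-1, 1), repeat=3))
```

The parity example illustrates the gap the paper's algorithms target: the trivial representation always uses \(2^n\) terms, while the minimal sign-representation can be far smaller.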


Keywords

Higher Order Neurons (HON) · Finer Time Scale · Deterministic Algorithm · Bent Functions · Density Threshold

These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.


References

  1. Sezener, C.E., Oztop, E.: Minimal sign representation of Boolean functions: algorithms and exact results for low dimensions. Neural Comput. 27(8) (2015)
  2. Oztop, E.: Sign-representation of Boolean functions using a small number of monomials. Neural Netw. 22(7), 938–948 (2009)
  3. Oztop, E.: An upper bound on the minimum number of monomials required to separate dichotomies of \(\{-1, 1\}^n\). Neural Comput. 18(12) (2006)
  4. Amano, K.: New upper bounds on the average PTF density of Boolean functions. In: Cheong, O., Chwa, K.-Y., Park, K. (eds.) ISAAC 2010. LNCS, vol. 6506, pp. 304–315. Springer, Heidelberg (2010). doi:10.1007/978-3-642-17517-6_28
  5. Siu, K.Y., Roychowdhury, V., Kailath, T.: Discrete Neural Computation. Prentice Hall, Englewood Cliffs (1995)
  6. Mel, B.W.: Information processing in dendritic trees. Neural Comput. 6 (1994)
  7. Amaldi, E., Kann, V.: On the approximability of finding maximum feasible subsystems of linear systems. In: Enjalbert, P., Mayr, E.W., Wagner, K.W. (eds.) STACS 1994. LNCS, vol. 775, pp. 521–532. Springer, Heidelberg (1994). doi:10.1007/3-540-57785-8_168
  8. Saks, M.E.: Slicing the hypercube. In: Surveys in Combinatorics, London Mathematical Society Lecture Note Series 187. Cambridge University Press (1993)
  9. O'Donnell, R., Servedio, R.: Extremal properties of polynomial threshold functions. In: Eighteenth Annual Conference on Computational Complexity (2003)

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Bernstein Center for Computational Neuroscience, Berlin, Germany
  2. Ozyegin University, Istanbul, Turkey
