Probabilistic Models Based on the Π-Sigmoid Distribution

  • Anastasios Alivanoglou
  • Aristidis Likas
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5064)


Mixture models constitute a popular type of probabilistic neural network which models the density of a dataset using a convex combination of statistical distributions, with the Gaussian distribution being the one most commonly used. In this work we propose a new probability density function, called the Π-sigmoid, named for its ability to form the shape of the letter “Π” by appropriately combining two sigmoid functions. We demonstrate its modeling properties and the different shapes it can take for particular values of its parameters. We then present the Π-sigmoid mixture model and propose a maximum likelihood method to estimate its parameters using the Generalized Expectation Maximization (GEM) algorithm. We assess the performance of the proposed method on synthetic datasets as well as on image segmentation, and illustrate its superiority over Gaussian mixture models.
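To make the construction concrete, here is a minimal sketch of one plausible Π-shaped density built from two sigmoids, as the abstract describes. The specific parameterization below (a rising sigmoid at edge `a` minus a second sigmoid at edge `b`, sharing a slope `alpha`) is an assumption for illustration only; the paper's exact form may differ. It is a proper density because the difference of the two sigmoids integrates to exactly `b - a`:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def pi_sigmoid_pdf(x, a, b, alpha):
    """Hypothetical Pi-sigmoid density: difference of two sigmoids.

    sigmoid(alpha*(x - a)) - sigmoid(alpha*(x - b)) rises near a,
    plateaus between a and b, and falls near b; its integral over
    the real line is exactly (b - a), so dividing by (b - a)
    yields a normalized density. Large alpha gives steep edges
    (a near-rectangular 'Pi' shape); small alpha gives soft edges.
    """
    return (sigmoid(alpha * (x - a)) - sigmoid(alpha * (x - b))) / (b - a)

# Evaluate on a grid and check the total mass numerically.
x = np.linspace(-10.0, 10.0, 20001)
f = pi_sigmoid_pdf(x, a=-2.0, b=2.0, alpha=5.0)
print(round(np.trapz(f, x), 3))    # total probability mass, ~1.0
print(round(f[len(x) // 2], 3))    # plateau height, ~1/(b - a) = 0.25
```

With steep edges the plateau height approaches the uniform value 1/(b − a), which is why such a component can model box-like clusters that a Gaussian fits poorly.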


Keywords: Probabilistic neural networks, mixture models, Π-sigmoid distribution, orthogonal clustering, image segmentation



Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Anastasios Alivanoglou
  • Aristidis Likas
  1. Department of Computer Science, University of Ioannina, Ioannina, Greece
