Run-Time Performance Analysis of the Mixture of Experts Model

  • Giuliano Armano
  • Nima Hatami
Part of the Advances in Intelligent and Soft Computing book series (AINSC, volume 95)

Abstract

The Mixture of Experts (ME) model is one of the most popular ensemble methods used in pattern recognition and machine learning. Despite many studies on the theory and application of the ME model, to our knowledge its training, testing, and evaluation costs have not yet been investigated. After analyzing the ME model in terms of the number of floating-point operations it requires, this paper experimentally compares the standard ME model with the recently proposed Mixture of Random Prototype Experts. Experiments have been performed on selected datasets from the UCI machine learning repository. The results confirm the expected behavior of the two ME models, while showing that the latter performs better in terms of both accuracy and run-time performance.
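
As a rough illustration of where the floating-point operations come from, the sketch below (Python/NumPy, with hypothetical layer sizes and randomly initialized weights, not the authors' implementation) evaluates a single-gate ME: every expert network is run on each input and the gating network's softmax weights combine their outputs, so the online evaluation cost grows linearly with the number of experts.

```python
# Minimal sketch of a Mixture of Experts forward pass (illustrative only;
# layer sizes and weights are hypothetical, not taken from the paper).
import numpy as np

rng = np.random.default_rng(0)
d, h, c, K = 8, 16, 3, 4  # input dim, hidden units, classes, number of experts

# Each expert is a one-hidden-layer network; the gate is a linear map + softmax.
experts = [(rng.standard_normal((d, h)), rng.standard_normal((h, c)))
           for _ in range(K)]
gate_W = rng.standard_normal((d, K))

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def me_forward(x):
    """Combine expert outputs using the gating network's coefficients."""
    g = softmax(x @ gate_W)                                   # one weight per expert
    outs = [softmax(np.tanh(x @ W1) @ W2) for W1, W2 in experts]
    return sum(gi * oi for gi, oi in zip(g, outs))            # class posterior estimate

x = rng.standard_normal(d)
print(me_forward(x))
```

Because all K experts are evaluated for every test sample, the dominant cost of the matrix-vector products scales with K, which is the kind of run-time behavior the paper quantifies.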

Keywords

Hidden Layer · Expert Model · Online Evaluation · Gating Function · Expert Network


References

  1. Jacobs, R., Jordan, M., Barto, A.: Task decomposition through competition in a modular connectionist architecture: the what and where vision tasks. Tech. rep., University of Massachusetts, Amherst, MA (1991)
  2. Jacobs, R., Jordan, M., Nowlan, S., Hinton, G.: Adaptive mixtures of local experts. Neural Computation 3, 79–87 (1991)
  3. Jordan, M.I., Jacobs, R.A.: Hierarchical mixtures of experts and the EM algorithm. Neural Computation 6, 181–214 (1994)
  4. Murphy, P.M., Aha, D.W.: UCI Repository of Machine Learning Databases. Dept. of Information and Computer Science, Univ. of California, Irvine (1994)
  5. Haykin, S.: Neural Networks: A Comprehensive Foundation, 2nd edn. Prentice-Hall, Englewood Cliffs (1999)
  6. Armano, G., Hatami, N.: Mixture of Random Prototype-Based Local Experts. In: Graña Romay, M., Corchado, E., Garcia Sebastian, M.T. (eds.) HAIS 2010. LNCS (LNAI), vol. 6076, pp. 548–556. Springer, Heidelberg (2010)
  7. Duda, R., Hart, P., Stork, D.: Pattern Classification, 2nd edn. John Wiley & Sons, New York (2001)
  8. Hennessy, J., Patterson, D.: Computer Architecture: A Quantitative Approach. Morgan Kaufmann, San Mateo (1990)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Giuliano Armano¹
  • Nima Hatami¹

  1. DIEE - Department of Electrical and Electronic Engineering, University of Cagliari, Cagliari, Italy