Machine Learning, Volume 42, Issue 1, pp 177–196

Unsupervised Learning by Probabilistic Latent Semantic Analysis

  • Thomas Hofmann

DOI: 10.1023/A:1007617005950

Cite this article as:
Hofmann, T. Unsupervised Learning by Probabilistic Latent Semantic Analysis. Machine Learning 42, 177–196 (2001). doi:10.1023/A:1007617005950


Abstract

This paper presents a novel statistical method for factor analysis of binary and count data which is closely related to a technique known as Latent Semantic Analysis. In contrast to the latter method, which stems from linear algebra and performs a Singular Value Decomposition of co-occurrence tables, the proposed technique uses a generative latent class model to perform a probabilistic mixture decomposition. This results in a more principled approach with a solid foundation in statistical inference. More precisely, we propose to make use of a temperature-controlled version of the Expectation Maximization algorithm for model fitting, which has shown excellent performance in practice. Probabilistic Latent Semantic Analysis has many applications, most prominently in information retrieval, natural language processing, machine learning from text, and related areas. The paper presents perplexity results for different types of text and linguistic data collections and discusses an application in automated document indexing. The experiments indicate substantial and consistent improvements of the probabilistic method over standard Latent Semantic Analysis.
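To make the latent class model concrete: pLSA explains each co-occurrence of a document d and a word w through an unobserved class z, so that P(d, w) = Σ_z P(z) P(d|z) P(w|z). The following is a minimal NumPy sketch of fitting this model with tempered EM, in which the E-step posterior is raised to an inverse temperature β < 1. The function name, default parameters, and fixed-β loop are illustrative assumptions; the paper's annealing schedule, which gradually lowers β with early stopping on held-out data, is omitted here.

```python
import numpy as np

def plsa_tempered_em(counts, num_topics, num_iters=50, beta=0.9, seed=0):
    """Fit a pLSA aspect model to a document-word count matrix with tempered EM.

    counts: (D, W) array of co-occurrence counts n(d, w).
    Returns the estimated P(z), P(d|z), and P(w|z).
    """
    rng = np.random.default_rng(seed)
    D, W = counts.shape
    K = num_topics

    # Random initialization of the multinomial parameters.
    p_z = rng.random(K); p_z /= p_z.sum()
    p_d_given_z = rng.random((K, D)); p_d_given_z /= p_d_given_z.sum(axis=1, keepdims=True)
    p_w_given_z = rng.random((K, W)); p_w_given_z /= p_w_given_z.sum(axis=1, keepdims=True)

    for _ in range(num_iters):
        # E-step: tempered posterior P(z|d,w) proportional to [P(z) P(d|z) P(w|z)]^beta.
        joint = (p_z[:, None, None]
                 * p_d_given_z[:, :, None]
                 * p_w_given_z[:, None, :]) ** beta          # shape (K, D, W)
        posterior = joint / (joint.sum(axis=0, keepdims=True) + 1e-12)

        # M-step: re-estimate the multinomials from expected counts n(d,w) P(z|d,w).
        weighted = posterior * counts[None, :, :]
        totals = weighted.sum(axis=(1, 2)) + 1e-12           # total mass per class z
        p_d_given_z = weighted.sum(axis=2) / totals[:, None]
        p_w_given_z = weighted.sum(axis=1) / totals[:, None]
        p_z = totals / totals.sum()

    return p_z, p_d_given_z, p_w_given_z
```

For β = 1 this reduces to standard EM for the aspect model; lowering β damps the class posteriors, which acts as a regularizer against overfitting the training co-occurrences and is what the held-out perplexity experiments in the paper exploit.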

Keywords: unsupervised learning, latent class models, mixture models, dimension reduction, EM algorithm, information retrieval, natural language processing, language modeling

Copyright information

© Kluwer Academic Publishers 2001

Authors and Affiliations

  • Thomas Hofmann
    1. Department of Computer Science, Brown University, Providence, USA