An Extension of the CHAID Tree-based Segmentation Algorithm to Multiple Dependent Variables

  • Jay Magidson
  • Jeroen K. Vermunt
Part of the Studies in Classification, Data Analysis, and Knowledge Organization book series (STUDIES CLASS)

Abstract

The CHAID algorithm has proven to be an effective approach for obtaining a quick but meaningful segmentation where segments are defined in terms of demographic or other variables that are predictive of a single categorical criterion (dependent) variable. However, response data may contain ratings or purchase history on several products, or, in discrete choice experiments, preferences among alternatives in each of several choice sets. We propose an efficient hybrid methodology combining features of CHAID and latent class modeling (LCM) to build a classification tree that is predictive of multiple criteria. The resulting method provides an alternative to the standard method of profiling latent classes in LCM through the inclusion of (active) covariates.
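
For illustration, the splitting step of such a hybrid can be sketched as follows. This is a minimal, hypothetical sketch and not the authors' algorithm: it assumes class memberships have already been obtained from a fitted latent class model (e.g., modal posterior assignments), that candidate split variables are categorical, and that CHAID details such as category merging are omitted; the function and column names are invented for the example.

```python
# Hypothetical sketch of one CHAID-style split where the "criterion" is a
# latent-class assignment produced beforehand by any LC routine.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

def best_chaid_split(df, class_col, covariates, alpha=0.05):
    """Return the covariate whose cross-tab with the class column is most
    significant under a Bonferroni-adjusted chi-squared test, or None."""
    best = None
    for cov in covariates:
        table = pd.crosstab(df[cov], df[class_col])
        if table.shape[0] < 2 or table.shape[1] < 2:
            continue  # covariate (or class) has a single observed level here
        chi2, p, _, _ = chi2_contingency(table)
        p_adj = min(1.0, p * len(covariates))  # crude Bonferroni adjustment
        if best is None or p_adj < best[1]:
            best = (cov, p_adj, chi2)
    if best is None or best[1] > alpha:
        return None  # no significant split: this node becomes a leaf
    return best

# Toy usage: 'segment' stands in for the modal latent-class assignment.
rng = np.random.default_rng(0)
data = pd.DataFrame({
    "segment": rng.integers(0, 3, size=500),
    "age_group": rng.integers(0, 4, size=500),
    "region": rng.integers(0, 5, size=500),
})
print(best_chaid_split(data, "segment", ["age_group", "region"]))
```

Growing a full tree would apply this split search recursively to each child node; how the CHAID machinery and the latent class criterion interact in that process is the subject of the chapter itself.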

Keywords

Latent Class; Latent Class Analysis; Discrete Choice Experiment; Latent Class Modeling; National Election Study

References

  1. BURNS, N., KINDER, D.R., ROSENSTONE, S.J., SAPIRO, V., and the National Election Studies (2001): National Election Studies, 2000: Pre-/Post-Election Study [dataset id: 2000.T]. Ann Arbor, MI: University of Michigan, Center for Political Studies [producer and distributor].
  2. GOODMAN, L.A. (1974): Exploratory latent structure analysis using both identifiable and unidentifiable models. Biometrika, 61, 215–231.
  3. KASS, G. (1980): An exploratory technique for investigating large quantities of categorical data. Applied Statistics, 29, 119–127.
  4. KIM, S.J. and LEE, K.B. (2003): Constructing decision trees with multiple response variables. International Journal of Management and Decision Making, 4, 289–311.
  5. LAZARSFELD, P.F. and HENRY, N.W. (1968): Latent structure analysis. Houghton Mifflin, Boston.
  6. MAGIDSON, J. (1993): The use of the new ordinal algorithm in CHAID to target profitable segments. The Journal of Database Marketing, 1, 29–48.
  7. MAGIDSON, J. and VERMUNT, J.K. (2001): Latent class factor and cluster models, bi-plots and related graphical displays. Sociological Methodology, 31, 223–264.
  8. VERMUNT, J.K. and MAGIDSON, J. (2001): Latent class analysis with sampling weights. Paper presented at the 6th annual meeting of the Methodology Section of the American Sociological Association, University of Minnesota, May 4–5, 2001.
  9. VERMUNT, J.K. and MAGIDSON, J. (2002): Latent class cluster analysis. In: J.A. Hagenaars and A.L. McCutcheon (Eds.): Applied latent class analysis. Cambridge University Press, Cambridge, 89–106.
  10. VERMUNT, J.K. and MAGIDSON, J. (2005): Latent GOLD 4.0 User Manual. Statistical Innovations Inc., Belmont, MA.

Copyright information

© Springer-Verlag Berlin · Heidelberg 2005

Authors and Affiliations

  • Jay Magidson (Statistical Innovations Inc., Belmont, USA)
  • Jeroen K. Vermunt (Department of Methodology and Statistics, Tilburg University, Tilburg, Netherlands)
