Statistics and Computing, Volume 22, Issue 2, pp 445–454

DINDSCAL: direct INDSCAL

Abstract

The well-known INDSCAL model for simultaneous metric multidimensional scaling (MDS) of three-way data analyzes doubly centered matrices of squared dissimilarities. An alternative approach, DINDSCAL for short, is proposed for analyzing the input matrices of squared dissimilarities directly. An important consequence is that missing values can be handled easily. The DINDSCAL problem is solved by means of the projected gradient approach. First, the problem is transformed into a gradient dynamical system on a product matrix manifold (of a Stiefel sub-manifold of zero-sum matrices and the non-negative diagonal matrices). The constructed dynamical system can be numerically integrated, which gives a globally convergent algorithm for solving the DINDSCAL problem. The DINDSCAL problem and its solution are illustrated by well-known data routinely used in metric MDS and INDSCAL. Alternatively, the problem can also be solved by an iterative algorithm based on the projected conjugate gradient method, whose MATLAB implementation is enclosed as an appendix.
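
For illustration, the two fit criteria can be sketched as follows (notation assumed from the abstract rather than quoted from the paper: $\Delta_i^{(2)}$ is the i-th matrix of squared dissimilarities, $X$ the common configuration, $D_i$ the diagonal matrix of individual saliences, and $J = I_n - \tfrac{1}{n}1_n 1_n^{\top}$ the centering projector). Classical INDSCAL fits the doubly centered matrices,

\[
\min_{X,\,D_i}\ \sum_{i=1}^{m}\Big\| -\tfrac{1}{2}\,J\,\Delta_i^{(2)}\,J \;-\; X D_i X^{\top}\Big\|_F^2 ,
\]

while a DINDSCAL-type fit works with the squared dissimilarities themselves,

\[
\min_{X,\,D_i}\ \sum_{i=1}^{m}\Big\|\,\Delta_i^{(2)} \;-\; \big(\operatorname{diag}(X D_i X^{\top})\,1_n^{\top} + 1_n\,\operatorname{diag}(X D_i X^{\top})^{\top} - 2\,X D_i X^{\top}\big)\Big\|_F^2 ,
\]

subject to $1_n^{\top}X = 0$ and $X^{\top}X = I_p$ (the zero-sum Stiefel constraint) and each $D_i$ diagonal with non-negative entries; missing entries of $\Delta_i^{(2)}$ can then simply be left out of the sum. As a further illustration only (not the MATLAB implementation enclosed with the paper), one projected-gradient step on this product manifold could look as follows, assuming the Euclidean gradients gradX and gradD of the fit are already available and d stores the saliences of one D_i as a vector:

    % Minimal sketch of one projected-gradient step for a DINDSCAL-type fit.
    % Assumptions (not taken from the paper's appendix): gradX and gradD are
    % the Euclidean gradients of the fit at (X, d); d is a vector of saliences.
    function [X, d] = dindscal_step(X, d, gradX, gradD, step)
        n = size(X, 1);
        J = eye(n) - ones(n)/n;            % column-centering projector
        G = J*gradX;                       % respect the zero-sum constraint
        T = G - X*((X'*G + G'*X)/2);       % project onto the Stiefel tangent space
        [Q, R] = qr(J*(X - step*T), 0);    % gradient step followed by QR retraction
        X = Q*diag(sign(diag(R)));         % fix column signs of the Q factor
        d = max(d - step*gradD, 0);        % project saliences onto the non-negative orthant
    end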

Keywords

Three-way data · Metric multidimensional scaling · Missing values · Dynamical system on matrix manifold · Projected conjugate gradient

Copyright information

© Springer Science+Business Media, LLC 2011

Authors and Affiliations

Department of Mathematics and Statistics, The Open University, Milton Keynes, UK
