Spatial Representation of Dissimilarity Data via Lower-Complexity Linear and Nonlinear Mappings

  • Elżbieta Pękalska
  • Robert P. W. Duin
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2396)


Dissimilarity representations are of interest when it is hard to define well-discriminating features for the raw measurements. Such data can be explored using techniques of multidimensional scaling (MDS): given a symmetric dissimilarity matrix, they find a lower-dimensional configuration in which the distances are preserved. Here, the Sammon nonlinear mapping is considered. In general, this iterative method must be recomputed whenever new examples are introduced, and its complexity is quadratic in the number of objects at each iteration step. We therefore consider a simple modification of the nonlinear MDS that allows a significant reduction in complexity, as well as a linear projection of the dissimilarity data. Generalization to new data then becomes possible, which makes the approach suitable for solving classification problems. The linear and nonlinear mappings are then used for both data visualization and classification. Our experiments show that the nonlinear mapping can be preferable for data inspection, while for discrimination purposes a linear mapping can be recommended. Moreover, on the resulting lower-dimensional spatial representation, a more global, linear classifier can be built, which outperforms the local nearest neighbor rule traditionally applied to dissimilarities.
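The classical Sammon mapping referred to above can be sketched as plain gradient descent on the Sammon stress. The code below is a minimal illustration only, not the reduced-complexity modification or the linear projection the paper proposes; the learning rate, iteration count, and function names are our own illustrative choices, and the objects are assumed distinct (no zero off-diagonal dissimilarities).

```python
import numpy as np

def sammon_stress(D, Y):
    """Sammon stress between a dissimilarity matrix D and a configuration Y."""
    iu = np.triu_indices(D.shape[0], 1)
    d = np.sqrt(((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1))[iu]
    Dv = D[iu]
    return ((Dv - d) ** 2 / Dv).sum() / Dv.sum()

def sammon(D, dim=2, n_iter=500, lr=0.1, seed=0):
    """Embed a symmetric dissimilarity matrix D (zero diagonal) into `dim`
    dimensions by gradient descent on the Sammon stress.  Each iteration
    costs O(n^2) in the number of objects, the quadratic per-step
    complexity mentioned in the abstract."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    Y = rng.standard_normal((n, dim)) * 0.01    # small random start
    Dm = D + np.eye(n)                          # dummy diagonal avoids 0/0
    for _ in range(n_iter):
        diff = Y[:, None, :] - Y[None, :, :]           # (n, n, dim)
        d = np.sqrt((diff ** 2).sum(-1)) + np.eye(n)   # embedded distances
        coef = (d - Dm) / (Dm * d)                     # zero on the diagonal
        Y -= lr * (coef[:, :, None] * diff).sum(axis=1) / n
    return Y
```

As the abstract notes, such an iterative configuration offers no direct generalization: adding a new object changes D and the whole optimization must be rerun, which motivates the paper's lower-complexity and linear alternatives.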


Keywords: Spatial Representation · Near Neighbor · Linear Projection · Intrinsic Dimensionality · Near Neighbor Method
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.


References

  1. I. Borg and P. Groenen. Modern Multidimensional Scaling. Springer-Verlag, New York, 1997.
  2. D. Cho and D. J. Miller. A Low-complexity Multidimensional Scaling Method Based on Clustering. Concept paper, 2002.
  3. R. O. Duda, P. E. Hart, and D. G. Stork. Pattern Classification. John Wiley & Sons, Inc., 2nd edition, 2001.
  4. L. Goldfarb. A new approach to pattern recognition. In L. N. Kanal and A. Rosenfeld, editors, Progress in Pattern Recognition, volume 2, pages 241–402. Elsevier Science Publishers B.V., 1985.
  5. W. Greub. Linear Algebra. Springer-Verlag, 1975.
  6. D. W. Jacobs, D. Weinshall, and Y. Gdalyahu. Classification with Non-Metric Distances: Image Retrieval and Class Representation. IEEE Trans. on PAMI, 22(6):583–600, 2000.
  7. A. K. Jain and D. Zongker. Representation and recognition of handwritten digits using deformable templates. IEEE Trans. on PAMI, 19(12):1386–1391, 1997.
  8. M. P. Dubuisson and A. K. Jain. A Modified Hausdorff distance for object matching. In 12th Int. Conf. on Pattern Recognition, volume 1, pages 566–568, 1994.
  9. E. Pękalska, P. Paclík, and R. P. W. Duin. A Generalized Kernel Approach to Dissimilarity Based Classification. J. of Mach. Learn. Research, 2:175–211, 2001.
  10. J. W. Sammon. A nonlinear mapping for data structure analysis. IEEE Transactions on Computers, C-18:401–409, 1969.
  11. C. L. Wilson and M. D. Garris. Handprinted character database 3. Technical report, National Institute of Standards and Technology, February 1992.

Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Elżbieta Pękalska¹
  • Robert P. W. Duin¹

  1. Pattern Recognition Group, Department of Applied Physics, Faculty of Applied Sciences, Delft University of Technology, Delft, The Netherlands