Comparative Analysis Between Embedded-Spaces-Based and Kernel-Based Approaches for Interactive Data Representation

Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 885)


This work presents a comparative analysis between linear combinations of embedded spaces resulting from two approaches: (1) dimensionality reduction (DR) methods in their standard implementations, and (2) their corresponding kernel-based approximations. The DR methods considered are CMDS (Classical Multidimensional Scaling), LE (Laplacian Eigenmaps), and LLE (Locally Linear Embedding). This study aims at determining, through objective criteria, which approach achieves the best DR performance for data visualization. The experimental validation was performed on four databases from the UC Irvine Machine Learning Repository. The quality of the resulting embedded spaces is assessed with the \(R_{NX}(K)\) criterion; the area under its curve indicates the performance of a technique at local and global topology scales. Additionally, we measure the computational cost of every comparative experiment. A main contribution of this work is the discussion on selecting an interactivity model when mixing DR methods, which is a crucial aspect for information visualization purposes.
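As a concrete illustration of the quality criterion named above, the following sketch computes the \(R_{NX}(K)\) curve as defined by Lee and Verleysen: \(Q_{NX}(K)\) is the average fraction of each point's \(K\) nearest high-dimensional neighbors that are preserved among its \(K\) nearest neighbors in the embedding, and \(R_{NX}(K) = \frac{(N-1)\,Q_{NX}(K) - K}{N-1-K}\) rescales it so that 0 is the random baseline and 1 is perfect preservation. This is a minimal NumPy sketch of the criterion only, not the paper's full experimental pipeline; the function name `rnx_curve` is our own.

```python
import numpy as np


def rnx_curve(X_high, X_low):
    """R_NX(K) neighborhood-preservation curve for an embedding.

    X_high: (n, D) original data; X_low: (n, d) embedded data.
    Returns R_NX(K) for K = 1 .. n-2.
    """
    n = X_high.shape[0]

    def neighbor_order(X):
        # Pairwise squared Euclidean distances; exclude self-neighbors.
        d = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        np.fill_diagonal(d, np.inf)
        return np.argsort(d, axis=1)  # each row: indices sorted by proximity

    order_hi = neighbor_order(X_high)
    order_lo = neighbor_order(X_low)

    rnx = np.empty(n - 2)
    for K in range(1, n - 1):
        # Q_NX(K): mean fraction of shared K-neighborhoods over all points.
        q = np.mean([
            len(np.intersect1d(order_hi[i, :K], order_lo[i, :K])) / K
            for i in range(n)
        ])
        # Rescale so a random embedding scores ~0 and a perfect one scores 1.
        rnx[K - 1] = ((n - 1) * q - K) / (n - 1 - K)
    return rnx
```

The area under this curve (often computed with a logarithmic weighting over \(K\)) is the scalar score used to compare embeddings: high values at small \(K\) indicate good local structure preservation, high values at large \(K\) good global preservation.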


Keywords: Artificial intelligence · Dimensionality reduction methods · Kernel · Kernel PCA · CMDS · LLE · LE



This work is supported by the “Smart Data Analysis Systems - SDAS” group, as well as the “Grupo de Investigación en Ingeniería Eléctrica y Electrónica - GIIEE” from Universidad de Nariño. The authors also acknowledge the research project supported by Agreement No. 095 (November 20th, 2014) by VIPRI from Universidad de Nariño.



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Universidad de Nariño, Pasto, Colombia
  2. Universidad Nacional, sede Manizales, Manizales, Colombia
  3. Corporación Universitaria Autónoma de Nariño, Pasto, Colombia
  4. Yachay Tech, Urcuquí, Ecuador
