Generalized Low-Computational Cost Laplacian Eigenmaps

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11314)


Abstract

Dimensionality reduction (DR) is a methodology used in many fields of data processing, where it may serve as a preprocessing stage or be an essential element for the representation and classification of data. The main objective of DR is to obtain a new, lower-dimensional representation of the original data, such that more refined information is produced, the time required for subsequent processing is decreased, and/or visual representations more intelligible to human beings are generated. Spectral DR methods involve an eigenvalue and eigenvector decomposition, which is usually computationally expensive and therefore makes a dynamic and interactive user-machine integration difficult to achieve. Consequently, the design of an interactive information visualization (IV) system based on spectral DR methods requires a strategy to reduce the computational cost of calculating eigenvectors and eigenvalues. To this end, the use of locally linear submatrices and spectral embedding is proposed. This allows natural intelligence to be integrated with computational intelligence for representing data interactively, dynamically, and at low computational cost. Additionally, an interactive model is proposed that allows the user to dynamically visualize the data through a weighted mixture.
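The two building blocks referred to above, a spectral embedding obtained from an eigendecomposition (here, standard Laplacian Eigenmaps) and a weighted mixture of kernel matrices in the spirit of multiple kernel learning, can be sketched as follows. This is only a minimal illustration: the function names and parameters (`laplacian_eigenmaps`, `mixture_embedding`, `k`, `sigma`) are hypothetical, Gaussian kNN affinities are assumed, and the paper's locally linear submatrix strategy for reducing the cost of the eigendecomposition is not reproduced here.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh


def laplacian_eigenmaps(X, n_components=2, k=10, sigma=1.0):
    """Standard Laplacian Eigenmaps embedding (Belkin & Niyogi style)."""
    D = cdist(X, X)
    # Gaussian affinities, then keep only each point's k nearest neighbours
    W = np.exp(-D ** 2 / (2 * sigma ** 2))
    far = np.argsort(D, axis=1)[:, k + 1:]        # indices beyond the k-NN
    for i, cols in enumerate(far):
        W[i, cols] = 0.0
    W = np.maximum(W, W.T)                        # symmetrise the graph
    deg = W.sum(axis=1)
    L = np.diag(deg) - W                          # unnormalised graph Laplacian
    # Generalised eigenproblem L v = lambda Deg v; eigenvalues come back
    # in ascending order, so skip the trivial constant eigenvector.
    _, vecs = eigh(L, np.diag(deg))
    return vecs[:, 1:n_components + 1]


def mixture_embedding(kernels, weights, n_components=2):
    """Embedding from a weighted mixture of (PSD) kernel matrices."""
    K = sum(w * Km for w, Km in zip(weights, kernels))
    vals, vecs = eigh(K)
    order = np.argsort(vals)[::-1][:n_components]  # leading eigenpairs
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))
```

In an interactive setting, the `weights` vector is what the user would manipulate (e.g., via sliders), so only the cheap mixture and eigendecomposition step is recomputed, while the individual kernel matrices are built once.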


Keywords: Dimensionality reduction · Generalized methodology · Kernel approximations · Low-computational cost · Multiple kernel learning · Spectral methods



Acknowledgments

The authors acknowledge the research project “Desarrollo de una metodología de visualización interactiva y eficaz de información en Big Data” (Development of a methodology for interactive and effective information visualization in Big Data), supported by Agreement No. 180 of November 1st, 2016, by VIPRI from Universidad de Nariño. The authors also thank the valuable support given by the SDAS Research Group.



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Universidad Nacional, sede Manizales, Manizales, Colombia
  2. Corporación Universitaria Autónoma de Nariño, Pasto, Colombia
  3. Universidad de Nariño, Pasto, Colombia
  4. SDAS Research Group, Yachay Tech, Urcuquí, Ecuador
