Extreme Learning Machines for VISualization+R: Mastering Visualization with Target Variables
This paper presents an improvement of the Extreme Learning Machines for Visualization (ELMVIS+) nonlinear dimensionality reduction method. The improved method, called ELMVIS+R, applies the originally unsupervised ELMVIS+ method to regression problems, using target values to improve visualization results. Previous work has shown that adding a supervised component for classification problems does yield better visualization results. To verify this assumption for regression problems, a set of experiments was performed on several different datasets. The newly proposed method was compared to ELMVIS+ and, in most cases, outperformed the original algorithm. The results presented in this article support the general idea that using a supervised component (target values) with a nonlinear dimensionality reduction method such as ELMVIS+ can improve both visual properties and overall accuracy.
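The core idea can be illustrated with a small sketch. ELMVIS+ evaluates a candidate 2-D layout by training an ELM to map visualization coordinates back to the data and scoring the reconstruction with cosine similarity; the supervised ELMVIS+R variant, as described above, additionally lets the regression targets influence that score. The sketch below is illustrative only, assuming a simple formulation in which the target is appended as an extra data column; all function names and details are assumptions, not the authors' implementation (see their published code for the real method).

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_reconstruct(V, X, n_hidden=20):
    """Train a random-hidden-layer ELM mapping visualization coords V -> data X,
    and return the reconstructed data (illustrative, not the authors' code)."""
    W = rng.normal(size=(V.shape[1], n_hidden))   # fixed random input weights
    b = rng.normal(size=n_hidden)                 # fixed random biases
    H = np.tanh(V @ W + b)                        # random nonlinear features
    beta, *_ = np.linalg.lstsq(H, X, rcond=None)  # output weights via least squares
    return H @ beta

def cosine_cost(X, Xhat):
    """Mean per-sample cosine similarity between true and reconstructed data."""
    num = np.sum(X * Xhat, axis=1)
    den = np.linalg.norm(X, axis=1) * np.linalg.norm(Xhat, axis=1) + 1e-12
    return float(np.mean(num / den))

# Toy regression data: 50 samples, 5 features, 1 target
X = rng.normal(size=(50, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=50)
V = rng.normal(size=(50, 2))  # one candidate 2-D visualization layout

# Unsupervised score (ELMVIS+-style): reconstruct the features only.
cost_unsup = cosine_cost(X, elm_reconstruct(V, X))

# Supervised score (ELMVIS+R-style, assumed formulation): the target column
# is appended so that layouts preserving the target structure score higher.
Xr = np.hstack([X, y[:, None]])
cost_sup = cosine_cost(Xr, elm_reconstruct(V, Xr))
```

In the full method this cost would be maximized over permutations of the data-to-coordinate assignment; here only a single fixed layout is scored, to show where the target values enter the objective.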
Keywords: Nonlinear regression · Machine learning · Artificial neural networks · Extreme learning machines · Visualization · Nonlinear dimensionality reduction · Cosine similarity
Compliance with Ethical Standards
Conflict of Interest
The authors declare that they have no conflict of interest.
This article does not contain any studies with human participants or animals performed by any of the authors.