Abstract
One dimension reduction (DR) method for data visualization, t-distributed stochastic neighbor embedding (t-SNE), has drawn increasing attention. t-SNE gives better visualizations than conventional DR methods by relieving the so-called crowding problem, a manifestation of the curse of dimensionality caused by the discrepancy between high- and low-dimensional spaces. However, t-SNE assumes that the strength of this discrepancy is the same for all samples and all datasets, regardless of the non-uniformity of the data distribution or differences in dimensionality, and this assumption sometimes ruins the visualization. Here we propose a new DR method, inhomogeneous t-SNE, in which the strength is estimated for each point and each dataset. Experimental results show that such pointwise estimation is important for reasonable visualization and that the proposed method achieves better visualizations than the original t-SNE.
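To make the idea concrete: standard t-SNE models low-dimensional similarities with a Student-t kernel with one degree of freedom for every pair of points, whereas the inhomogeneous variant described in the abstract allows a per-point degrees-of-freedom parameter. The sketch below (NumPy) illustrates such a kernel; the function name `inhomogeneous_q` and the symmetrization of the per-point parameters by averaging are illustrative assumptions, not necessarily the paper's exact formulation. Setting all degrees of freedom to 1 recovers the standard t-SNE kernel.

```python
import numpy as np

def inhomogeneous_q(Y, nu):
    """Low-dimensional similarity matrix Q with a per-point
    degrees-of-freedom parameter nu[i] (hypothetical sketch).
    nu[i] = 1 for all i recovers the standard t-SNE kernel."""
    n = Y.shape[0]
    # Pairwise squared Euclidean distances in the embedding space.
    D = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=2)
    # Symmetrize the per-point parameters by averaging the two
    # endpoints' values (an assumption made for this sketch).
    nu_ij = 0.5 * (nu[:, None] + nu[None, :])
    # Student-t kernel with pairwise degrees of freedom nu_ij.
    W = (1.0 + D / nu_ij) ** (-(nu_ij + 1.0) / 2.0)
    np.fill_diagonal(W, 0.0)   # no self-similarity
    return W / W.sum()         # normalize to a joint distribution

# With nu = 1 everywhere this reduces to q_ij proportional to
# (1 + ||y_i - y_j||^2)^(-1), the kernel of van der Maaten and
# Hinton's original t-SNE.
```

Smaller degrees of freedom give heavier tails, which pushes dissimilar points further apart in the embedding; estimating this strength per point is what the proposed method adds over the original t-SNE.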
Acknowledgements
This work was partially supported by the Grant for Enhancement of International Research from Kobe University, and Grants-in-Aid for Young Scientists (B) [No. 15K16064 (J.K.)] from the MEXT of Japan.
Copyright information
© 2016 Springer International Publishing AG
Cite this paper
Kitazono, J., Grozavu, N., Rogovschi, N., Omori, T., Ozawa, S. (2016). t-Distributed Stochastic Neighbor Embedding with Inhomogeneous Degrees of Freedom. In: Hirose, A., Ozawa, S., Doya, K., Ikeda, K., Lee, M., Liu, D. (eds) Neural Information Processing. ICONIP 2016. Lecture Notes in Computer Science(), vol 9949. Springer, Cham. https://doi.org/10.1007/978-3-319-46675-0_14
Print ISBN: 978-3-319-46674-3
Online ISBN: 978-3-319-46675-0
eBook Packages: Computer Science, Computer Science (R0)