Dimensionality reduction with adaptive graph
Graph-based dimensionality reduction (DR) methods have been applied successfully to many practical problems, such as face recognition, in which graphs play a crucial role in modeling the data distribution or structure. In practice, however, the ideal graph is difficult to discover: one usually constructs a graph empirically according to various motivations, priors, or assumptions, independently of the subsequent computation of the DR mapping. Departing from previous work, in this paper we attempt to learn a graph closely linked with the DR process, and propose an algorithm called dimensionality reduction with adaptive graph (DRAG), whose idea is to simultaneously learn a graph in the neighborhood of a prespecified one while seeking the projection matrix. The prespecified graph is treated as a noisy observation of the ideal one, and the squared Frobenius divergence is used to measure their difference in the objective function. This yields an elegant graph-update formula that naturally fuses information from the original and transformed data: the optimal graph is a weighted sum of the predefined graph in the original space and a new graph depending on the transformed space. Empirical results on several face datasets demonstrate the effectiveness of the proposed algorithm.
Keywords: dimensionality reduction, graph construction, face recognition
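The alternating scheme described in the abstract — fix the graph and solve for a projection, then fix the projection and refresh the graph as a weighted sum of the prior graph and a transformed-space graph — can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact algorithm: the function name `drag_sketch`, the LPP-style eigenproblem, the Gaussian similarity in the transformed space, and the trade-off weight `beta` are all assumptions standing in for the paper's derived update rules.

```python
import numpy as np

def drag_sketch(X, A, dim=2, beta=1.0, n_iter=10):
    """Illustrative alternating scheme (assumed form, not the paper's exact updates).

    X    : (n_samples, n_features) data matrix
    A    : (n, n) prespecified affinity graph (noisy observation of the ideal one)
    beta : assumed trade-off between the prior graph and the transformed-space graph
    """
    S = A.copy()
    for _ in range(n_iter):
        # 1) Fix the graph S and solve an LPP-style projection:
        #    minimize tr(W^T X^T L X W) over orthonormal W, with L = D - S.
        D = np.diag(S.sum(axis=1))
        L = D - S
        M = X.T @ L @ X
        M = (M + M.T) / 2                      # symmetrize against round-off
        eigvals, eigvecs = np.linalg.eigh(M)
        W = eigvecs[:, :dim]                   # eigenvectors of the smallest eigenvalues
        # 2) Fix W and update S as a weighted sum of the prior graph A and a
        #    graph built from pairwise similarities in the transformed space.
        Y = X @ W
        dist2 = np.square(Y[:, None, :] - Y[None, :, :]).sum(-1)
        S_new = np.exp(-dist2 / (dist2.mean() + 1e-12))
        S = (beta * A + S_new) / (1.0 + beta)  # weighted fusion of the two graphs
    return W, S
```

The fusion step mirrors the abstract's claim that the optimal graph is a weighted sum of the predefined graph and a graph depending on the transformed space; in the paper this combination falls out of minimizing the squared Frobenius divergence between the learned graph and the prespecified one, rather than being set by hand.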