Abstract
Modern data analysis is flush with large databases and high-dimensional inference problems, and the size of these problems often stems from variation in the data that is difficult to incorporate into well-known classical methods. One such source of variation is differing data sources, a setting commonly called domain adaptation. Many domain adaptation techniques attempt to remove domain-specific variation from the observations by learning a shared representation. One way to obtain such representations is dimension reduction via a linear projection onto a lower-dimensional subspace of the predictors. Estimating linear projections is intrinsically linked to parameters lying on the Grassmann manifold. We present some historical approaches and relate them to recent advances in domain adaptation that exploit the Grassmannian through subspace estimation.
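As a minimal illustration of the subspace estimation the abstract refers to, the sketch below estimates a lower-dimensional subspace of the predictors by PCA and projects the data onto it. The span of the returned orthonormal basis is a point on the Grassmann manifold Gr(d, p). This is our own toy example, not the chapter's method; all names and choices (PCA, the dimension d) are illustrative assumptions.

```python
import numpy as np

def estimate_subspace(X, d):
    """Estimate a d-dimensional predictor subspace via PCA (illustrative).

    Returns a p x d matrix B with orthonormal columns; span(B) is a point
    on the Grassmann manifold Gr(d, p).
    """
    Xc = X - X.mean(axis=0)                       # center the predictors
    # Rows of Vt are the principal directions (right singular vectors)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:d].T                               # p x d orthonormal basis

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                    # 200 observations, 10 predictors
B = estimate_subspace(X, d=3)                     # basis of a 3-dim subspace
Z = (X - X.mean(axis=0)) @ B                      # reduced predictors, 200 x 3
```

Note that only the subspace span(B), not the particular basis B, is identified: any rotation B Q with Q orthogonal represents the same Grassmannian point, which is why Grassmann geometry, rather than plain matrix estimation, is the natural setting.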
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this chapter
Shaw, D.A., Chellappa, R. (2016). Domain Adaptation Using the Grassmann Manifold. In: Turaga, P., Srivastava, A. (eds) Riemannian Computing in Computer Vision. Springer, Cham. https://doi.org/10.1007/978-3-319-22957-7_15
DOI: https://doi.org/10.1007/978-3-319-22957-7_15
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-22956-0
Online ISBN: 978-3-319-22957-7
eBook Packages: Engineering (R0)