Extended Regression on Manifolds Estimation

  • Conference paper
Conformal and Probabilistic Prediction with Applications (COPA 2016)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9653)

Abstract

Let f(X) be an unknown smooth function which maps p-dimensional manifold-valued inputs X, whose values lie on an unknown input manifold M of lower dimensionality q < p embedded in the ambient high-dimensional space R^p, to m-dimensional outputs. The regression on manifolds problem is to estimate the triple (f(X), Jf(X), M), which includes the Jacobian Jf of the mapping f, from a given sample consisting of ‘input-output’ pairs. If some mapping h transforms the input manifold M into the q-dimensional feature space Yh = h(M) and satisfies certain conditions, the initial estimation problem can be reduced to the regression on feature space problem, which consists in estimating the triple (gf(y), Jg,f(y), Yh), in which the unknown function gf(y) depends on the low-dimensional features y = h(X) and satisfies the condition gf(h(X)) ≈ f(X), and Jg,f is its Jacobian. The paper considers this extended problem and presents a geometrically motivated method for estimating both triples from the given sample.
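
As an illustrative sketch of the reduction described in the abstract (not taken from the paper itself), the composition condition gf(h(X)) ≈ f(X) ties the feature-space triple back to the original one via the chain rule; here J_h denotes the Jacobian of the feature mapping h, a notation introduced only for this sketch:

% Illustrative only: J_h (Jacobian of the feature mapping h) is notation
% introduced for this sketch; all other symbols are as defined in the abstract.
$$
f(X) \approx g_f\bigl(h(X)\bigr), \qquad
J_f(X) \approx J_{g,f}\bigl(h(X)\bigr)\, J_h(X), \qquad
Y_h = h(M).
$$

Dimensionally, J_h(X) is q×p and J_{g,f}(y) is m×q, so their product matches the m×p Jacobian J_f(X) of the original mapping.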

Acknowledgments

The study was performed at the IITP RAS and supported exclusively by a grant from the Russian Science Foundation (project № 14-50-00150).

Author information

Corresponding author

Correspondence to Alexander Bernstein.

Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Kuleshov, A., Bernstein, A. (2016). Extended Regression on Manifolds Estimation. In: Gammerman, A., Luo, Z., Vega, J., Vovk, V. (eds.) Conformal and Probabilistic Prediction with Applications. COPA 2016. Lecture Notes in Computer Science, vol. 9653. Springer, Cham. https://doi.org/10.1007/978-3-319-33395-3_15

  • DOI: https://doi.org/10.1007/978-3-319-33395-3_15

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-33394-6

  • Online ISBN: 978-3-319-33395-3

  • eBook Packages: Computer Science, Computer Science (R0)
