D-MORPH regression: application to modeling with more unknown parameters than observation data
Diffeomorphic modulation under observable response preserving homotopy (D-MORPH) is a model exploration method originally developed for differential equations. We extend D-MORPH to the regression treatment of a model expressed as a linear superposition of basis functions, with the expansion coefficients as the unknown parameters. The goal of D-MORPH regression is to improve prediction accuracy without sacrificing fitting accuracy. When there are more unknown parameters than observation data, the corresponding linear algebraic equation system is generally consistent and has an infinite number of solutions that exactly fit the data. In this case, the solutions given by standard regression techniques can deviate significantly from the true system structure and consequently yield large prediction errors for the model. D-MORPH regression is a practical, systematic means of searching over system structure within the infinite family of possible solutions while preserving fitting accuracy. D-MORPH regression provides an explicit expression relating the data to the expansion coefficients in the linear model; the coefficients it produces are particular linear combinations of those obtained by least-squares regression. The resultant prediction accuracy of D-MORPH regression is shown to be significantly improved in several model illustrations.
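The underdetermined setting described above can be sketched numerically: with fewer observations than coefficients, the minimum-norm least-squares solution fits the data exactly, but so does any solution shifted by a null-space component of the design matrix. D-MORPH regression searches this exactly-fitting family for a better-structured solution. The sketch below (illustrative only, not the D-MORPH algorithm itself; all names are hypothetical) demonstrates the solution family:

```python
# Illustration of an underdetermined linear model y = X @ c with more
# expansion coefficients than observations: infinitely many solutions
# fit the data exactly, differing only by null-space components of X.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_coef = 5, 12                       # fewer observations than unknowns
X = rng.standard_normal((n_obs, n_coef))    # basis-function design matrix
c_true = rng.standard_normal(n_coef)        # "true" expansion coefficients
y = X @ c_true                              # noise-free observation data

# Minimum-norm least-squares solution (what standard regression returns)
c_mn = np.linalg.pinv(X) @ y
assert np.allclose(X @ c_mn, y)             # exact fit of the data

# Adding any null-space vector of X preserves the fit; D-MORPH regression
# moves within this family to seek a solution closer to the true structure.
_, _, Vt = np.linalg.svd(X)                 # full SVD: Vt is (n_coef, n_coef)
null_basis = Vt[n_obs:]                     # (n_coef - n_obs) null directions
c_alt = c_mn + null_basis.T @ rng.standard_normal(n_coef - n_obs)
assert np.allclose(X @ c_alt, y)            # still an exact fit
```

Both `c_mn` and `c_alt` reproduce the data perfectly, yet they are different coefficient vectors and generally give different predictions away from the data — the ambiguity that motivates D-MORPH regression.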
Keywords: D-MORPH · Least-squares regression · Regularization · Ridge regression · Smoothing splines · Orthonormal polynomials