# A Distribution-Free Theory of Nonparametric Regression

Part of the Springer Series in Statistics book series (SSS)


The regression estimation problem has a long history. Already in 1632 Galileo Galilei used a procedure which can be interpreted as fitting a linear relationship to contaminated observed data. Such fitting of a line through a cloud of points is the classical linear regression problem. A solution of this problem is provided by the famous principle of least squares, which was discovered independently by A. M. Legendre and C. F. Gauss and published in 1805 and 1809, respectively. The principle of least squares can also be applied to construct nonparametric regression estimates, where one does not restrict the class of possible relationships, and will be one of the approaches studied in this book. Linear regression analysis, based on the concept of a regression function, was introduced by F. Galton in 1889, while a probabilistic approach in the context of multivariate normal distributions was already given by A. Bravais in 1846. The first nonparametric regression estimate of local averaging type was proposed by J. W. Tukey in 1947. The partitioning regression estimate he introduced, by analogy to the classical partitioning (histogram) density estimate, can be regarded as a special least squares estimate.

Keywords: kernel estimates, martingales, neural networks, probability theory

- DOI https://doi.org/10.1007/b97848
- Copyright Information Springer-Verlag New York 2002
- Publisher Name Springer, New York, NY
- eBook Packages Springer Book Archive
- Print ISBN 978-0-387-95441-7
- Online ISBN 978-0-387-22442-8
- Series Print ISSN 0172-7397