In Chapters 6 and 7 we concentrated on finding groups in data, or, given a grouping, creating a predictive model for new data. The latter situation is “supervised” in the sense that we use a set of examples with known class labels, the training set, to build the model. In this chapter we will do something similar, but now we are predicting not a discrete class label but a continuous variable. Put differently: given a set of independent real-valued variables (matrix X), we want to build a model that allows prediction of Y, consisting of one or possibly more real-valued dependent variables. As in almost all regression settings, we assume that errors, normally distributed with constant variance, are present only in the dependent variables, or at least are so much larger in the dependent variables that errors in the independent variables can be ignored. Of course, we would also like an estimate of the expected error in predictions for future data.
Keywords: Partial Least Squares · Partial Least Squares Regression · Ridge Regression · Partial Least Squares Model · Principal Component Regression