Multivariate Regression

  • Ron Wehrens
Part of the Use R book series (USE R)


In Chapters 6 and 7 we have concentrated on finding groups in data, or, given a grouping, creating a predictive model for new data. The latter situation is “supervised” in the sense that we use a set of examples with known class labels, the training set, to build the model. In this chapter we will do something similar, but now we are not predicting a discrete class property but rather a continuous variable. Put differently: given a set of independent real-valued variables (matrix X), we want to build a model that allows prediction of Y, consisting of one, or possibly more, real-valued dependent variables. As in almost all regression settings, we assume that errors, normally distributed with constant variance, are present only in the dependent variables, or at least are so much larger in the dependent variables that errors in the independent variables can be ignored. Of course, we would also like an estimate of the expected error in predictions for future data.


Partial Least Squares · Partial Least Squares Regression · Ridge Regression · Partial Least Squares Model · Principal Component Regression
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  1. Research and Innovation Centre, Fondazione Edmund Mach, San Michele all’Adige, Italy
