Introduction

  • Calyampudi Radhakrishna Rao
  • Helge Toutenburg
Part of the Springer Series in Statistics book series (SSS)

Abstract

Linear models play a central role in modern statistical methods. On the one hand, these models can approximate a large class of metric data structures over their entire range of definition, or at least piecewise. On the other hand, approaches such as the analysis of variance, which model effects as linear deviations from a total mean, have proved their flexibility. The theory of generalized linear models enables us, through appropriate link functions, to handle error structures that deviate from the normal distribution while ensuring that a linear model is maintained in principle. Numerous iterative procedures for solving the normal equations have been developed, especially for those cases where no explicit solution is possible. For deriving explicit solutions in rank-deficient linear models, classical procedures are available: for example, ridge or principal component regression, partial least squares, as well as the methodology of the generalized inverse. The problem of missing data in the variables can be dealt with by appropriate imputation procedures.
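As a minimal numerical sketch (in Python with NumPy, not taken from the book), the snippet below illustrates two of the classical procedures named above for a rank-deficient design matrix: the minimum-norm least-squares solution obtained via the Moore-Penrose generalized inverse, and the ridge estimator. All data, variable names, and the ridge constant k are illustrative assumptions.

```python
import numpy as np

# Sketch (not from the text): the normal equations X'X b = X'y have no
# unique solution when X is rank-deficient. Two classical remedies:
# (i) the generalized (Moore-Penrose) inverse, (ii) ridge regression.

rng = np.random.default_rng(0)
n, p = 50, 4
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 1] + X[:, 2]           # exact collinearity -> rank(X) = 3
beta_true = np.array([1.0, 2.0, -1.0, 0.0])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# (i) Minimum-norm least-squares solution via the generalized inverse:
#     b = X^+ y, equivalently (X'X)^+ X'y.
b_ginv = np.linalg.pinv(X) @ y

# (ii) Ridge regression: b(k) = (X'X + k I)^{-1} X'y; the constant k > 0
#      restores invertibility at the cost of some bias.
k = 0.1
b_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

print("generalized-inverse solution:", np.round(b_ginv, 3))
print("ridge solution (k = 0.1):    ", np.round(b_ridge, 3))
```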

Keywords

Explicit solution, Generalized inverse, Principal component regression, Loglinear model, Projection pursuit

Copyright information

© Springer Science+Business Media New York 1995

Authors and Affiliations

  • Calyampudi Radhakrishna Rao (1)
  • Helge Toutenburg (2)
  1. Department of Statistics, The Pennsylvania State University, University Park, USA
  2. Institut für Statistik, Universität München, München, Germany