Abstract
In the analysis of experiments, many variable selection algorithms exist for linear models. Most of these approaches select the best model based on criteria such as AIC, which do not allow for any relationship between predictors. In practice, however, the analysis is guided by three principles: effect hierarchy, effect sparsity, and effect heredity. An approach that depends solely on such criteria ignores these principles and often selects models that are hard to interpret, for instance models consisting only of interaction terms. In this article, we extend the LASSO method to identify significant interaction terms, focusing mainly on the heredity principle, and we compare the proposed method with the ordinary LASSO and a traditional variable selection approach. As examples, we analyze data obtained from designed experiments such as a Plackett-Burman design and a supersaturated design.
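The chapter itself is not reproduced here, but the two-stage idea the abstract describes, a LASSO fit over main effects and interactions followed by a screen that enforces the heredity principle, can be sketched roughly as follows. This is a minimal illustration, not the authors' actual estimator: `lasso_cd` and `heredity_filter` are hypothetical helper names, the coordinate-descent solver is a textbook implementation, and strong heredity is applied here as a simple post-hoc filter rather than built into the penalty.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Plain LASSO via cyclic coordinate descent.

    Assumes the columns of X are on a comparable scale (e.g. the +/-1
    coding of a two-level factorial or Plackett-Burman design).
    """
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with coordinate j removed, then soft-threshold.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

def heredity_filter(beta, inter_pairs):
    """Post-hoc strong-heredity screen.

    inter_pairs maps the column index of each interaction term to the
    column indices of its two parent main effects.  An interaction is
    kept only if BOTH parents have nonzero coefficients (strong
    heredity); otherwise its coefficient is zeroed out.
    """
    beta = beta.copy()
    for k, (i, j) in inter_pairs.items():
        if abs(beta[i]) < 1e-8 or abs(beta[j]) < 1e-8:
            beta[k] = 0.0
    return beta

# Toy demo: on an orthonormal design the LASSO reduces to soft-thresholding.
X = np.eye(4)
y = np.array([3.0, 0.5, -2.0, 0.0])
print(lasso_cd(X, y, lam=1.0))  # coefficients shrink toward [2, 0, -1, 0]
```

Weak heredity (at least one parent active) would change only the condition in `heredity_filter` from `or` to `and`; the chapter's comparison of heredity variants is more involved than this sketch suggests.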
© 2012 Springer-Verlag Berlin Heidelberg
Cite this chapter
Noguchi, H., Ojima, Y., Yasui, S. (2012). A Practical Variable Selection for Linear Models. In: Lenz, HJ., Schmid, W., Wilrich, PT. (eds) Frontiers in Statistical Quality Control 10. Frontiers in Statistical Quality Control, vol 10. Physica, Heidelberg. https://doi.org/10.1007/978-3-7908-2846-7_23
Publisher Name: Physica, Heidelberg
Print ISBN: 978-3-7908-2845-0
Online ISBN: 978-3-7908-2846-7
eBook Packages: Mathematics and Statistics (R0)