Lasso-type and Heuristic Strategies in Model Selection and Forecasting

  • Ivan Savin
  • Peter Winker
Part of the Studies in Fuzziness and Soft Computing book series (STUDFUZZ, volume 285)


Several approaches for subset recovery and improved forecasting accuracy have been proposed and studied. One is to apply a regularization strategy and solve the model selection task as a continuous optimization problem; among the most popular methods in this field are Lasso-type estimators. An alternative approach is based on information criteria. In contrast to the Lasso, these methods also work well when predictors are highly correlated. However, their performance can be impaired by the fact that information criteria are only asymptotically consistent. Moreover, the resulting discrete optimization problems exhibit high computational complexity, so a heuristic optimization approach (a Genetic Algorithm) is applied. The two strategies are compared by means of a Monte Carlo simulation study, together with an empirical application to leading business cycle indicators in Russia and Germany.
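The discrete, information-criterion-based strategy can be sketched as follows. This is an illustrative toy implementation, not the authors' code: a binary-encoded genetic algorithm (tournament selection, uniform crossover, bit-flip mutation) searches for the predictor subset minimizing the BIC of an OLS fit. All function names, parameter settings, and the simulated data are assumptions made for the example.

```python
import numpy as np

def bic(X, y, subset):
    """BIC of an OLS fit using only the predictors flagged True in `subset`."""
    n = len(y)
    if not subset.any():
        rss, k = np.sum((y - y.mean()) ** 2), 0
    else:
        Xs = X[:, subset]
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rss, k = np.sum((y - Xs @ beta) ** 2), int(subset.sum())
    return n * np.log(rss / n) + k * np.log(n)

def genetic_subset_selection(X, y, pop_size=40, generations=60,
                             mut_rate=0.05, seed=0):
    """Minimize BIC over predictor subsets with a simple genetic algorithm."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    pop = rng.random((pop_size, p)) < 0.5          # random binary chromosomes
    fitness = np.array([bic(X, y, c) for c in pop])
    best_c, best_f = pop[fitness.argmin()].copy(), fitness.min()
    for _ in range(generations):
        children = np.empty_like(pop)
        for k in range(pop_size):
            # tournament selection: the fitter of two random individuals, twice
            i, j = rng.integers(pop_size, size=2)
            a = pop[i] if fitness[i] < fitness[j] else pop[j]
            i, j = rng.integers(pop_size, size=2)
            b = pop[i] if fitness[i] < fitness[j] else pop[j]
            mask = rng.random(p) < 0.5             # uniform crossover
            child = np.where(mask, a, b)
            child ^= rng.random(p) < mut_rate      # bit-flip mutation
            children[k] = child
        pop = children
        fitness = np.array([bic(X, y, c) for c in pop])
        if fitness.min() < best_f:                 # elitism: remember best so far
            best_c, best_f = pop[fitness.argmin()].copy(), fitness.min()
    return best_c

# Simulated example: 8 candidate predictors, of which only 0, 2 and 5 matter.
rng = np.random.default_rng(42)
n, p = 200, 8
X = rng.standard_normal((n, p))
beta = np.array([3.0, 0, 1.5, 0, 0, 2.0, 0, 0])
y = X @ beta + rng.standard_normal(n)
selected = genetic_subset_selection(X, y)
print("selected predictors:", np.flatnonzero(selected))
```

With a search space of only 2^8 subsets, the GA here is effectively exhaustive; its point is that the same machinery scales to predictor counts where enumerating all subsets is infeasible.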


Keywords: False Negative Rate · True Positive Rate · Heuristic Strategy · Adaptive Lasso · Leading Indicator





Copyright information

© Springer-Verlag GmbH Berlin Heidelberg 2013

Authors and Affiliations

  1. DFG Research Training Program 'The Economics of Innovative Change', Friedrich Schiller University Jena and Max Planck Institute of Economics, Jena, Germany
  2. Justus Liebig University Giessen, Giessen, Germany
  3. Centre for European Economic Research, Mannheim, Germany
