Advances in Data Analysis and Classification, Volume 13, Issue 3, pp 621–639

Investigating consumers’ store-choice behavior via hierarchical variable selection

  • Toshiki Sato
  • Yuichi Takano (corresponding author)
  • Takanobu Nakahara
Regular Article

Abstract

This paper is concerned with a store-choice model for investigating consumers’ store-choice behavior based on scanner panel data. Our store-choice model enables us to evaluate the effects of the consumer/product attributes not only on the consumer’s store choice but also on his/her purchase quantity. Moreover, we adopt a mixed-integer optimization (MIO) approach to selecting the best set of explanatory variables with which to construct the store-choice model. We devise two MIO models for hierarchical variable selection in which the hierarchical structure of product categories is used to enhance the reliability and computational efficiency of the variable selection. We assess the effectiveness of our MIO models through computational experiments on actual scanner panel data. These experiments are focused on the consumer’s choice among three types of stores in Japan: convenience stores, drugstores, and (grocery) supermarkets. The computational results demonstrate that our method has several advantages over the common methods for variable selection, namely, the stepwise method and \(L_1\)-regularized regression. Furthermore, our analysis reveals that convenience stores are most strongly chosen for gift cards and garbage disposal permits, drugstores are most strongly chosen for products that are specific to drugstores, and supermarkets are most strongly chosen for health food products by women with families.
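The hierarchical variable selection described above can be sketched in miniature. The following is an illustrative brute-force example, not the authors' MIO formulation: it enumerates candidate predictor subsets, discards those that violate a parent–child hierarchy among the variables (a child may enter the model only if its parent category variable does), and scores the remaining subsets with an AIC-style criterion. The toy data, the hierarchy, and all names here are assumptions for illustration only.

```python
# Hierarchical best-subset selection by exhaustive search (illustrative
# sketch; the paper uses mixed-integer optimization instead).
from itertools import combinations

import numpy as np

rng = np.random.default_rng(0)

# Toy data: 5 candidate predictors. Variable 0 plays the role of a parent
# product category; variables 1 and 2 are its children (hypothetical).
n, p = 100, 5
X = rng.normal(size=(n, p))
y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=n)

parent = {1: 0, 2: 0}  # child index -> parent index

def respects_hierarchy(subset):
    """A subset is admissible only if every selected child's parent is selected."""
    s = set(subset)
    return all(parent[j] in s for j in s if j in parent)

def rss(subset):
    """Residual sum of squares of the least-squares fit on the chosen columns."""
    Xs = X[:, list(subset)]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    r = y - Xs @ beta
    return float(r @ r)

best = None
for k in range(1, p + 1):
    for subset in combinations(range(p), k):
        if not respects_hierarchy(subset):
            continue  # prune subsets that break the category hierarchy
        score = n * np.log(rss(subset) / n) + 2 * k  # AIC-style criterion
        if best is None or score < best[0]:
            best = (score, subset)

print(best[1])  # the selected subset; variables 0 and 1 should be included
```

The hierarchy check is what the paper's MIO models express as linear constraints on binary selection variables; here it is simply a filter on the enumerated subsets, which is only feasible for small p.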

Keywords

Store choice · Variable selection · Mixed-integer optimization · Multiple regression analysis · Scanner panel data

Mathematics Subject Classification

62-07 Data analysis 

Acknowledgements

This work was partially supported by JSPS KAKENHI Grant Numbers JP15K17146, JP17K12983 and a Grant-in-Aid of Joint Research from the Institute of Information Science, Senshu University.

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2018

Authors and Affiliations

  • Toshiki Sato (1)
  • Yuichi Takano (2), corresponding author
  • Takanobu Nakahara (3)

  1. Graduate School of Systems and Information Engineering, University of Tsukuba, Tsukuba-shi, Japan
  2. Faculty of Engineering, Information and Systems, University of Tsukuba, Tsukuba-shi, Japan
  3. School of Commerce, Senshu University, Kawasaki-shi, Japan