
Paradox of choice and sharing personal information

Abstract

The purpose of this study is to investigate the relationship between a firm’s strategy and consumers’ decisions in the presence of the paradox of choice and the sharing of personal information. The paradox of choice implies that having too many choices does not necessarily ensure happiness; sometimes less is more. A new model is constructed by introducing an information-sharing factor into the model of a previous study that embedded the paradox of choice only (Kinjo and Ebina in AI Soc 30(2):291–297, 2015). A key feature of the model is its disutility function. It is demonstrated that if the sign of the cross derivative of this function is positive (negative) at the optimum, there is a positive (negative) correlation between the degree of sharing personal information chosen by consumers and the number of products offered by the firm in its recommendation system. Numerical analysis further indicates that the firm’s profit function becomes convex or concave depending on the shape of the disutility function. These results suggest that firms should carefully investigate the shape of the disutility function under the paradox of choice and the sharing of personal information.
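The comparative-statics logic behind the cross-derivative result can be sketched with the implicit function theorem. The notation below (an objective \(X(m,n)\) maximized by the consumer in the degree of sharing \(m\), given the number of products \(n\)) is assumed for illustration, following the symbols used in the footnotes, and is not an exact reproduction of the model; in particular, the paper states the condition in terms of the disutility function, whose sign convention may flip the statement.

```latex
% Sketch under assumed notation: an interior optimum m^*(n) satisfies the
% first-order condition X_1(m^*(n), n) = 0. Differentiating in n gives
\[
\frac{dm^{*}}{dn}
  = -\frac{X_{12}\bigl(m^{*},n\bigr)}{X_{11}\bigl(m^{*},n\bigr)} .
\]
% With the second-order condition X_{11} < 0 at an interior maximum,
% sign(dm^*/dn) = sign(X_{12}): a positive (negative) cross derivative at
% the optimum yields a positive (negative) correlation between m and n.
```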



Data availability

None.

Code availability

None.

Notes

  1.

    The phenomenon has been referred to variously as the too-much-choice effect (Iyengar and Lepper 2000), choice overload (Chernev 2003), information overload (Van Zandt 2004), and hyperchoice. A large body of similar research exists: work on information overload, for example, is scattered across computer and information science, marketing, law, psychology, and economics (Van Zandt 2004, p. 544). Hence, references are cited from artificial intelligence, information systems, economics, and marketing to match the purpose explained below.

  2.

    Other fields are economics (Acquisti et al. 2016); management (Rust et al. 2002; Hann et al. 2008; Smith et al. 2011); psychology (Alge et al. 2006; Norberg and Horne 2007); law (Solove 2006); and information systems (Belanger and Crossler 2011; Pavlou 2011; Li 2012).

  3.

    An e-commerce company such as Amazon offers advertisements in its recommendation systems (e.g., https://advertising.amazon.com/) (20/08/2021).

  4.

    By formal approaches (e.g., mathematical and/or algorithmic approaches), many studies have addressed various related problems in AI (e.g. Ebina and Kinjo 2019; Naudé and Dimitri 2020), such as autonomous vehicles (e.g. Ebina and Kinjo 2021) and recommendation systems (e.g. Ikegami et al. 2020).

  5.

    Two previous studies have motivations similar to ours, though neither considers the paradox of choice. Focusing on consumers’ utility, Rust et al. (2002) considered a situation in which personal information is traded in a market among firms and consumers must pay costs to protect their information; the study analyzes both consumers’ and firms’ behavior. A point in common with our model is that consumers have an ideal point with respect to the degree of sharing personal information (\(m^{*}\in(0,1)\) in the setting explained in the next section). Unlike in this study, consumers there purchase their personal information from firms. As a result, if firms can freely trade consumers’ personal information in the market, consumers’ utility decreases.

    Using a game-theoretic approach, Hann et al. (2008) analyzed firms’ and consumers’ behavior in a setting with two heterogeneous consumer types, who decide whether to conceal their personal information or deflect marketing, and many firms competing in the market. The study notes that previous analytical research had mostly ignored the harm that marketing imposes on consumers: advertising and direct marketing (e.g., by direct mail, telephone, fax, and electronic channels) impose inconvenience and other harms, which are introduced into the model. Hann et al. (2008) showed that a consumer who discloses their information can obtain useful information about the product but also receives useless information or solicitation. Further, from a social viewpoint, the firms’ solicitation is excessive relative to the social optimum.

  6.

    We consider a one-dimensional model of consumer preference with respect to \(m\) for mathematical tractability. By using a generalized projection pursuit regression or a logistic regression, several potential dimensions (e.g., age, sex, address) can be projected into one variable (Lingjaerde and Liestøl 1998).
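The projection idea in this note can be illustrated with a small, self-contained sketch: a logistic regression, fit here by plain gradient descent on synthetic data, collapses several consumer attributes into a single score in \((0,1)\). The attribute names, data, and coefficients below are hypothetical and for illustration only, not taken from the paper or from Lingjaerde and Liestøl (1998).

```python
import numpy as np

# Hypothetical illustration: project three consumer attributes
# (say, standardized age, sex, region) into one variable in (0, 1)
# via a logistic model fit by gradient descent.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                 # synthetic attribute matrix
true_w = np.array([1.0, -2.0, 0.5])           # assumed "true" weights
y = (X @ true_w > 0).astype(float)            # synthetic binary labels

w = np.zeros(3)
for _ in range(500):                          # gradient descent on logistic loss
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / len(y)

# One-dimensional score per consumer, analogous to m in the model.
m = 1.0 / (1.0 + np.exp(-X @ w))
print(m.shape)
```

The fitted score `m` orders consumers along a single dimension, which is the sense in which several attribute dimensions can be treated as one variable for tractability.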

  7.

    One can extend this simultaneous setting to a sequential one: after consumers choose their degree of sharing personal information, the platform observes it and then chooses the number of products to offer. In the sequential setting, the platform can access more information about consumers, for example, purchase or search histories. We show that if the optimal values of \(m\) and \(n\) are independent, the fundamental property does not change in the new setting. Hence, the simultaneous setting is analyzed in the current study.

  8.

    Several studies in marketing and advertising introduced a benefit/cost function of sharing consumers’ personal information (e.g. Hann et al. 2008).

  9.

    We do not treat the privacy paradox, which holds that decision-making about personal information has irrational aspects (Kokolakis 2017): while people claim to be concerned about their privacy, they do little to protect their personal data on social media. By setting \(\tilde{m}>m^{*}\) in our model, one can easily derive equilibrium behavior under the privacy paradox.

  10.

    For example, see the following website: https://www.amazon.com/gp/help/customer/display.html?nodeId=GZD6R5LJWKAZDYHP (20/08/2021).

  11.

    Note that the disutility function is called the cost function in the previous study (Kinjo and Ebina 2015).

  12.

    Some people might hesitate to disclose their personal information because doing so attracts unwanted information. We also mention this point in Remark 2. We thank an anonymous referee for bringing this viewpoint to our attention.

  13.

    Following the previous study, a positive (negative) cross derivative is defined as strategic complementarity (substitutability). Therefore, this proposition states that if the disutility function exhibits strategic complementarity (substitutability) at the optimum, there is a positive (negative, respectively) correlation between the degree of sharing personal information and the number of products offered to the consumer.

  14.

    Setting \((a-P)=1\) and \(b=-1\) in Eq. (1) of Rust et al. (2002), their utility function corresponds to ours without the calculation cost, that is, with \(c(n)=0\). Rust et al. (2002) mainly focused on the price charged per unit of privacy, whereas this study focuses on the degree of sharing personal information and the number of products offered by a firm.

  15.

    Note that the strategic complementarity also holds when \(a=0\), because \({X}_{12}\left({m}^{*},{n}^{*}\right)<0\) holds.

References

  1. Ackerman MS, Cranor LF, Reagle J (1999) Privacy in e-commerce: examining user scenarios and privacy preferences. In: Proceedings of the 1st ACM conference on electronic commerce, ACM, pp 1–8. https://doi.org/10.1145/336992.336995

  2. Acquisti A, Taylor C, Wagman L (2016) The economics of privacy. J Econ Lit 54(2):442–492. https://doi.org/10.1257/jel.54.2.442


  3. Alge BJ, Ballinger GA, Tangirala S, Oakley JL (2006) Information privacy in organizations: empowering creative and extrarole performance. J Appl Psychol 91(1):221. https://doi.org/10.1037/0021-9010.91.1.221


  4. Allenby GM, Rossi PE (2006) Hierarchical Bayes models. The handbook of marketing research: uses, misuses, and future advances, pp 418–440

  5. Belanger F, Crossler RE (2011) Privacy in the digital age: a review of information privacy research in information systems. MIS Q. https://doi.org/10.2307/41409971


  6. Bollen D, Knijnenburg BP, Willemsen MC, Graus M (2010) Understanding choice overload in recommender systems. In: Proceedings of the fourth ACM conference on recommender systems, pp 63–70. https://doi.org/10.1007/s11257-016-9178-6

  7. Chernev A (2003) When more is less and less is more: the role of ideal point availability and assortment in consumer choice. J Consum Res 30(2):170–183. https://doi.org/10.1086/376808


  8. Distler V, Lallemand C, Koenig V (2020) How acceptable is this? How user experience factors can broaden our understanding of the acceptance of privacy trade-offs. Comput Hum Behav 106:106227. https://doi.org/10.1016/j.chb.2019.106227


  9. Dwork C (2006) Differential privacy. In: International colloquium on automata, languages, and programming. Springer, Berlin, pp 1–12. https://doi.org/10.1007/11787006_1

  10. Ebina T, Kinjo K (2019) Consumer confusion from price competition and excessive product attributes under the curse of dimensionality. AI Soc 34(3):615–624. https://doi.org/10.1007/s00146-017-0771-y


  11. Ebina T, Kinjo K (2021) Approaching the social dilemma of autonomous vehicles with a general social welfare function. Eng Appl Artif Intell 104:104390. https://doi.org/10.1016/j.engappai.2021.104390


  12. Hann IH, Hui KL, Lee SYT, Png IP (2008) Consumer privacy and marketing avoidance: a static model. Manag Sci 54(6):1094–1103. https://doi.org/10.1287/mnsc.1070.0837


  13. Huang W, Liu B, Tang H (2019) Privacy protection for recommendation system: a survey. J Phys Conf Ser 1325(1):012087. https://doi.org/10.1088/1742-6596/1325/1/012087


  14. Ikegami K, Okumura K, Yoshikawa T (2020) A simple, fast, and safe mediator for congestion management. In: Proceedings of the AAAI conference on artificial intelligence, vol 34, no 02, pp 2030–2037. https://doi.org/10.1609/aaai.v34i02.5575

  15. Iyengar SS, Lepper MR (2000) When choice is demotivating: can one desire too much of a good thing? J Pers Soc Psychol 79(6):995–1006. https://doi.org/10.1037/0022-3514.79.6.995


  16. Jannach D, Jugovac M (2019) Measuring the business value of recommender systems. ACM Trans Manag Inf Syst TMIS 10(4):1–23. https://doi.org/10.1145/3370082


  17. Kinjo K, Ebina T (2015) Paradox of choice and consumer nonpurchase behavior. AI Soc 30(2):291–297. https://doi.org/10.1007/s00146-014-0546-7


  18. Kinjo K, Ebina T (2021) Applying the peak-end rule to decision-making regarding similar products: a case-based decision approach. Expert Syst. https://doi.org/10.1111/exsy.12763


  19. Kokolakis S (2017) Privacy attitudes and privacy behaviour: a review of current research on the privacy paradox phenomenon. Comput Secur 64:122–134. https://doi.org/10.1016/j.cose.2015.07.002


  20. Krasnova H, Günther O, Spiekermann S, Koroleva K (2009) Privacy concerns and identity in online social networks. Identity Inf Soc 2(1):39–63. https://doi.org/10.1007/s12394-009-0019-1


  21. Ku YC, Peng CH, Yang YC (2014) Consumer preferences for the interface of e-commerce product recommendation system. In: Nah FFH (eds) HCI in business. HCIB 2014. Lect Notes Comput Sci. Springer, Cham, vol 8527, pp 526–537. https://doi.org/10.1007/978-3-319-07293-7_51

  22. Li Y (2012) Theories in online information privacy research: a critical review and an integrated framework. Decis Support Syst 54(1):471–481. https://doi.org/10.1016/j.dss.2012.06.010


  23. Liang TP, Lai HJ, Ku YC (2006) Personalized content recommendation and user satisfaction: theoretical synthesis and empirical findings. J Manag Inf Syst 23(3):45–70. https://doi.org/10.2753/MIS0742-1222230303


  24. Lingjaerde OC, Liestøl K (1998) Generalized projection pursuit regression. SIAM J Sci Comput 20(3):844–857. https://doi.org/10.1137/S1064827595296574


  25. Lleras JS, Masatlioglu Y, Nakajima D, Ozbay EY (2017) When more is less: limited consideration. J Econ Theory 170:70–85. https://doi.org/10.1016/j.jet.2017.04.004


  26. Martínez-López FJ, Rodríguez-Ardura I, Gázquez-Abad JC, Sánchez-Franco MJ, Cabal CC (2010) Psychological elements explaining the consumer’s adoption and use of a website recommendation system: a theoretical framework proposal. Internet Res. https://doi.org/10.1108/10662241011050731


  27. Nagar K, Gandotra P (2016) Exploring choice overload, internet shopping anxiety, variety seeking and online shopping adoption relationship: evidence from online fashion stores. Glob Bus Rev 17(4):851–869. https://doi.org/10.1177/0972150916645682


  28. Naous D, Legner C (2019) Understanding users’ preferences for privacy and security features–a conjoint analysis of cloud storage services. In: International conference on business information systems. Springer, Cham, pp 352–365. https://doi.org/10.1007/978-3-030-36691-9_30

  29. Naudé W, Dimitri N (2020) The race for an artificial general intelligence: implications for public policy. AI Soc 35(2):367–379. https://doi.org/10.1007/s00146-019-00887-x


  30. Nesvit K (2019) The computational approach for recommendation system based on tagging data. J Adv Math 16:8359–8367. https://doi.org/10.24297/jam.v16i0.8212


  31. Nikou S, Bouwman H, de Reuver M (2014) A consumer perspective on mobile service platforms: a conjoint analysis approach. Commun Assoc Inf Syst 34(1):82. https://doi.org/10.17705/1CAIS.03482


  32. Norberg PA, Horne DR (2007) Privacy attitudes and privacy-related behavior. Psychol Mark 24(10):829–847. https://doi.org/10.1002/mar.20186


  33. Ortoleva P (2013) The price of flexibility: towards a theory of thinking aversion. J Econ Theory 148(3):903–934. https://doi.org/10.1016/j.jet.2012.10.009


  34. Oulasvirta A, Hukkinen JP, Schwartz B (2009) When more is less: the paradox of choice in search engine use. In: Proceedings of the 32nd international ACM SIGIR conference on research and development in information retrieval, ACM, pp 516–523. https://doi.org/10.1145/1571941.1572030

  35. Pavlou PA (2011) State of the information privacy literature: where are we now and where should we go? MIS Q. https://doi.org/10.2307/41409969


  36. Rust RT, Kannan PK, Peng N (2002) The customer economics of internet privacy. J Acad Mark Sci 30(4):455–464. https://doi.org/10.1177/009207002236917


  37. Saito Y, Aihara S, Matsutani M, Narita Y (2020) Open bandit dataset and pipeline: towards realistic and reproducible off-policy evaluation. arXiv preprint arXiv:2008.07146

  38. Scheibehenne B, Greifeneder R, Todd PM (2010) Can there ever be too many options? A meta-analytic review of choice overload. J Consum Res 37(3):409–425. https://doi.org/10.1086/651235


  39. Schwartz B (2004) The paradox of choice: why more is less. Ecco, New York


  40. Shah AM, Wolford G (2007) Buying behavior as a function of parametric variation of number of choices. Psychol Sci 18(5):369–370. https://doi.org/10.1111/j.1467-9280.2007.01906.x


  41. Smith HJ, Dinev T, Xu H (2011) Information privacy research: an interdisciplinary review. MIS Q. https://doi.org/10.2307/41409970


  42. Solove DJ (2006) A taxonomy of privacy. Univ PA Law Rev 154(3):477–560. https://doi.org/10.2307/40041279


  43. Sweeney L (2002) k-anonymity: a model for protecting privacy. Int J Uncertain Fuzziness Knowl Based Syst 10(05):557–570. https://doi.org/10.1142/S0218488502001648


  44. Valdez AC, Ziefle M (2019) The users’ perspective on the privacy-utility trade-offs in health recommender systems. Int J Hum Comput Stud 121:108–121. https://doi.org/10.1016/j.ijhcs.2018.04.003


  45. Van Zandt T (2004) Information overload in a network of targeted communication. Rand J Econ 35(3):542–560. https://doi.org/10.2307/1593707


  46. Vesanen J (2007) What is personalization? A conceptual framework. Eur J Mark 41(5/6):409–418. https://doi.org/10.1108/03090560710737534


  47. Wottrich VM, van Reijmersdal EA, Smit EG (2018) The privacy trade-off for mobile app downloads: the roles of app value, intrusiveness, and privacy concerns. Decis Support Syst 106:44–52. https://doi.org/10.1016/j.dss.2017.12.003


  48. Yin C, Shi L, Sun R, Wang J (2020) Improved collaborative filtering recommendation algorithm based on differential privacy protection. J Supercomput 76(7):5161–5174. https://doi.org/10.1007/s11227-019-02751-7



Acknowledgements

Ebina acknowledges the financial support from JSPS KAKENHI Grant-in-Aid for Scientific Research (C) JP18K01627 and (C) JP21K01468. Kinjo acknowledges financial support from JSPS KAKENHI Grant-in-Aid for Scientific Research (C) JP20K02004.

Funding

The first author acknowledges the financial support from JSPS KAKENHI Grant-in-Aid for Scientific Research (C) JP18K01627 and (C) JP21K01468. The second author acknowledges financial support from JSPS KAKENHI Grant-in-Aid for Scientific Research (C) JP20K02004.

Author information


Contributions

The authorship of the paper is as follows: TE (corresponding author): methodology, formal analysis, writing original draft, project administration, writing review and editing, visualization, and funding acquisition. KK: conceptualization, formal analysis, administration, funding acquisition.

Corresponding author

Correspondence to Takeshi Ebina.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Ebina, T., Kinjo, K. Paradox of choice and sharing personal information. AI & Soc (2021). https://doi.org/10.1007/s00146-021-01291-0


Keywords

  • Paradox of choice
  • Choice overload
  • Burden of information
  • Information sharing