
Opening the ‘Black Box’: An Overview of Methods to Investigate the Decision-Making Process in Choice-Based Surveys

  • Dan Rigby
  • Caroline Vass
  • Katherine Payne
Review Article

Abstract

The desire to understand the preferences of patients, healthcare professionals and the public continues to grow, and health valuation studies, often in the form of discrete choice experiments (a choice-based survey approach), have proliferated as a result. A variety of pre-choice process analysis methods have been developed to investigate how and why people make their decisions in such experiments and surveys: how they acquire and process information and arrive at a choice. These techniques offer the potential to test and improve theories of choice and/or the associated empirical models. This paper provides an overview of such methods, focusing on their use in stated choice-based healthcare studies. The methods reviewed are eye tracking, mouse tracing, brain imaging, deliberation time analysis and think aloud. For each method, we summarise the rationale, implementation, type of results generated and associated challenges, along with a discussion of possible future developments.
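As a concrete illustration of one of the reviewed techniques, the sketch below shows how deliberation (response) time analysis might be applied to choice-survey data: per-task response latencies are computed from page timestamps and unusually fast responses are flagged as potentially inattentive. This is a minimal, hypothetical example; the data layout, column names and the 2-second cut-off are assumptions for illustration and are not taken from the paper.

    # Minimal sketch of deliberation-time analysis for a choice-based survey.
    # Assumes each row records when a choice task was shown and when it was answered;
    # column names and the fast-response threshold are illustrative assumptions.
    import pandas as pd

    tasks = pd.DataFrame({
        "respondent": [1, 1, 2, 2],
        "task":       [1, 2, 1, 2],
        "shown_at":    pd.to_datetime(["2019-01-01 10:00:00", "2019-01-01 10:00:25",
                                       "2019-01-01 11:00:00", "2019-01-01 11:00:03"]),
        "answered_at": pd.to_datetime(["2019-01-01 10:00:20", "2019-01-01 10:00:55",
                                       "2019-01-01 11:00:02", "2019-01-01 11:00:04"]),
    })

    # Deliberation time per choice task, in seconds.
    tasks["deliberation_s"] = (tasks["answered_at"] - tasks["shown_at"]).dt.total_seconds()

    # Flag implausibly fast responses (here, under 2 seconds) for sensitivity analysis.
    FAST_CUTOFF_S = 2.0
    tasks["too_fast"] = tasks["deliberation_s"] < FAST_CUTOFF_S

    print(tasks[["respondent", "task", "deliberation_s", "too_fast"]])
    print("Mean deliberation time per respondent (s):")
    print(tasks.groupby("respondent")["deliberation_s"].mean())

In practice, such latency measures are usually carried forward into the choice model itself, for example by relating response time to choice consistency or error variance; the sketch above covers only the data-preparation step.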

Notes

Author contributions

The nature and scope of the paper were developed by DR, CV and KP. CV led the search and drafting process for the eye-tracking, mouse-tracing and think-aloud sections; DR led this work for the deliberation time section; and KP and DR led this work for the brain imaging section. All authors critically reviewed the final draft of the manuscript.

Compliance with Ethical Standards

Funding

Caroline M. Vass and Katherine Payne were supported in the preparation and submission of this article by the Mind the Risk international network collaboration, funded by the Swedish Foundation for Humanities and Social Sciences. The views and opinions expressed are those of the authors, and not necessarily those of other Mind the Risk members or the Swedish Foundation for Humanities and Social Sciences.

Conflicts of interest

Dan Rigby, Caroline Vass and Katherine Payne have no conflicts of interest that are relevant to the content of this article.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Economics, School of Social Sciences, The University of Manchester, Manchester, UK
  2. Division of Population Health, Health Services Research and Primary Care, Manchester Centre for Health Economics, The University of Manchester, Manchester, UK
