Opening the ‘Black Box’: An Overview of Methods to Investigate the Decision-Making Process in Choice-Based Surveys

  • Review Article
The Patient - Patient-Centered Outcomes Research

Abstract

The desire to understand the preferences of patients, healthcare professionals and the public continues to grow. As a result, health valuation studies, often in the form of discrete choice experiments (a choice-based survey approach), have proliferated. A variety of pre-choice process analysis methods have been developed to investigate how and why people make their decisions in such surveys: how they acquire and process information and arrive at a choice. These techniques offer the potential to test and improve theories of choice and their associated empirical models. This paper provides an overview of such methods, focusing on their use in stated choice-based healthcare studies. The methods reviewed are eye tracking, mouse tracing, brain imaging, deliberation time analysis and think aloud. For each method, we summarise the rationale, implementation, type of results generated and associated challenges, along with a discussion of possible future developments.


Notes

  1. Attribute non-attendance (ANA) occurs when variations in an attribute’s levels do not affect choices. This may be because the attribute, or the levels it takes in the survey, are irrelevant or because a simplifying heuristic is being implemented (for example, to reduce task complexity) [96]. The former may lead to deterministic decision making [61]; the latter violates the continuity of preferences axiom [97].
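The note above can be illustrated with a minimal sketch. The attribute names, levels and weights below are hypothetical, and a simple linear additive utility with a deterministic choice rule stands in for a full choice model: a respondent who does not attend to an attribute (here, a zero weight on 'cost') makes choices that are invariant to that attribute's levels, whereas an attentive respondent's choice can flip.

```python
# Hypothetical sketch of attribute non-attendance (ANA) in a two-option
# choice task: the non-attending respondent's choice ignores 'cost'.

def utility(option, weights):
    """Linear additive utility over attribute levels."""
    return sum(weights[attr] * level for attr, level in option.items())

def choose(option_a, option_b, weights):
    """Deterministic choice: pick the option with the higher utility."""
    return "A" if utility(option_a, weights) >= utility(option_b, weights) else "B"

# Weights for an attentive respondent vs. one not attending to 'cost' (ANA)
attentive = {"effectiveness": 2.0, "side_effects": -1.5, "cost": -1.0}
non_attending = {"effectiveness": 2.0, "side_effects": -1.5, "cost": 0.0}

a = {"effectiveness": 3, "side_effects": 1, "cost": 5}
b = {"effectiveness": 2, "side_effects": 0, "cost": 1}

# Vary option A's 'cost' level: the attentive respondent's choice can change,
# but the non-attending respondent always chooses the same option.
for cost in (0, 5, 10):
    a_varied = dict(a, cost=cost)
    print(cost, choose(a_varied, b, attentive), choose(a_varied, b, non_attending))
```

Under these assumed weights the attentive respondent switches from A to B as A's cost rises, while the non-attending respondent picks A at every cost level, which is exactly the "variations in an attribute's levels do not affect choices" pattern the note describes.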

References

  1. de Bekker-Grob EW, Ryan M, Gerard K. Discrete choice experiments in health economics: a review of the literature. Health Econ. 2012;21:145–72.


  2. Clark M, Determann D, Petrou S, et al. Discrete choice experiments in health economics: a review of the literature. Pharmacoeconomics. 2014;32:883–902.


  3. Schulte-Mecklenbeck M, Johnson JG, Böckenholt U, et al. Process-tracing methods in decision making: on growing up in the 70s. Curr Dir Psychol Sci. 2017;26:442–50.


  4. Schlosser RW, Wendt O, Bhavnani S, et al. Use of information-seeking strategies for developing systematic reviews and engaging in evidence-based practice: the application of traditional and comprehensive Pearl Growing. A review. Int J Lang Commun Disord. 2006;41:567–82.


  5. Hinde S, Spackman E. Bidirectional citation searching to completion: an exploration of literature searching methods. Pharmacoeconomics. 2014;33:5–11.


  6. Louviere JJ, Flynn TN, Marley AAJ. Best–worst scaling: theory, methods and applications. Cambridge: Cambridge University Press; 2015.


  7. Bialkova S, van Trijp HCM. An efficient methodology for assessing attention to and effect of nutrition information displayed front-of-pack. Food Qual Prefer. 2011;22:592–601.


  8. Duchowski AT. A breadth-first survey of eye-tracking applications. Behav Res Methods Instruments Comput. 2002;34:455–70.


  9. Kowler E, Anderson E, Dosher B, et al. The role of attention in the programming of saccades. Vision Res. 1995;35:1897–916.


  10. van Beers RJ. The sources of variability in saccadic eye movements. J Neurosci. 2007;27:8757–70.


  11. Just M, Carpenter P. A theory of reading: from eye fixations to comprehension. Psychol Rev. 1980;87:329–54.


  12. Orquin JL, Mueller Loose S. Attention and choice: a review on eye movements in decision making. Acta Psychol (Amst). 2013;144:190–206.


  13. Holmqvist K, Nyström M, Andersson R, et al. Eye tracking: a comprehensive guide to methods and measures. Oxford: Oxford University Press; 2011.


  14. Raney GE, Campbell SJ, Bovee JC. Using eye movements to evaluate the cognitive processes involved in text comprehension. J Vis Exp. 2014;83:1–7.


  15. Rayner K. Visual attention in reading: eye movements. Mem Cognit. 1977;5:443–8.


  16. Krucien N, Ryan M, Hermens F. Visual attention in multi-attributes choices: what can eye-tracking tell us? J Econ Behav Organ. 2017;135:251–67.


  17. Ryan M, Krucien N, Hermens F. The eyes have it: using eye tracking to inform information processing strategies in multi-attributes choices. Health Econ. 2018;27:709–21.


  18. Spinks J, Mortimer D. Lost in the crowd? Using eye-tracking to investigate the effect of complexity on attribute non-attendance in discrete choice experiments. BMC Med Inform Decis Mak. 2016;16:14.


  19. Vass C, Rigby D, Tate K, et al. An exploratory application of eye-tracking methods in a discrete choice experiment. Med Decis Mak. 2018;38:658–72.


  20. Chavez D, Palma M, Collart A. Eye tracking to model attribute attendance. San Antonio: Southern Agricultural Economics Association; 2016.


  21. Chen Y, Caputo V, Nayga RM, et al. How visual attention affects choice outcomes: an eyetracking study. In: 3rd International Winter Conference on Brain–Computer Interface, BCI 2015; 2015.

  22. Erdem S, McCarthy J. The effect of front-of-pack nutrition labelling formats on consumers’ food choices and decision-making: merging discrete choice experiment with an eye tracking experiment. Boston: Agricultural and Applied Economics Association; 2016.


  23. Van Loo EJ, Caputo V, Nayga RM, et al. Sustainability labels on coffee: consumer preferences, willingness-to-pay and visual attention to attributes. Ecol Econ. 2015;118:215–25.


  24. Balcombe K, Fraser I, McSorley E. Visual attention and attribute attendance in multi-attribute choice experiments. J Appl Econom. 2014;30:1–27.


  25. Grebitus C, Seitz C. Relationship between attention and choice. Naples: European Association of Agricultural Economists; 2014. p. 1–13.


  26. Uggeldahl K, Jacobsen C, Lundhede TH, et al. Choice certainty in discrete choice experiments: will eye tracking provide useful measures? J Choice Model. 2016;20:35–48.


  27. Meißner M, Musalem A, Huber J. Eye tracking reveals processes that enable conjoint choices to become increasingly efficient with practice. J Mark Res. 2016;53:1–17.


  28. Oviedo JL, Caparrós A. Information and visual attention in contingent valuation and choice modeling: field and eye-tracking experiments applied to reforestations in Spain. J For Econ. 2015;21:185–204.


  29. Rihn A, Khachatryan H, Campbell B, et al. Consumer preferences for organic production methods and origin promotions on ornamental plants: evidence from eye-tracking experiments. Agric Econ. 2016;47:599–608.


  30. Khushaba RN, Wise C, Kodagoda S, et al. Consumer neuroscience: assessing the brain response to marketing stimuli using electroencephalogram (EEG) and eye tracking. Expert Syst Appl. 2013;40:3803–12.


  31. Lancaster K. A new approach to consumer theory. J Polit Econ. 1966;74:132–57.


  32. Arieli A, Ben-Ami Y, Rubinstein A. Fairness motivations and procedures of choice between lotteries as revealed through eye movements. Foerder Institute for Economic Research Working Papers 275720; 2009.

  33. Duchowski A. Eye tracking methodology: theory and practice. 2nd ed. New York: Springer; 2007.


  34. Orquin JL, Ashby NJS, Clarke ADF. Areas of interest as a signal detection problem in behavioral eye-tracking research. J Behav Decis Mak. 2016;29:103–15.


  35. Horwitz R, Kreuter F, Conrad F. Using mouse movements to predict web survey response difficulty. Soc Sci Comput Rev. 2017;35:388–405.


  36. MouseFlow. https://mouseflow.com/. Accessed 17 Aug 2018.

  37. MouseTracker. http://www.mousetracker.org/. Accessed 11 Jun 2017.

  38. Franco-Watkins A, Johnson J. Applying the decision moving window to risky choice: comparison of eye-tracking and mousetracing methods. Judgm Decis Mak. 2011;6:740–9.


  39. Gray E. Time preference for future health events. PhD Thesis, HERU, University of Aberdeen; 2012.

  40. Soekhai V, de Bekker-Grob EW, Ellis AR, et al. Discrete choice experiments in health economics: past, present and future. PharmacoEconomics. 2019;37:201–26.


  41. Braeutigam S. Magnetoencephalography: fundamentals and established and emerging clinical applications in radiology. ISRN Radiol. 2013;12:529463.


  42. Papanicolaou AC. Clinical Magnetoencephalography and magnetic source imaging. Cambridge: Cambridge University Press; 2009.


  43. Vecchiato G, Astolfi L, De Vico Fallani F, et al. On the use of EEG or MEG brain imaging tools in neuromarketing research. Comput Intell Neurosci. 2011;2011:643489.


  44. Camerer C, Loewenstein G, Prelec D. Neuroeconomics: how neuroscience can inform economics. J Econ Lit. 2005;43:9–64.


  45. Upright MRI http://www.uprightmri.co.uk/. Accessed 7 Jun 2017.

  46. Hedgcock WM, Crowe DA, Leuthold AC, et al. A magnetoencephalography study of choice bias. Exp Brain Res. 2010;202:121–7.


  47. Huber J, Payne JW, Puto C. Adding asymmetrically dominated alternatives: violations of regularity and the similarity hypothesis. J Consum Res. 1982;9:90.


  48. Khushaba RN, Kodagoda S, Dissanayake G, et al. A neuroscientific approach to choice modeling: electroencephalogram (EEG) and user preferences. In: Proceedings of the international joint conference on neural networks. 2012.

  49. Khushaba RN, Greenacre L, Kodagoda S, et al. Choice modeling and the brain: a study on the electroencephalogram (EEG) of preferences. Expert Syst Appl. 2012;39:12378–88.


  50. Hu J, Yu R. The neural correlates of the decoy effect in decisions. Front Behav Neurosci. 2014;8:271.


  51. Basten U, Biele G, Heekeren HR, et al. How the brain integrates costs and benefits during decision making. Proc Natl Acad Sci. 2010;107:21767–72.


  52. Rolls ET, Grabenhorst F, Deco G. Choice, difficulty, and confidence in the brain. Neuroimage. 2010;53:694–706.


  53. Kahnt T, Heinzle J, Park SQ, et al. Decoding different roles for vmPFC and dlPFC in multi-attribute decision making. Neuroimage. 2011;56:709–15.


  54. Smith A, Douglas Bernheim B, Camerer CF, et al. Neural activity reveals preferences without choices. Am Econ J Microecon. 2014;6:1–36.


  55. Lusk JL, Crespi JM, McFadden BR, et al. Neural antecedents of a random utility model. J Econ Behav Org. 2016;132:93–103.


  56. Lusk JL, Crespi JM, Cherry JBC, et al. An fMRI investigation of consumer choice regarding controversial food technologies. Food Qual Prefer. 2015;40:209–20.


  57. EMOTIV bioinformatics. San Francisco, USA: eMotiv. https://www.emotiv.com/.

  58. Yale School of Medicine MRI Usage Charges. Yale University. http://mrrc.yale.edu/users/charges.aspx.

  59. Ericsson K, Simon H. Protocol analysis: verbal reports as data (revised edition). Cambridge: MIT Press; 1993.


  60. Boren T, Ramey J. Thinking aloud: reconciling theory and practice. IEEE Trans Prof Commun. 2000;43:261–78.


  61. Ryan M, Watson V, Entwistle V. Rationalising the ‘irrational’: a think aloud study of a discrete choice experiment responses. Health Econ. 2009;18:321–36.


  62. Cheraghi-Sohi S, Bower P, Mead N, et al. Making sense of patient priorities: applying discrete choice methods in primary care using ‘think aloud’ technique. Fam Pract. 2007;24:276–82.


  63. Cheraghi-Sohi S, Hole AR, Mead N, et al. What patients want from primary care consultations: a discrete choice experiment to identify patients’ priorities. Ann Fam Med. 2008;6:107–15.


  64. Diorio C, Tomlinson D, Boydell KM, et al. Attitudes toward infection prophylaxis in pediatric oncology: a qualitative approach. PLoS ONE. 2012;7(10):e47815.


  65. Whitty J, Walker R, Golenko X, et al. A think aloud study comparing the validity and acceptability of discrete choice and best worst scaling methods. PLoS One. 2014;9:e90635.


  66. Grudniewicz A, Bhattacharyya O, McKibbon KA, et al. Redesigning printed educational materials for primary care physicians: design improvements increase usability. Implement Sci. 2015;10:156.


  67. Vass CM, Rigby D, Payne K. The role of qualitative research methods in discrete choice experiments: a systematic review and survey of authors. Med Decis Mak. 2017;37:298–313.


  68. Mughal F, Posner J, McAteer H, et al. Comparing preferences for outcomes of psoriasis treatments among patients and dermatologists in the UK: results from a discrete-choice experiment. Br J Dermatol. 2016;176:777–85.


  69. Vass C, Rigby D, Payne K. “I was trying to do the maths”: exploring the impact of risk communication in discrete choice experiments. Patient. 2019;12:113–23.


  70. Erdem S, Campbell D, Thompson C. Elimination and selection by aspects in health choice experiments: prioritising health service innovations. J Health Econ. 2014;38:10–22.


  71. Conijn JM, van der Ark LA, Spinhoven P. Satisficing in mental health care patients: the effect of cognitive symptoms on self-report data quality. Assessment. 2017;1–16.

  72. de Bekker-Grob E, Chorus C. Random regret-based discrete-choice modelling: an application to healthcare. Pharmacoeconomics. 2013;31:623–34.


  73. Lundgrén-Laine H, Salanterä S. Think-aloud technique and protocol analysis in clinical decision-making research. Qual Health Res. 2010;20:565–75.


  74. Nvivo qualitative data analysis software, version 10. QSR International Pty Ltd.; 2014.

  75. ATLAS.ti; Scientific Software Development GmbH, version 7; 2014.

  76. Conrad FG, Blair J. Sources of error in cognitive interviews. Pub Opin Q. 2009;73:32–55.


  77. Ozdemir S. Improving the validity of stated-preference data in health research: the potential of the time-to-think approach. Patient. 2015;8:247–55.


  78. Whittington D, Smith VK, Okorafor A, et al. Giving respondents time to think in contingent valuation studies: a developing country application. J Environ Econ Manag. 1992;22:205–25.


  79. Lauria DT, Whittington D, Kyeongae C, Turingan C, Abiad V. Household demand for improved sanitation services: a case study of Calamba, Philippines. In: Bateman IJ, Willis KG, editors. Valuing environmental preferences: theory and practice of the contingent valuation method in the US, EU, and developing countries. Oxford University Press; 2001. p. 540–81.

  80. Cook J, Jeuland M, Maskery B, et al. Giving stated preference respondents ‘time to think’: results from four countries. Environ Resour Econ. 2012;51:473–96.


  81. Cook J, Whittington D, Canh DG, et al. Reliability of stated preferences for cholera and typhoid vaccines with time to think in Hue, Vietnam. Econ Inq. 2007;45:100–14.


  82. Veldwijk J, Viberg Johansson J, Donkers B, et al. Mimicking real life decision-making in health: allowing respondents time-to-think in a discrete choice experiment. Value Health. 2017;20:A406.


  83. Tilley E, Logar I, Günther I. The effect of giving respondents time to think in a choice experiment: a conditional cash transfer programme in South Africa. Environ Dev Econ. 2017;22:202–27.


  84. Park J-W, Hastak M. Memory-based product judgments: effects of involvement at encoding and retrieval. J Consum Res. 1994;21:534.


  85. Aaker DA, Bagozzi RP, Carman JM, et al. On using response latency to measure preference. J Mark Res. 1980;17:237.


  86. Maclachlan J, Czepiel J, Labarbera P, et al. Implementation of response latency measures. J Mark Res. 1979;16:573–7.


  87. MacLachlan J, Myers JG. Using response latency to identify commercials that motivate. J Advert Res. 1983;23:51.


  88. Tyebjee TT. Response time, conflict, and involvement in brand choice. J Consum Res. 1979;6:295.


  89. Bech M, Kjaer T, Lauridsen J. Does the number of choice sets matter? Results from a web survey applying a discrete choice experiment. Health Econ. 2011;20:273–86.


  90. Börger T. Are fast responses more random? Testing the effect of response time on scale in an online choice experiment. Environ Resour Econ. 2016;65:389–413.


  91. Campbell D, Mørkbak MR, Olsen SB. The link between response time and preference, variance and processing heterogeneity in stated choice experiments. J Environ Econ Manag. 2018;88:18–34.


  92. Otter T, Allenby GM, van Zandt T. An integrated model of discrete choice and response time. J Mark Res. 2008;45:593–607.


  93. Malone T, Lusk JL. Releasing the trap: a method to reduce inattention bias in survey data with application to U.S. beer taxes. Econ Inq. 2019;57(1):584–99.


  94. Xu P, Ehinger KA, Zhang Y, et al. TurkerGaze: crowdsourcing saliency with webcam based eye tracking. arXiv:1504.

  95. Bigné E, Llinares C, Torrecilla C. Elapsed time on first buying triggers brand choices within a category: a virtual reality-based study. J Bus Res. 2016;69:1423–7.


  96. Heidenreich S, Watson V, Ryan M, Phimister E. Decision heuristic or preference? Attribute non-attendance in discrete choice problems. Health Econ. 2018;27(1):157–71.


  97. Campbell D, Hutchinson WG, Scarpa R. Incorporating discontinuous preferences into the analysis of discrete choice experiments. Environ Resour Econ. 2008;41:401–17.



Author information


Contributions

The nature and scope of the paper was developed by DR, CV and KP. CV led the search and drafting process for eye-tracking, mouse tracing and think-aloud sections; DR led this work for the deliberation time section; and KP and DR led this work for the brain imaging section. All authors provided critical review of the draft of the final manuscript.

Corresponding author

Correspondence to Dan Rigby.

Ethics declarations

Funding

Caroline M. Vass and Katherine Payne were supported in the preparation and submission of this article by the Mind the Risk international network collaboration, funded by the Swedish Foundation for Humanities and Social Sciences. The views and opinions expressed are those of the authors, and not necessarily those of other Mind the Risk members or the Swedish Foundation for Humanities and Social Sciences.

Conflicts of interest

Dan Rigby, Caroline Vass and Katherine Payne have no conflicts of interest that are relevant to the content of this article.


About this article


Cite this article

Rigby, D., Vass, C. & Payne, K. Opening the ‘Black Box’: An Overview of Methods to Investigate the Decision-Making Process in Choice-Based Surveys. Patient 13, 31–41 (2020). https://doi.org/10.1007/s40271-019-00385-8

