Are Videos or Text Better for Describing Attributes in Stated-Preference Surveys?

Abstract

Background

In stated-preference research, the conventional approach to describing study attributes is through text, often with easy-to-understand graphics. More recently, researchers have begun to present attribute descriptions and content in videos. Some experts have expressed concern regarding internalization and retention of information conveyed via video.

Objective

Our study aimed to compare respondents’ understanding of attribute information provided via text versus video.

Methods

Potential respondents were randomized to receive a text or video version of the survey. In the text version, all content was provided in text format along with still graphics. In the video version, text content was interspersed with four video clips, providing the same information as the text version. In both versions, 10 questions were embedded to assess respondents’ understanding of the information presented relating to ovarian cancer treatments. Half of the questions were on treatment benefits and the other half were on treatment-related risks. Some questions asked about the decision context and definitions of treatment features, and others asked about the graphic presentation of treatment features. Preferences for ovarian cancer treatments were also compared between respondents receiving text versus video versions.

Results

Overall, 150 respondents were recruited. Of the 95 who were eligible and completed the survey, 54 received the text version and 41 received the video version. Median completion times were 24 and 30 min in the video and text arms, respectively (p < 0.01), although both groups spent an average of 35 min completing the survey. On the first comprehension question, 43% of the text arm and 61% of the video arm answered correctly (p = 0.08). Although the mean number of correct responses was significantly higher in the video arm than in the text arm (9.1 vs. 8.6, p = 0.02), there were no systematic differences in preferences between arms.

Conclusions

The quality of stated-preference data relies on respondents’ understanding of study content. Information provided via video may better engage survey participants and improve their retention of content.

[Figures 1–5 are available in the full article.]

Data Availability Statement

The datasets generated and analyzed for the current study are available from the corresponding author on reasonable request.

Author information

Contributions

SL made a substantial contribution to the conception and hypothesis generation, data acquisition, table and figure preparation, and was primarily responsible for manuscript writing. LH and SR contributed to the conception, hypothesis generation, study design, data analysis, and critical evaluation of tables, figures, and manuscript writing. J-CY and JE contributed to study design, data acquisition, data analysis, and critical evaluation of the manuscript.

Corresponding author

Correspondence to Shelby D. Reed.

Ethics declarations

Funding

This manuscript reports on a substudy that was performed as part of a patient preference study funded by an investigator-initiated research grant from AstraZeneca (Principal Investigators: LJH and SDR).

Conflicts of interest

Laura J. Havrilesky and Shelby D. Reed received research grant support from AstraZeneca for the conduct of the parent study. Stephanie L. Lim, Jui-Chen Yang, and Jessie Ehrisman report no conflicts of interest.

Ethical standards

This manuscript is a substudy within the parent study.

About this article

Cite this article

Lim, S.L., Yang, JC., Ehrisman, J. et al. Are Videos or Text Better for Describing Attributes in Stated-Preference Surveys? Patient 13, 401–408 (2020). https://doi.org/10.1007/s40271-020-00416-9
