New methodologies that aim to collect data in innovative ways (e.g. big data) are putting pressure on traditional questionnaire-based surveys. To obtain from respondents the effort necessary to provide high-quality data, we propose helping respondents focus on one task at a time by using a dynamic presentation of web questionnaires combined with timing control. This work explores a development strategy for such questionnaires. The results show an overall good survey experience and some significant improvements in data quality. In particular, a much larger proportion of respondents properly followed specific instructions added to check that they were reading carefully. Therefore, these dynamic questionnaires could be considered for future data collection.
We should note that most of these possibilities are also available in computer-assisted surveys. However, as this study was implemented with Internet surveys, we maintain the terminology of Internet surveys throughout the paper.
Some studies have tried to sensitize respondents to the importance of their answers in order to make them think more deeply about their responses, but this does not seem to be very effective (see, for instance, Revilla 2015).
The instruction was: “select this category beside the five options you think are most important”. It was placed in the 9th row of a grid where people had to select five options they found most important in each of three different columns.
There was a third open narrative question in the survey but it was related to the survey experience, so it is possible that the respondents in the treatment groups wrote more because they had more to say. For this reason, we did not use it here.
Behr, D., Bandilla, W., Kaczmirek, L., Braun, M.: Cognitive probes in web surveys: on the effect of different text box size and probing exposure on response quality. Soc. Sci. Comput. Rev. 32(4), 524–533 (2014). doi:10.1177/0894439313485203
Bethlehem, J., Biffignandi, S.: Handbook of Web Surveys, vol. 567. Wiley, Hoboken (2011)
Conrad, F.G., Couper, M.P., Tourangeau, R., Galesic, M.: Interactive feedback can improve the quality of responses in web surveys. Paper presented at the 60th annual conference of the American Association for Public Opinion Research (AAPOR), 12–15 May 2005, Miami Beach
Conrad, F.G., Schober, M.F.: New frontiers in standardized survey interviewing. In: Hesse-Biber, S.N., Leavey, P. (eds.) Handbook of Emergent Methods, pp. 173–188. Guilford Press, New York (2008)
Conrad, F.G., Schober, M.F., Coiner, T.: Bringing features of human dialogue to web surveys. Appl. Cogn. Psychol. 21(2), 165–187 (2007)
Conrad, F.G., Schober, M.F., Jans, M., Orlowski, R.A., Nielsen, D., Levenstein, R.: Comprehension and engagement in survey interviews with virtual agents. Front. Psychol. 6, 1578 (2015). doi:10.3389/fpsyg.2015.01578
Couper, M.P.: Web surveys: a review of issues and approaches. Public Opin. Q. 64, 464–494 (2000)
Couper, M.P., Tourangeau, R., Conrad, F.G., Zhang, C.: The design of grids in web surveys. Soc. Sci. Comput. Rev. 31(3), 322–345 (2013)
Green, P.E., Rao, V.R.: Conjoint measurement for quantifying judgmental data. J. Mark. Res. 8(3), 355–363 (1971)
Green, P.E., Srinivasan, V.: Conjoint analysis in marketing: new developments with implications for research and practice. J. Mark. 54(4), 3–19 (1990)
Green, P.E., Krieger, A.M., Wind, Y.: Thirty years of conjoint analysis: reflections and prospects. Interfaces 31(3), 556–573 (2001). doi:10.1287/inte.31.3s.56.9676
Greszki, R., Meyer, M., Schoen, H.: The impact of speeding on data quality in nonprobability and freshly recruited probability-based online panels. In: Callegaro, M., Baker, R., Bethlehem, J., Göritz, A.S., Krosnick, J.A., Lavrakas, P.J. (eds.) Online Panel Research: A Data Quality Perspective, pp. 238–262. Wiley, Chichester (2014)
Holland, J.L., Christian, L.M.: The influence of topic interest and interactive probing on responses to open-ended questions in web surveys. Soc. Sci. Comput. Rev. 27(2), 196–212 (2009). doi:10.1177/0894439308327481
Kapelner, A., Chandler, D.: Preventing satisficing in online surveys. In: Proceedings of CrowdConf 2010 (2010)
Kieruj, N.D., Mulder, J., Wijnant, A., Douhou, S., Conrad, F.: Hearing voices: supporting online questionnaires with Text-to-Speech technology. Presented at the European Survey Research Association (ESRA) conference 2013, ESRA, Ljubljana
Krosnick, J.A.: Response strategies for coping with the cognitive demands of attitude measures in surveys. Appl. Cogn. Psychol. 5, 213–236 (1991). doi:10.1002/acp.2350050305
Krosnick, J.A., Alwin, D.F.: An evaluation of a cognitive theory of response order effects in survey measurement. Public Opin. Q. 51(2), 201–219 (1987). http://www.jstor.org/stable/2748993
Malhotra, N.: Completion time and response order effects in web surveys. Public Opin. Q. 72(5), 914–934 (2008)
Menold, N., Winker, P., Storfinger, N., Bredl, S.: Interviewers’ falsifications in face-to-face surveys. In: Engel, U., Jann, B., Lynn, P., Scherpenzeel, A., Sturgis, P. (eds.) Improving Survey Methods: Lessons from Recent Research, pp. 86–97. Taylor & Francis, New York (2014)
Oppenheimer, D.M., Meyvis, T., Davidenko, N.: Instructional manipulation checks: detecting satisficing to increase statistical power. J. Exp. Soc. Psychol. 45, 867–872 (2009). doi:10.1016/j.jesp.2009.03.009
Oudejans, M., Christian, L.M.: Using interactive features to motivate and probe responses to open-ended questions. In: Das, M., Ester, P., Kaczmirek, L. (eds.) Social and Behavioral Research and the Internet, pp. 215–244. Routledge, New York (2011)
Revilla, M.: Impact of raising awareness of respondents on the measurement quality in a web survey. Qual. Quant. (2015) (First published online on May 9, 2015). doi:10.1007/s11135-015-0216-y
Revilla, M., Ochoa, C.: What are the links in a web survey among response time, quality, and auto-evaluation of the efforts done? Soc. Sci. Comput. Rev. 33(1), 97–114 (2015) (First published online on May 14, 2014). doi:10.1177/0894439314531214
Revilla, M., Toninelli, D., Ochoa, C., Loewe, G.: Do online access panels really need to allow and adapt surveys to mobile devices? Internet Res. 26(5), (2016)
Schuman, H., Presser, S.: Questions and Answers in Attitude Surveys: Experiments on Question Form, Wording, and Context. Academic Press, New York (1981)
Simon, H.A.: Models of Man. Wiley, New York (1957)
Wikipedia: “Speed reading”. Section “Claims of speed readers”. http://en.wikipedia.org/wiki/Speed_reading#Claims_of_speed_readers
Zhang, C.: Satisficing in web surveys: implications for data quality and strategies for reduction. Doctoral dissertation, University of Michigan (2013). http://deepblue.lib.umich.edu/handle/2027.42/97990
Zhang, C., Conrad, F.: Speeding in web surveys: the tendency to answer very fast and its association with straightlining. Surv. Res. Methods 8(2), 127–135 (2014)
We would like to thank Germán Loewe and Willem Saris for their support and their helpful comments on a previous draft of this paper, as well as two anonymous reviewers.
Cite this article
Revilla, M., Ochoa, C. & Turbina, A. Making use of Internet interactivity to propose a dynamic presentation of web questionnaires. Qual Quant 51, 1321–1336 (2017). https://doi.org/10.1007/s11135-016-0333-2
- Web surveys
- Dynamic questionnaires
- Response times
- Survey experience