
Quality & Quantity, Volume 51, Issue 3, pp 1321–1336

Making use of Internet interactivity to propose a dynamic presentation of web questionnaires

  • Melanie Revilla
  • Carlos Ochoa
  • Albert Turbina
Article

Abstract

New methodologies that aim to collect data in innovative ways (e.g. big data) are putting pressure on traditional questionnaire-based surveys. To obtain from respondents the effort needed to provide high-quality data, we propose helping respondents focus on one task at a time by using a dynamic presentation of web questionnaires combined with a timing control. This work explores a development strategy for such questionnaires. The results show an overall good survey experience and some significant improvements in data quality. In particular, a much larger proportion of respondents properly followed specific instructions added to check that they were reading carefully. These dynamic questionnaires could therefore be considered for future data collection.
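The core mechanism the abstract describes — presenting questions one at a time and recording a response time for each, so implausibly fast answers can be flagged — can be sketched roughly as follows. This is an illustrative assumption, not the authors' implementation; the class name, the injected clock, and the 500 ms speeding threshold are all hypothetical.

```typescript
// Minimal sketch of a dynamic (one-question-at-a-time) questionnaire
// with per-question timing control. All names and thresholds here are
// illustrative assumptions, not the implementation used in the article.

type Question = { id: string; text: string };

class DynamicQuestionnaire {
  private index = 0;
  private shownAt = 0;
  // Elapsed milliseconds per answered question id.
  readonly timings: Record<string, number> = {};

  // `now` is injectable so timing logic can be tested deterministically.
  constructor(
    private questions: Question[],
    private now: () => number = Date.now,
  ) {}

  // Show the current question and start its timer; null when finished.
  next(): Question | null {
    if (this.index >= this.questions.length) return null;
    this.shownAt = this.now();
    return this.questions[this.index];
  }

  // Record the answer time and flag implausibly fast responses
  // ("speeders"); 500 ms is an arbitrary illustrative threshold.
  answer(minMs = 500): { elapsedMs: number; tooFast: boolean } {
    const q = this.questions[this.index++];
    const elapsedMs = this.now() - this.shownAt;
    this.timings[q.id] = elapsedMs;
    return { elapsedMs, tooFast: elapsedMs < minMs };
  }
}
```

In a real web survey the same idea would rather use client-side timestamps (e.g. `performance.now()`) captured when a question is rendered and when it is answered, but the one-question-per-screen flow and the speeding check are the same.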

Keywords

Web surveys · Dynamic questionnaires · Quality · Response times · Survey experience · Netquest

Notes

Acknowledgments

We would like to thank Germán Loewe and Willem Saris for their support and their helpful comments on a previous draft of this paper, as well as two anonymous reviewers.


Copyright information

© Springer Science+Business Media Dordrecht 2016

Authors and Affiliations

  1. RECSM - Universitat Pompeu Fabra, Barcelona, Spain
  2. Netquest, Barcelona, Spain
