
Quality & Quantity, Volume 50, Issue 4, pp 1469–1486

Impact of raising awareness of respondents on the measurement quality in a web survey

Melanie Revilla

Abstract

Web surveys are attractive because they make it possible to collect a large amount of data in a very short time, but they also suffer from several problems (Reips in Psychology experiments on the internet, 2000). In particular, in web panels where incentives are used to encourage participation, some respondents answer the questions so fast that they cannot plausibly have read them carefully or thought about the best answers. This paper presents the results of two experiments, conducted with the online fieldwork provider Netquest, that try to reduce such speeding behaviour and obtain better-quality answers by raising respondents' awareness of the importance of completing a survey thoughtfully. The results show no effect for respondents who received only an introductory reminder or only a commitment statement about the importance of their answers. However, when the introduction is combined with the commitment statement, a small effect on some respondents' behaviour is found: mainly respondents who already put some effort into answering, though not the maximum effort, were affected. Participants who gave low-quality answers continued to do so even after committing to do their best. From these results, we conclude that more radical solutions than raising awareness may be necessary to ensure that respondents in web surveys read the questions carefully and answer them as well as they can.
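The speeding problem the abstract describes is often operationalised by flagging respondents whose completion time is implausibly short relative to the sample. A minimal sketch, assuming only per-respondent completion times are available; the function name and the 30%-of-median cutoff are illustrative assumptions, not the criterion used in the paper's experiments:

```python
# Hypothetical sketch: flag likely "speeders" in a web survey from
# total completion times (seconds). The 30%-of-median cutoff is an
# illustrative assumption, not a rule taken from the paper.
from statistics import median

def flag_speeders(times, fraction=0.3):
    """Return indices of respondents whose completion time is
    below `fraction` of the median completion time."""
    cutoff = fraction * median(times)
    return [i for i, t in enumerate(times) if t < cutoff]

times = [600, 540, 720, 90, 660, 80, 580]  # one entry per respondent
print(flag_speeders(times))  # indices of suspiciously fast respondents
```

A relative cutoff is preferable to a fixed one because reasonable completion times depend heavily on questionnaire length and difficulty.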

Keywords

Online surveys · Motivational message · Commitment · Time of completion · Quality of answers · Satisficing

Notes

Acknowledgments

I am very grateful for the permission to use the questionnaire of the Reputation Institute in Spain (http://www.reputationinstitute.com) for one of our experiments and a questionnaire of Provokers in Mexico (http://www.provokersite.com) for the other. I also very much appreciate the careful way in which Netquest performed the data collection and the help provided by several members of the team, in particular Carlos Ochoa and Salvador Masdeu.

References

  1. Aust, F., Diedenhofen, B., Ullrich, S., Musch, J.: Seriousness checks are useful to improve data validity in online research. Behav. Res. Methods 45(2), 527–535 (2013). doi:10.3758/s13428-012-0265-2
  2. Baker, R., Blumberg, S., Brick, J., Couper, M., Courtright, M., Dennis, J., Zahs, D.: American Association of Public Opinion Researchers report on online panels. Public Opin. Q. 74, 711–781 (2010)
  3. Bethlehem, J.: Selection bias in web surveys. Int. Stat. Rev. 78(2), 161–188 (2010). doi:10.1111/j.1751-5823.2010.00112.x
  4. Bosnjak, M., Tuten, T.L.: Prepaid and promised incentives in web surveys: an experiment. Soc. Sci. Comput. Rev. 21(2), 208–217 (2003). doi:10.1177/0894439303021002006
  5. Cannell, C.F., Oksenberg, L., Converse, J.M.: Striving for response accuracy: experiments in new interviewing techniques. J. Mark. Res. 14, 306–315 (1977)
  6. Cobanoglu, C., Cobanoglu, N.: The effect of incentives in web surveys: application and ethical considerations. Int. J. Mark. Res. 45(4), 475–488 (2003)
  7. Conrad, F.G., Couper, M.P., Tourangeau, R., Galesic, M.: Interactive feedback can improve the quality of responses in web surveys. Paper presented at the conference of the American Association for Public Opinion Research, Miami Beach, FL. AAPOR—ASA Section on Survey Research Methods. https://www.amstat.org/Sections/Srms/Proceedings/y2005/Files/JSM2005-000938.pdf (2005)
  8. Cook, C., Heath, F., Thompson, R.L.: A meta-analysis of response rates in web- or internet-based surveys. Educ. Psychol. Meas. 60(6), 821–836 (2000). doi:10.1177/00131640021970934
  9. Couper, M.P.: Web surveys: a review of issues and approaches. Public Opin. Q. 64, 464–494 (2000). http://www.jstor.org/stable/3078739
  10. Couper, M.P., Tourangeau, R., Kenyon, K.: Picture this! Exploring visual effects in web surveys. Public Opin. Q. 68(2), 255–266 (2004). doi:10.1093/poq/nfh013
  11. Couper, M.P., Traugott, M.W., Lamias, M.J.: Web survey design and administration. Public Opin. Q. 65(2), 230–253 (2001). doi:10.1086/322199
  12. Dillman, D.A., Tortora, R.D., Bowker, D.: Principles for constructing web surveys. SESRC, Washington. http://www.isurveys.com.au/resources/ppr.pdf (1999)
  13. Dillman, D.A., Tortora, R.D., Conradt, J., Bowker, D.: Influence of plain versus fancy design on response rates for web surveys. In: Proceedings of the American Statistical Association's Survey Methods Research Section, Washington, DC (1998)
  14. Göritz, A.S.: Incentives in web studies: methodological issues and a review. Int. J. Internet Sci. 1(1), 58–70 (2006)
  15. Kapelner, A., Chandler, D.: Preventing satisficing in online surveys: a "Kapcha" to ensure higher quality data. In: Proceedings of Crowd Conference 2010, San Francisco, CA. http://www.danachandler.com/files/kapcha.pdf (2010). Accessed 04 Oct 2010
  16. Krosnick, J.A.: Response strategies for coping with the cognitive demands of attitude measures in surveys. Appl. Cogn. Psychol. 5, 213–236 (1991). doi:10.1002/acp.2350050305
  17. Matthijsse, S., Leo, E., Hox, J.: Professional respondents in online panels: a threat to data quality? In: Proceedings of ESOMAR Panel Research, Orlando, FL (2006)
  18. Muñoz-Leiva, F., Sánchez-Fernández, J., Montoro-Ríos, F., Ibáñez-Zapata, J.A.: Improving the response rate and quality in web-based surveys through the personalisation and frequency of reminder mailings. Qual. Quant. 44, 1037–1052 (2010)
  19. Oppenheimer, D.M., Meyvis, T., Davidenko, N.: Instructional manipulation checks: detecting satisficing to increase statistical power. J. Exp. Soc. Psychol. 45, 867–872 (2009). doi:10.1016/j.jesp.2009.03.009
  20. Reips, U.-D.: The web experiment method: advantages, disadvantages, and solutions. In: Birnbaum, M.H. (ed.) Psychology Experiments on the Internet, pp. 89–117. Academic Press, San Diego (2000). https://www.casra.ch/fileadmin/files/Media/About_us/Team/Teaching/2001_Summer/UZH_1132/Course/Reips2000.pdf
  21. Reips, U.-D.: Internet experiments: methods, guidelines, meta-data. Hum. Vision Electron. Imaging XIV Proc. SPIE 7240, 724008 (2009)
  22. Revilla, M., Ochoa, C.: What are the links in a web survey among response time, quality, and auto-evaluation of the efforts done? Soc. Sci. Comput. Rev. 33(1), 97–114 (2015). doi:10.1177/0894439314531214 (First published online 14 May 2014)
  23. Saris, W.E., Gallhofer, I.: Design, Evaluation, and Analysis of Questionnaires for Survey Research. Wiley, New York (2007)
  24. Toepoel, V., Das, M., Van Soest, A.: Effects of design in web surveys: comparing trained and fresh respondents. Public Opin. Q. 72, 985–1007 (2008)
  25. Tourangeau, R., Rips, L.J., Rasinski, K.: The Psychology of Survey Response. Cambridge University Press, Cambridge (2000)
  26. Van Selm, M., Jankowski, N.W.: Conducting online surveys. Qual. Quant. 40(3), 435–456 (2006)

Copyright information

© Springer Science+Business Media Dordrecht 2015

Authors and Affiliations

  1. RECSM, Universitat Pompeu Fabra, Barcelona, Spain
