Evidence-Based Survey Operations: Choosing and Mixing Modes

  • Michael Bosnjak
Chapter

Abstract

Evidence-based survey operations (EBSO) is an approach to the design, preparation, implementation, and post-processing of survey-based projects, requiring that all survey operations-related decisions, such as mode choice, the use of incentives, and fieldwork strategies, be based on the best evidence available in the area of survey methodology. The overall aim of this chapter is (1) to briefly describe the historical roots, mindset, and rationale of EBSO; (2) to describe how EBSO can help identify promising solutions in the decision situations that emerge while designing, preparing, implementing, and post-processing a survey-based project; and (3) to illustrate the value of EBSO in designing survey guidelines and policies.


Copyright information

© The Author(s) 2018

Authors and Affiliations

  1. ZPID – Leibniz Institute for Psychology Information, Trier, Germany