Journal of Business and Psychology, Volume 25, Issue 3, pp 335–349

Response Rates in Organizational Science, 1995–2008: A Meta-analytic Review and Guidelines for Survey Researchers

  • Frederik Anseel
  • Filip Lievens
  • Eveline Schollaert
  • Beata Choragwicka



This study expands existing knowledge of response rates through a large-scale quantitative review of published response rates, permitting a fine-grained comparison of response rates across respondent groups. Other unique features of this study are the analysis of response-enhancing techniques across respondent groups and of response rate trends over time. To aid researchers in designing surveys, we provide expected response rate percentiles for different survey modalities.


We analyzed 2,037 surveys, covering 1,251,651 individual respondents, published in 12 journals in I/O Psychology, Management, and Marketing between 1995 and 2008. Expected response rate levels were summarized for different types of respondents, and the use of response-enhancing techniques was coded for each study.
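The percentile benchmarks described above can be sketched in a few lines. This is a hypothetical illustration of the summarizing step, not the study's actual procedure or data: the response rates below are invented, and the function name is our own.

```python
# Hypothetical sketch: turning a set of published response rates into
# percentile benchmarks of the kind the study reports per survey modality.
# The rates below are invented illustration data, not figures from the article.
import statistics

def response_rate_percentiles(rates, points=(10, 25, 50, 75, 90)):
    """Return {percentile: response rate} benchmarks for a list of rates (%)."""
    # quantiles() with n=100 yields cut points for the 1st..99th percentiles
    qs = statistics.quantiles(rates, n=100, method="inclusive")
    return {p: round(qs[p - 1], 1) for p in points}

# Invented published response rates (%) for one survey modality
mail_survey_rates = [18, 22, 31, 35, 40, 44, 52, 55, 61, 68]
benchmarks = response_rate_percentiles(mail_survey_rates)
print(benchmarks)
```

A researcher could compare a planned design's anticipated response rate against such benchmarks to judge whether it falls in a typical range for that modality and respondent type.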


First, mean response rates differed across respondent types, with the lowest rates reported for executive respondents and the highest for non-working respondents and non-managerial employees. Second, moderator analyses suggested that the effectiveness of response-enhancing techniques depended on the type of respondent. Evidence for differential prediction across respondent types was found for incentives, salience, identification numbers, sponsorship, and administration mode. After controlling for the increased use of response-enhancing techniques, a small decline in response rates over time was found.


Our findings suggest that existing guidelines for designing effective survey research may not always offer the most accurate information available. Survey researchers should be aware that the response rates they obtain may be lower or higher depending on the respondent type surveyed, and that some response-enhancing techniques may be more or less effective in specific samples.


This study, analyzing the largest set of published response rates to date, offers the first evidence of differing response rates and of differential functioning of response-enhancing techniques across respondent types.


Keywords: Response rate · Response-enhancing technique · Survey · Respondent type · Sample · Meta-analysis



We would like to thank Katrien Vermeulen, Liesbet De Koster, Valerie Boulangier, Claire Hemelaer, Sophie Pczycki, Myrjam Van de Vijver, and Bernd Carette for their help in coding the studies.



Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  • Frederik Anseel (1)
  • Filip Lievens (1)
  • Eveline Schollaert (1)
  • Beata Choragwicka (2)

  1. Department of Personnel Management and Work and Organizational Psychology, Ghent University, Ghent, Belgium
  2. Department of Social Psychology, University of Santiago de Compostela, Santiago de Compostela, Galicia, Spain
