Response Rates in Organizational Science, 1995–2008: A Meta-analytic Review and Guidelines for Survey Researchers
This study expands upon existing knowledge of response rates by conducting a large-scale quantitative review of published response rates. This allowed a fine-grained comparison of response rates across respondent groups. Other unique features of this study are the analysis of response enhancing techniques across respondent groups and response rate trends over time. In order to aid researchers in designing surveys, we provide expected response rate percentiles for different survey modalities.
We analyzed 2,037 surveys, covering 1,251,651 individual respondents, published in 12 journals in I/O Psychology, Management, and Marketing during the period 1995–2008. Expected response rate levels were summarized for different types of respondents and use of response enhancing techniques was coded for each study.
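The expected response rate percentiles the study reports can be illustrated with a small sketch. The data and group names below are invented for illustration; the study's actual percentiles come from the 2,037 coded surveys.

```python
# Hypothetical illustration: summarizing response rate percentiles
# by respondent type (all values invented, not the study's data).
from statistics import quantiles

# Invented response rates (%) for two respondent groups.
rates = {
    "executives": [18, 22, 25, 30, 34, 41, 47],
    "non_managerial_employees": [48, 55, 60, 62, 68, 74, 80],
}

def percentile_summary(values):
    """Return the 25th, 50th, and 75th percentiles of a sample."""
    q1, median, q3 = quantiles(sorted(values), n=4)
    return {"p25": q1, "p50": median, "p75": q3}

for group, values in rates.items():
    print(group, percentile_summary(values))
```

A table of such percentiles per survey modality lets a researcher judge whether an observed response rate is typical for that kind of sample.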
First, differences in mean response rate were found across respondent types, with the lowest response rates reported for executive respondents and the highest for non-working respondents and non-managerial employees. Second, moderator analyses suggested that the effectiveness of response enhancing techniques depended on the type of respondent. Evidence for differential prediction across respondent types was found for incentives, salience, identification numbers, sponsorship, and administration mode. When controlling for the increased use of response enhancing techniques, a small decline in response rates over time was found.
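A moderator analysis of this kind can be sketched as a linear model with a dummy-coded interaction term: the interaction coefficient captures whether a technique (here, an incentive) works differently for one respondent type than another. The data-generating process below is entirely invented; it is a minimal sketch of the analytic idea, not the study's model.

```python
# Hypothetical sketch of a moderator analysis: does the effect of an
# incentive on response rate differ for executive samples?
# (All data simulated; the true interaction is -6 by construction.)
import numpy as np

rng = np.random.default_rng(0)
n = 400
incentive = rng.integers(0, 2, n)   # 1 = incentive offered
executive = rng.integers(0, 2, n)   # 1 = executive sample
# Invented process: incentives raise response rates, but less so
# for executives (negative interaction).
rate = (50 + 10 * incentive - 15 * executive
        - 6 * incentive * executive + rng.normal(0, 5, n))

# Design matrix: intercept, main effects, and the interaction term.
X = np.column_stack([np.ones(n), incentive, executive,
                     incentive * executive])
beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
print("interaction coefficient:", round(beta[3], 1))
```

A non-zero interaction coefficient is the formal analogue of the study's finding that some techniques are more or less effective in specific samples.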
Our findings suggest that existing guidelines for designing effective survey research may not always offer the most accurate information available. Survey researchers should be aware that they may obtain lower or higher response rates depending on the respondent type surveyed, and that some response enhancing techniques may be more or less effective in specific samples.
This study, analyzing the largest set of published response rates to date, offers the first evidence for different response rates and differential functioning of response enhancing techniques across respondent types.
Keywords: Response rate · Response enhancing technique · Survey · Respondent type · Sample · Meta-analysis
We would like to thank Katrien Vermeulen, Liesbet De Koster, Valerie Boulangier, Claire Hemelaer, Sophie Pczycki, Myrjam Van de Vijver, and Bernd Carette for their help in coding the studies.