
Towards Survey Response Rate Theories That No Longer Pass Each Other Like Strangers in the Night

Chapter in: Understanding Survey Methodology

Part of the book series: Frontiers in Sociology and Social Research (FSSR, volume 4)

Abstract

Declines in survey response rates observed throughout the world are due in part to data collection designs that ignore how the many elements of a particular design, including decisions on the joint use of survey modes, interact to affect rates of response. Although several theories of response behavior have been proposed as means of improving response rates, they tend to be dated, to emphasize particular techniques for improving response while ignoring others, and to be limited to single-mode applications. Consequently, they provide little more than abstract advice and ignore how theory should specifically guide the design of each survey contact and any materials associated with those requests for response. In this paper I propose the need to develop comprehensive data collection designs (from individual communications to the questionnaires and any supporting materials) guided by theory that has been shown to be effective in explaining human behavior. I also propose moving away from the individual tests of response-inducing techniques that have tended to dominate response rate research and toward the creation and testing of comprehensive designs that explicitly use theory to guide the development of design details.



Acknowledgements

Support for writing this paper was provided by the Social and Economic Sciences Research Center (SESRC) in the Office of Research at Washington State University (WSU), by the WSU College of Agricultural, Human, and Natural Resource Sciences (CAHNRS) under USDA Hatch Project 410, and by the USDA Multistate Research Coordinating Committee and Information Exchange Group WERA 1010: Improving Data Quality from Sample Surveys to Foster Agricultural, Community and Development in Rural America. The opinions expressed in this paper are my own, but I wish to acknowledge with thanks the helpful reviews and suggestions received from Glenn Israel, Virginia Lesser, Kenneth Wallen, and other members of the WERA 1010 Committee.

Author information

Corresponding author

Correspondence to Don A. Dillman.


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Dillman, D. A. (2020). Towards Survey Response Rate Theories That No Longer Pass Each Other Like Strangers in the Night. In: Brenner, P. S. (Ed.), Understanding Survey Methodology. Frontiers in Sociology and Social Research, vol 4. Springer, Cham. https://doi.org/10.1007/978-3-030-47256-6_2


  • DOI: https://doi.org/10.1007/978-3-030-47256-6_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-47255-9

  • Online ISBN: 978-3-030-47256-6

  • eBook Packages: Social Sciences; Social Sciences (R0)
