
Management Review Quarterly, Volume 65, Issue 3, pp 183–216

Why do people participate in Web surveys? Applying survey participation theory to Internet survey data collection

  • Florian Keusch
State-of-the-Art

Abstract

In recent years, Web surveys have emerged as the most popular mode of primary data collection in market and social research. To improve our understanding of how societal-level factors, characteristics of the sample person, and attributes of the survey design influence participation in Web surveys, this paper establishes a systematic link between theoretical frameworks used to explain survey participation behavior and state-of-the-art empirical research on online data collection methods. The concepts of self-perception, cognitive dissonance, commitment and involvement, social exchange, compliance, leverage-salience, and planned behavior are discussed, and their relationships with factors that have empirically been shown to influence Web survey participation are analyzed using data from an expert survey. The paper will help researchers and practitioners make informed decisions about the use of techniques for increasing participation in Web surveys.
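As the keywords indicate, the empirical part of the paper analyzes expert-survey ratings with principal component analysis. Purely as an illustrative sketch, not the author's code or data, the following R snippet shows how such an analysis of hypothetical expert ratings of participation-influencing factors could be run with base R's prcomp; the object names and the eight factor labels are assumptions introduced here for illustration.

    ## Illustrative sketch only: hypothetical data, not the study's actual analysis
    set.seed(1)

    # Hypothetical data: 50 experts rate 8 survey-design factors on a 1-7 scale
    expert_ratings <- matrix(
      sample(1:7, 50 * 8, replace = TRUE), nrow = 50,
      dimnames = list(NULL, c("incentive", "topic_interest", "sponsor", "length",
                              "reminder", "personalization", "invitation_mode",
                              "progress_indicator"))
    )

    # Principal component analysis on standardized (centered and scaled) ratings
    pca <- prcomp(expert_ratings, center = TRUE, scale. = TRUE)

    summary(pca)                   # proportion of variance explained per component
    round(pca$rotation[, 1:3], 2)  # loadings of the factors on the first three components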

Keywords

Empirical market and social research · Survey methodology · Web survey participation behavior · Self-perception theory · Cognitive dissonance theory · Commitment · Involvement · Social exchange theory · Compliance heuristics · Leverage-salience theory · Theory of planned behavior · Expert survey · Principal component analysis

JEL Classification

C380 C830 M310 

Notes

Acknowledgments

The author thanks the editor and two anonymous reviewers as well as Eleanor Singer and Chris Antoun for helpful comments on earlier drafts of this paper and Chris Antoun and Chan Zhang for their feedback on the expert survey.

Supplementary material

Supplementary material 1 (PDF 247 KB)


Copyright information

© Wirtschaftsuniversität Wien, Austria 2015

Authors and Affiliations

  1. Lehrstuhl für Statistik und sozialwissenschaftliche Methodenlehre, Universität Mannheim, Mannheim, Germany
