Telephone Surveys via Landline and Mobile Phones: Mode Effects and Response Quality

  • Mike Kühne
  • Michael Häder


Telephone surveys play a central role in survey-based research, as their widespread use and data quality demonstrate (Lepkowski et al. 2008). At the same time, the practice of telephone surveying is shaped by social and technological change. Most prominently, mobile phone technology poses a serious challenge to survey research worldwide (AAPOR 2008, Steeh and Piekarski 2008, Häder and Häder 2009). There is growing concern that a considerable number of households nominally covered by landline sampling frames cannot, in fact, be reached by landline telephone surveys because they rely predominantly on mobile phones (Brick et al. 2006, Keeter et al. 2007). This “mobile-mostly” population currently represents 15 % of adults in United States households. Boyle and Lewis (2009) were nevertheless able to show that households that consider themselves “mobile-mostly” are still likely to answer their landline telephones. Beyond that, a growing number of mobile phone users have replaced their residential landline entirely with a mobile phone. These so-called “mobile-only households” have implications for the representativeness of telephone surveys, since sampling frames for such surveys have traditionally been limited to landline numbers: the non-coverage of households without landline telephones may bias the estimates derived from telephone surveys. Recent studies suggest that the number of mobile-only households is growing worldwide. In Europe this share ranges from 3 % in Sweden to 64 % in the Czech Republic (Häder et al. 2009: 23, European Commission 2008: 31); in the United States these households constitute more than 22 % of all households (Blumberg and Luke 2009).
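The non-coverage bias invoked above can be made precise with the standard decomposition found in survey-methodology textbooks (e.g., Groves et al. 2004, cited below); the notation here is generic, not taken from the chapter itself. Writing $\bar{Y}_C$ for the population mean among landline-covered households, $\bar{Y}_{NC}$ for the mean among non-covered (mobile-only) households, and $N_{NC}/N$ for the non-covered proportion of the population, the bias of a landline-only estimate $\bar{y}_C$ of the overall mean $\bar{Y}$ is

$$
\operatorname{Bias}(\bar{y}_C) \;=\; \bar{Y}_C - \bar{Y} \;=\; \frac{N_{NC}}{N}\left(\bar{Y}_C - \bar{Y}_{NC}\right),
$$

so the bias grows both with the mobile-only share of the population and with how strongly mobile-only households differ from landline households on the survey variable.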


Keywords: Mobile Phone · Social Desirability · Telephone Survey · Response Behavior · Dual Frame




  1. AAPOR (The American Association for Public Opinion Research): Guidelines and Considerations for Survey Researchers When Planning and Conducting RDD and Other Telephone Surveys in the U.S. with Respondents Reached via Cell Phone Numbers (2008) (accessed July 31, 2010)
  2. AAPOR (The American Association for Public Opinion Research): New Considerations for Survey Researchers When Planning and Conducting RDD Telephone Surveys in the U.S. with Respondents Reached via Cell Phone Numbers (2010) (accessed January 10, 2011)
  3. Béland, Y., St-Pierre, M.: Mode Effects in the Canadian Community Health Survey: A Comparison of CATI and CAPI. In: Lepkowski, J.M., et al. (eds.) Advances in Telephone Survey Methodology, pp. 297–316. Wiley, New York (2008)
  4. Blumberg, S.J., Luke, J.V.: Wireless Substitution: Early Release of Estimates from the National Health Interview Survey, July–December 2008. Division of Health Interview Statistics, National Center for Health Statistics (May 2009) (accessed July 31, 2010)
  5. Boyle, J.M., Lewis, F.: Cell Phone Mainly Households: Coverage and Reach for Telephone Surveys Using RDD Landline Samples. Survey Practice (2009) (accessed July 31, 2010)
  6. Brick, J.M., Brick, P.D., Dipko, S., Presser, S., Tucker, C., Yuan, Y.: Cell Phone Survey Feasibility in the U.S.: Sampling and Calling Cell Numbers Versus Landline Numbers. Public Opinion Quarterly 71, 23–39 (2007)
  7. Brick, J.M., Dipko, S., Presser, S., Tucker, C., Yuan, Y.: Nonresponse Bias in a Dual Frame Sample of Cell and Landline Numbers. Public Opinion Quarterly 70, 780–793 (2006)
  8. Callegaro, M., Ayhan, Ö., Häder, S., Gabler, S.: Combining landline and mobile phone samples: a dual frame approach (2010) (submitted)
  9. Cannell, C.F., Miller, P.V., Oksenberg, L.: Research on Interviewing Techniques. In: Leinhardt, S. (ed.) Sociological Methodology, pp. 389–437. Jossey-Bass, San Francisco (1981)
  10. Converse, P.E.: The Nature of Belief Systems in Mass Publics. In: Apter, D.E. (ed.) Ideology and Discontent, pp. 206–261. Free Press, New York (1964)
  11. Corti, L., Clissold, K.M.: Response contamination by third parties in a household interview survey. In: Working Papers of the ESRC Research Centre on Micro-Social Change, Working Paper 13. University of Essex, Colchester (1992)
  12. De Leeuw, E.D.: To mix or not to mix data collection modes in surveys. Journal of Official Statistics 21, 233–255 (2005)
  13. Dillman, D.A., Smyth, J., Christian, L.M.: Mail and Internet Surveys: The Tailored Design Method. Wiley, New Jersey (2008)
  14. Driscoll, D.L.: Paraphrase: Write It in Your Own Words. Purdue University Online Writing Lab (2007) (accessed July 31, 2010)
  15. Edwards, A.L.: The Social Desirability Variable in Personality Assessment and Research. Dryden, New York (1957)
  16. European Commission: E-Communications Household Survey. Special Eurobarometer 293/Wave 68.2 (2008) (accessed July 31, 2010)
  17. Fleeman, A.: Merging Cellular and Landline RDD Sample Frames: A Series of Three Cell Phone Studies. Paper presented at the Second International Conference on Telephone Survey Methodology, Miami, FL (2006)
  18. Fordyce, W.E.: Social Desirability in the MMPI. Journal of Consulting Psychology 20(3), 171–175 (1956)
  19. Gabler, S., Häder, S.: Die Kombination von Mobilfunk- und Festnetzstichproben in Deutschland (The combination of mobile and landline samples in Germany). In: Weichold, M., et al. (eds.) Umfrageforschung. Herausforderung und Grenzen. Österreichische Zeitschrift für Soziologie, Special vol. 9, pp. 239–252. VS Verlag für Sozialwissenschaften, Wiesbaden (2009)
  20. Gabler, S., Häder, S.: Idiosyncrasies in telephone sampling: the case of Germany. International Journal of Public Opinion Research 14, 338–345 (2002)
  21. Graeske, J., Kunz, T.: Stichprobenqualität der CELLA-Studie unter besonderer Berücksichtigung der Mobile-onlys (Sampling quality of the CELLA study with special emphasis on mobile-onlys). In: Häder, M., Häder, S. (eds.) Telefonbefragungen über das Mobilfunknetz, pp. 57–70. VS Verlag für Sozialwissenschaften, Wiesbaden (2009)
  22. Groves, R.M., Dillman, D.A., Eltinge, J.L., Little, R.J.A.: Survey Nonresponse. Wiley, New York (2002)
  23. Groves, R.M., Fowler Jr., F.J., Couper, M., Lepkowski, J.M., Singer, E., Tourangeau, R.: Survey Methodology. Wiley, New Jersey (2004)
  24. Häder, M., Häder, S.: Telefonbefragungen über das Mobilfunknetz (Telephone surveys via the mobile phone network). VS Verlag für Sozialwissenschaften, Wiesbaden (2009)
  25. Häder, S., Gabler, S., Heckel, C.: Stichprobenziehung (Sampling). In: Häder, M., Häder, S. (eds.) Telefonbefragungen über das Mobilfunknetz. Konzept, Design und Umsetzung einer Strategie zur Datenerhebung, pp. 21–45. VS Verlag für Sozialwissenschaften, Wiesbaden (2009)
  26. Hartmann, P.: Wunsch und Wirklichkeit: Theorie und Empirie sozialer Erwünschtheit (Wish and reality: theory and empirical evidence on social desirability). Deutscher Universitäts-Verlag, Wiesbaden (1991)
  27. Hofstätter, P.R.: Bedingungen der Zufriedenheit (Conditions of satisfaction). Edition Interfrom, Zürich (1986)
  28. Hunsicker, S., Schroth, Y.: Combining mobile phone and landline phone samples: a practical application of the dual frame approach. MDA – Methoden, Daten, Analysen. Zeitschrift für Empirische Sozialforschung 2(1), 161–182 (2007)
  29. Hyman, H.H., Sheatsley, P.B.: The Current Status of American Public Opinion. In: Payne, J.C. (ed.) The Teaching of Contemporary Affairs. Twenty-First Yearbook of the National Council of Social Studies, pp. 11–34. National Education Association, New York (1950)
  30. Jäckle, A., Roberts, C., Lynn, P.: Assessing the Effect of Data Collection Mode on Measurement. ISER Working Papers 2008-08 (2008) (accessed July 31, 2010)
  31. Kane, J.G., Craig, S.C., Martinez, M.D.: Ambivalence, Attitude Strength, and Response Instability: A Two-Wave Panel Study of Abortion Attitudes in Florida. Paper presented at the 2000 Annual Meeting of the American Political Science Association, Washington (2000)
  32. Keeter, S., Kennedy, C.K., Clark, A., Tompson, T., Mokrzycki, M.: What’s Missing from National Landline RDD Surveys? The Impact of the Growing Cell-only Population. Public Opinion Quarterly 71, 772–792 (2007)
  33. Kennedy, C.K.: Nonresponse and Measurement Error in Mobile Phone Surveys. PhD Thesis, University of Michigan (2010)
  34. Kreuter, F., Presser, S., Tourangeau, R.: Social desirability bias in CATI, IVR, and web surveys. Public Opinion Quarterly 72(5), 847–865 (2008)
  35. Krosnick, J.A., Alwin, D.F.: An Evaluation of a Cognitive Theory of Response-Order Effects in Survey Measurement. Public Opinion Quarterly 51(2), 201–219 (1987)
  36. Krosnick, J.A.: Response strategies for coping with the cognitive demands of attitude measures in surveys. Applied Cognitive Psychology 5, 213–236 (1999)
  37. Krosnick, J.A.: The threat of satisficing in surveys: the shortcuts respondents take in answering questions. Survey Methods Newsletter 20, 4–8 (2009)
  38. Kühne, M., Böhme, R.: Effekte von Informationsstand, Wissen und Einstellungsstärke von Befragten auf die Antwortstabilität in Online-Befragungen mit Selbstrekrutierung (Effects of respondents’ level of information, knowledge, and attitude strength on response stability in self-recruited online surveys). ZUMA-Nachrichten 59, 42–71 (2006)
  39. Kuusela, V., Callegaro, M., Vehovar, V.: The Influence of Mobile Telephones on Telephone Surveys. In: Lepkowski, J.M., et al. (eds.) Advances in Telephone Survey Methodology, pp. 87–112. Wiley, New York (2008)
  40. Lavrakas, P.J., Steeh, C., Shuttles, C., Fienberg, H.: The State of Surveying Cell Phone Numbers in the United States: 2007 and Beyond. Public Opinion Quarterly 71(5), 840–854 (2007)
  41. Lavrakas, P.J., Tompson, T.N., Benford, R.: Investigating Data Quality in Cell Phone Surveying. Paper presented at the Annual Conference of the Midwest Association for Public Opinion Research, Chicago (2009)
  42. Lepkowski, J.M., Tucker, C., Brick, J.M., de Leeuw, E., Japec, L., Lavrakas, P.J., Link, M.W., Sangster, R.L. (eds.): Advances in Telephone Survey Methodology. Wiley, New York (2008)
  43. Mason, R., Carlson, J.E.: Contrast Effects and Subtraction in Part-Whole Questions. Public Opinion Quarterly 58(4), 569–578 (1994)
  44. Mokrzycki, M., Keeter, S., Kennedy, C.: Cell-Phone-Only Voters in the 2008 Exit Poll and Implications for Future Non-coverage Bias. Public Opinion Quarterly 73(5), 845–865 (2010)
  45. Noelle-Neumann, E., Petersen, T.: Das halbe Instrument, die halbe Reaktion. Zum Vergleich von Telefon- und Face-to-Face-Umfragen (Half the instrument, half the reaction: on the comparison of telephone and face-to-face surveys). In: Hüfken, V. (ed.) Methoden in Telefonumfragen, pp. 183–200. Westdeutscher Verlag, Opladen (2000)
  46. Paulhus, D.L.: Two-Component Models of Social Desirability. Journal of Personality and Social Psychology 46(3), 598–609 (1984)
  47. Reuband, K.-H.: Dritte Personen beim Interview – Zuhörer, Adressaten oder Katalysatoren der Kommunikation? (Third parties at the interview – listeners, addressees, or catalysts of communication?). In: Meulemann, H., Reuband, K.-H. (eds.) Soziale Realität im Interview, pp. 117–156. Campus, Frankfurt (1984)
  48. Rodgers, L.W., Herzog, A.R.: Interviewing older adults: the accuracy of factual information. Journal of Gerontology 42, 389–394 (1987)
  49. Schuman, H., Presser, S.: Questions and Answers in Attitude Surveys. Sage Publications, Inc. (1981)
  50. Shoemaker, P.J., Eichholz, M., Skewes, E.A.: Item Nonresponse: Distinguishing Between Don’t Know and Refuse. International Journal of Public Opinion Research 14, 193–201 (2002)
  51. Singer, E.: Introduction: Nonresponse bias in household surveys. Public Opinion Quarterly 70(5), 637–645 (2006)
  52. Slaby, M.: Zur Interaktion zwischen Befragten und Erhebungsinstrument (On the interaction between respondents and the survey instrument). SISS – Schriftenreihe des Instituts für Sozialwissenschaften der Universität Stuttgart, No. 3/1998 (1998) (accessed July 31, 2010)
  53. Slymen, D.J., Drew, J.A., Wright, B.L., Elder, J.P., Williams, S.J.: Item non-response to lifestyle assessment in an elderly cohort. International Journal of Epidemiology 23, 583–591 (1994)
  54. Smith, T.W.: The Impact of the Presence of Others on a Respondent’s Answers to Questions. International Journal of Public Opinion Research 9(1), 33–47 (1997)
  55. Steeh, C.G., Piekarski, L.: Accommodating New Technologies: Mobile and VoIP Communication. In: Lepkowski, J.M., et al. (eds.) Advances in Telephone Survey Methodology, pp. 423–448. Wiley, New York (2008)
  56. Strack, F., Martin, L.L.: Thinking, judging, and communicating: A process account of context effects in attitude surveys. In: Hippler, H.-J., et al. (eds.) Social Information Processing and Survey Methodology, pp. 123–148. Springer, Heidelberg (1987)
  57. Strack, F.: Zur Psychologie der standardisierten Befragung: kognitive und kommunikative Prozesse (On the psychology of the standardized interview: cognitive and communicative processes). Springer, Berlin (1994)
  58. Steeh, C.: A New Era for Telephone Surveys. Paper presented at the Annual Conference of the American Association for Public Opinion Research, Phoenix, AZ (2004)
  59. Sudman, S., Bradburn, N.M.: Response Effects in Surveys: A Review and Synthesis. Aldine Publishing Company, Chicago (1974)
  60. Sudman, S., Bradburn, N.M., Schwarz, N.: Thinking About Answers: The Application of Cognitive Processes to Survey Methodology. Jossey-Bass, San Francisco (1996)
  61. Tourangeau, R., Smith, T.W.: Asking Sensitive Questions: The Impact of Data Collection Mode, Question Format, and Question Context. Public Opinion Quarterly 60(2), 275–304 (1996)
  62. Tourangeau, R., Rips, L.J., Rasinski, K.A.: The Psychology of Survey Response. Cambridge University Press, New York (2000)
  63. Tucker, C., Lepkowski, J.M.: Telephone Survey Methods: Adapting to Change. In: Lepkowski, J.M., et al. (eds.) Advances in Telephone Survey Methodology, pp. 3–28. Wiley, New York (2008)
  64. Van den Bulck, J.: Does the Presence of a Third Person Affect Estimates of TV Viewing and Other Media Use? Communications 24(1), 105–116 (1999)
  65. Willits, F.K., Saltiel, J.: Question Order Effects on Subjective Measures of Quality of Life. Rural Sociology 60(4), 654–665 (1995)
  66. Winkler, N., Kroh, M., Spiess, M.: Entwicklung einer deutschen Kurzskala zur zweidimensionalen Messung von sozialer Erwünschtheit (Development of a German short scale for the two-dimensional measurement of social desirability). Discussion Papers 579. DIW Berlin, Berlin (2006) (accessed July 31, 2010)

Copyright information

© Springer-Verlag GmbH Berlin Heidelberg 2011

Authors and Affiliations

Mike Kühne, Michael Häder
Institute for Sociology, Dresden University of Technology, Dresden, Germany
