Abstract
This contribution first describes a central deficit in how quality is assessed in survey research, namely the fact that data quality is in many cases judged only by a few, not always suitable, indicators rather than by the comprehensive measure of the mean square error. Building on this, it shows how current developments in survey research challenge data quality, and argues that their effects on the quality of estimators can be assessed comprehensively only if a perspective oriented toward the total survey error is adopted.
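The mean square error invoked here as the comprehensive quality measure is conventionally decomposed into a variance and a squared-bias component; a standard rendering (the notation is assumed for illustration, not taken from the chapter) is:

\[
\mathrm{MSE}(\hat{\theta}) \;=\; \mathbb{E}\bigl[(\hat{\theta} - \theta)^{2}\bigr] \;=\; \mathrm{Var}(\hat{\theta}) \;+\; \bigl[\mathrm{Bias}(\hat{\theta})\bigr]^{2}
\]

Single indicators such as the response rate speak at best to one component of this sum, which is why the abstract argues they cannot substitute for a total-survey-error assessment.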
© 2010 VS Verlag für Sozialwissenschaften | Springer Fachmedien Wiesbaden GmbH
Fuchs, M. (2010). Herausforderungen der Umfrageforschung. In: Faulbaum, F., Wolf, C. (eds) Gesellschaftliche Entwicklungen im Spiegel der empirischen Sozialforschung. Schriftenreihe der ASI – Arbeitsgemeinschaft Sozialwissenschaftlicher Institute. VS Verlag für Sozialwissenschaften, Wiesbaden. https://doi.org/10.1007/978-3-531-92590-5_10
Publisher Name: VS Verlag für Sozialwissenschaften, Wiesbaden
Print ISBN: 978-3-531-17525-6
Online ISBN: 978-3-531-92590-5