
Rational Response Behavior as a Cause of Measurement-Related Mode Effects in the Assessment of Sensitive Characteristics

Development of a Theoretical Framework
  • Heinz Leitgöb
Chapter
Part of the Schriftenreihe der ASI - Arbeitsgemeinschaft Sozialwissenschaftlicher Institute book series (SASI)

Abstract

Drawing on a version of the theory of rational respondent behavior enriched with mode-variant benefit and cost factors, this contribution seeks to explain the occurrence of measurement-related mode effects in the assessment of sensitive characteristics. Specifically, the approach rests on the assumption that survey modes create specific social interview settings which, through situational benefit and cost factors, in turn influence respondents' decision to report sensitive information truthfully, to misreport, or to refuse to answer. Of particular importance are those factors that determine the type and extent of social interaction involved in answering the questions. Based on the respective configuration of the mode-specific interview settings, empirically testable hypotheses about systematic differences in item nonresponse rates and measurement error variances between survey modes can then be derived by applying a cost-benefit scheme. These hypotheses in turn serve as the basis for predicting the magnitude and direction of the measurement-related mode effect at the aggregate level of the surveys to be compared. For illustration, the argument is made concrete using the example of measuring victimization experiences related to sexual offenses.
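
To make the cost-benefit scheme described above more tangible, a minimal formal sketch in the spirit of subjective expected utility theory could look as follows; the notation is illustrative and not taken from the chapter itself. For each response option and survey mode, a respondent weighs mode-variant benefits and costs and selects the option with the highest subjective expected utility:

\[
SEU_m(a) \;=\; \sum_{j} p_{j,m}(a)\, U_j(a), \qquad a \in \{\text{truthful},\ \text{misreport},\ \text{refuse}\},
\]

where \(m\) indexes the survey mode, \(U_j(a)\) denotes the (dis)utility of a consequence \(j\) of option \(a\) (for instance, social sanctioning after disclosing a sensitive attribute), and \(p_{j,m}(a)\) its subjectively perceived probability, which varies with the mode-specific interview setting (for instance, the presence of an interviewer). The chosen option is then

\[
a_m^{*} \;=\; \arg\max_{a}\; SEU_m(a).
\]

Aggregating \(a_m^{*}\) over respondents would yield mode-specific item nonresponse rates and misreporting probabilities; their difference between two modes \(m\) and \(m'\) corresponds to the measurement-related mode effect discussed in the chapter.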

Keywords

Mode effects · Cognitive Model of Survey Response · Rational choice · Theory of rational response behavior · Measurement error · Item nonresponse · Victimization surveys



Copyright information

© Springer Fachmedien Wiesbaden GmbH, part of Springer Nature 2019

Authors and Affiliations

  1. Eichstätt, Germany
