Quality & Quantity, Volume 48, Issue 1, pp 127–148

Item comparability in cross-national surveys: results from asking probing questions in cross-national web surveys about attitudes towards civil disobedience

  • Dorothée Behr
  • Michael Braun
  • Lars Kaczmirek
  • Wolfgang Bandilla
Open Access


Abstract

This article focuses on assessing item comparability in cross-national surveys by asking probing questions in Web surveys. The "civil disobedience" item from the "rights in a democracy" scale of the International Social Survey Programme (ISSP) serves as a substantive case study. Identical Web surveys were fielded in Canada (English-speaking), Denmark, Germany, Hungary, Spain, and the U.S. A category-selection probe and a comprehension probe, respectively, were incorporated into the Web surveys after the closed-ended "civil disobedience" item. Responses to the category-selection probe reveal that notably in Germany, Hungary, and Spain respondents deplore the detachment of politicians from the people and their lack of responsiveness. Responses to the comprehension probe show that mainly in the U.S. and Canada violence and/or destruction are associated with civil disobedience. These results suggest reasons for the peculiar statistical results found for the "civil disobedience" item in the ISSP study. On the whole, Web probing proves to be a valuable tool for identifying interpretation differences and potential bias in cross-national survey research.


Keywords: Probing · Web surveys · Mixed method · Comparability · Cross-national survey research



This work was supported by the German Research Foundation (DFG) as part of the PPSM Priority Programme on Survey Methodology (SPP 1292) [project # 574187].


This article is distributed under the terms of the Creative Commons Attribution License, which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.



Copyright information

© The Author(s) 2012

Authors and Affiliations

  • Dorothée Behr (1)
  • Michael Braun (1)
  • Lars Kaczmirek (1)
  • Wolfgang Bandilla (1)

  1. GESIS – Leibniz Institute for the Social Sciences, Mannheim, Germany
