Quality & Quantity, Volume 50, Issue 3, pp 1021–1040

Cognitive interviewing as a tool for enhancing the accuracy of the interpretation of quantitative findings

  • Pamela Campanelli
  • Michelle Gray
  • Margaret Blake
  • Steven Hope


Abstract

This paper contrasts findings from a quantitative survey with those from a cognitive interviewing follow-up investigation on a subset of the same respondents. The data were gathered as part of a larger study exploring measurement error across three modes of data collection, but this paper focuses on the question format experiments rather than on the mode effects component of the larger study. Three examples demonstrate how cognitive interviewing can cast new light on quantitative results by increasing the accuracy of the inferences made. These include instances where: (1) quantitative indicators of poor respondent behaviour (e.g., acquiescence bias on agree/disagree questions) are over-estimates; (2) similar quantitative response distributions across satisfaction and behavioural questions (from a fully-labelled versus end-labelled experiment) suggest similar respondent satisficing behaviour, whereas cognitive interviews show that different response processes are at work; and (3) unlikely quantitative findings (from an experiment comparing 3 versus 7 or 8 response options) could easily be dismissed as due to chance but were instead the result of unforeseen respondent difficulties. The paper concludes with a discussion of the value of a cognitive interviewing follow-up study as a tool in the interpretation of ambiguous quantitative findings.


Keywords: Cognitive interviewing · Satisficing · Acquiescence · End-labelled · Polar point · Number of response options



The support of the UK Economic and Social Research Council (ESRC) is gratefully acknowledged. This work was funded by Grant number RES-175-25-0007. The authors also thank core project team members: Gerry Nicolaas (Ipsos MORI), Peter Lynn and Annette Jäckle (Institute for Social and Economic Research).



Copyright information

© Springer Science+Business Media Dordrecht 2015

Authors and Affiliations

  • Pamela Campanelli (1)
  • Michelle Gray (2)
  • Margaret Blake (2)
  • Steven Hope (3)

  1. The Survey Coach, Colchester, UK
  2. NatCen Social Research, London, UK
  3. Institute of Child Health, University College London, London, UK
