Quality of Life Research, Volume 12, Issue 3, pp 261–274

The potential synergy between cognitive models and modern psychometric models

  • Jakob B. Bjorner
  • John E. Ware Jr.
  • Mark Kosinski

Abstract

Analyses of cognitive aspects of survey methodology (CASM) and psychometric analysis are two approaches that can complement each other. We use concrete examples to illustrate how psychometric analyses can test hypotheses from CASM. The psychometric framework recognizes that survey responses are affected by factors other than the concept being assessed, for example by cognitive factors and processes. Such factors are subsumed under the concept of measurement error. Possible sources of measurement error can be tested, e.g. by randomized experiments. A standard way to reduce measurement error is to ask several questions about the same concept and combine the answers into a multi-item scale that is more precise than the individual items. Techniques such as structural equation models use the item correlations to assess the magnitude of measurement error and to test the assumptions behind the multi-item scale, e.g. the effect of common response choices and item time frames. A central problem in modern psychometrics is how to model the mapping of the continuous latent variable onto the item response choice categories. This is achieved by threshold models (e.g. item response models and structural equation models for categorical data). These models can, for example, analyze the impact of mode of administration, test whether the items function in the same way for all people (measurement invariance/differential item functioning), and examine the consistency of responses from any single person. Such analyses provide new possibilities for combining psychometrics and cognitive methods.

Keywords: Cognitive aspects · Item response models · Psychometrics · Questionnaires · Structural equation models
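
As an illustrative sketch of the threshold models mentioned in the abstract (not material taken from the article itself), Samejima's graded response model expresses the probability that person j with latent level \theta_j chooses category k or higher of item i as

P(X_{ij} \ge k \mid \theta_j) = \frac{1}{1 + \exp[-a_i(\theta_j - b_{ik})]},

and the probability of choosing exactly category k as the difference of adjacent cumulative probabilities,

P(X_{ij} = k \mid \theta_j) = P(X_{ij} \ge k \mid \theta_j) - P(X_{ij} \ge k+1 \mid \theta_j).

Here a_i is the discrimination of item i and b_{i1} < b_{i2} < \ldots are its category thresholds, i.e. the points on the latent continuum at which responses cross from one category to the next. The symbols \theta_j, a_i and b_{ik} are generic item response theory notation, not notation defined in the article.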



Copyright information

© Kluwer Academic Publishers 2003

Authors and Affiliations

  • Jakob B. Bjorner (1, 2)
  • John E. Ware Jr. (2)
  • Mark Kosinski (2)
  1. National Institute of Occupational Health, Denmark
  2. QualityMetric Inc., Rhode Island, USA
