Measurement invariance of the PROMIS pain interference item bank across community and clinical samples
This study examined the measurement invariance of responses to the Patient-Reported Outcomes Measurement Information System (PROMIS) pain interference (PI) item bank. The original PROMIS calibration sample (Wave I) was augmented with a sample of persons recruited from the American Chronic Pain Association (ACPA) to increase the number of participants reporting higher levels of pain. Establishing the measurement invariance of an item bank is essential for valid interpretation of group differences in the latent concept being measured.
Multi-group confirmatory factor analysis (MG-CFA) was used to evaluate successive levels of measurement invariance: configural, metric, and scalar invariance.
Support was found for configural and metric invariance of the PROMIS-PI, but not for scalar invariance.
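The testing sequence described above compares increasingly constrained nested models, and support for a level is typically judged by how much model fit degrades when its constraints are added. As an illustrative sketch (not taken from the article), the following applies the common ΔCFI ≤ .01 criterion (Cheung & Rensvold, 2002) to hypothetical fit values for the three nested models; the function name and the fit indices are assumptions for demonstration only.

```python
# Illustrative sketch (hypothetical values, not the article's results):
# decide which level of measurement invariance is supported by comparing
# the CFI of successive nested MG-CFA models. A drop in CFI greater than
# the cutoff (here .01, per Cheung & Rensvold, 2002) means the added
# equality constraints are not tenable, so testing stops at that level.

def supported_invariance(cfi_by_model, delta_cfi_cutoff=0.01):
    """Return the invariance levels supported, in order of testing."""
    levels = ["configural", "metric", "scalar"]
    supported = []
    previous_cfi = None
    for level in levels:
        cfi = cfi_by_model[level]
        # Constraining the model should not worsen CFI by more than the cutoff.
        if previous_cfi is not None and previous_cfi - cfi > delta_cfi_cutoff:
            break
        supported.append(level)
        previous_cfi = cfi
    return supported

# Hypothetical CFI values: scalar constraints degrade fit noticeably.
fits = {"configural": 0.975, "metric": 0.972, "scalar": 0.940}
print(supported_invariance(fits))  # ['configural', 'metric']
```

A pattern like this, where the scalar model fits substantially worse than the metric model, corresponds to the outcome reported here: loadings are equivalent across groups, but item intercepts/thresholds are not.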
Conclusions and recommendations
Based on the MG-CFA results, we recommend retaining the original parameter estimates obtained by combining the Wave I community sample with the ACPA participants. Future studies should extend this work by examining measurement equivalence within an item response theory framework, for example through differential item functioning analysis.
Keywords: Factor analysis · Pain interference · Pain measurement · Patient outcome measures · Psychometrics
Abbreviations
- ACPA: American Chronic Pain Association
- CFA: Confirmatory factor analysis
- IRT: Item response theory
- MG-CFA: Multi-group confirmatory factor analysis
- PROMIS: Patient-Reported Outcomes Measurement Information System