
Quality of Life Research, Volume 29, Issue 1, pp 201–211

Psychometric validation of PROMIS® Anxiety and Depression Item Banks for the Brazilian population

  • Natália Fontes Caputo de Castro
  • Rogério de Melo Costa Pinto
  • Tânia Maria da Silva Mendonça
  • Carlos Henrique Martins da Silva

Abstract

Purpose

Scientific evidence indicates that depression and anxiety symptoms may be understood as risk factors associated with the incidence and progression of chronic diseases. Given the lack of mental health assessment tools that meet strict methodological standards, the authors chose to validate the psychometric properties of the Anxiety and Depression Item Banks of the Emotional Distress domain of the Patient-Reported Outcomes Measurement Information System (PROMIS®) for the Brazilian population.

Methods

In this study, 606 adults completed the self-administered Anxiety and Depression Item Banks, which were calibrated using exploratory and confirmatory factor analyses and by fitting the Graded Response Model. Cross-cultural validity was assessed through Differential Item Functioning (DIF) analysis.
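The Graded Response Model used for calibration (Samejima's model) expresses the probability of each ordered response category as the difference between adjacent cumulative logistic curves. A minimal sketch of those category probabilities, with purely illustrative item parameters (not the study's actual estimates):

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Samejima's Graded Response Model: probability of each response
    category for a respondent at latent trait level theta, given item
    discrimination a and ordered category thresholds b_1 < ... < b_{k-1}."""
    # P*(j): cumulative probability of responding in category j or above
    def p_star(b):
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    cum = [1.0] + [p_star(b) for b in thresholds] + [0.0]
    # Category probability = difference of adjacent cumulative curves
    return [cum[j] - cum[j + 1] for j in range(len(cum) - 1)]

# Hypothetical parameters for a 5-category item (e.g. "never" .. "always")
probs = grm_category_probs(theta=0.0, a=2.0, thresholds=[-1.5, -0.5, 0.5, 1.5])
print([round(p, 3) for p in probs])  # five probabilities summing to 1
```

In actual calibration these item parameters are estimated from the response data (the study used MULTILOG); the sketch only shows how the fitted model maps a latent trait level onto response probabilities.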

Results

The factor analyses confirmed the unidimensionality of the Emotional Distress items (CFI = 0.96, TLI = 0.96, RMSEA = 0.05). The residual correlation matrix identified no item pairs with local dependence. Items flagged for DIF showed low impact for the gender, age, and language variables. The instrument demonstrated greater reliability in the moderate-to-severe range, with measurement error lowest between −1.0 and +3.0 on the latent trait scale.
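The reported CFI, TLI, and RMSEA are standard fit indices computed from the model and baseline (null) chi-square statistics. A sketch of the textbook formulas, with illustrative inputs rather than the study's actual estimation output:

```python
import math

def fit_indices(chi2, df, chi2_null, df_null, n):
    """Standard approximate-fit formulas for a confirmatory factor model:
    CFI and TLI compare the model to the baseline (independence) model;
    RMSEA scales model misfit by degrees of freedom and sample size."""
    d_model = max(chi2 - df, 0.0)        # model noncentrality
    d_null = max(chi2_null - df_null, 0.0)  # baseline noncentrality
    cfi = 1.0 - d_model / d_null if d_null > 0 else 1.0
    tli = ((chi2_null / df_null) - (chi2 / df)) / ((chi2_null / df_null) - 1.0)
    rmsea = math.sqrt(d_model / (df * (n - 1)))
    return cfi, tli, rmsea

# Illustrative chi-square values; n matches the study's sample size
cfi, tli, rmsea = fit_indices(chi2=150.0, df=100,
                              chi2_null=2000.0, df_null=120, n=606)
print(round(cfi, 3), round(tli, 3), round(rmsea, 3))
```

Conventional cutoffs (CFI/TLI ≥ 0.95, RMSEA ≤ 0.06) are what make the reported values evidence for unidimensionality.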

Conclusion

The psychometric measurements of the Anxiety and Depression Item Banks in the Brazilian version were equivalent to those in the original version. Additional research involving patients with different levels of emotional distress is necessary to better understand the results obtained in this study.

Keywords

PROMIS® · Emotional distress · PRO · IRT · Adults

Notes

Acknowledgements

We would like to thank the PROMIS® network for their technical support in the cultural adaptation, particularly the researchers at the Medical Social Sciences Department of Northwestern University (Chicago, USA), David Cella, PhD, and Helena Correia, and the FACITtrans Director, Benjamin Arnold (Elmhurst, USA). We thank the Research Support Foundation of the state of Minas Gerais (FAPEMIG) for the financial support (PPM-00303-08) and the Quality of Life research group of the School of Medicine of the Federal University of Uberlândia (FAMED-UFU).

Funding

This study was funded by the Research Support Foundation of the state of Minas Gerais (FAPEMIG) (Grant Number: PPM-00303-08).

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Ethics approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed consent

Informed consent was obtained from all individual participants included in the study.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Natália Fontes Caputo de Castro (1)
  • Rogério de Melo Costa Pinto (2)
  • Tânia Maria da Silva Mendonça (3)
  • Carlos Henrique Martins da Silva (4)

  1. Post-graduate Program in Health Sciences, Quality of Life Research Group, School of Medicine, Federal University of Uberlândia, Uberlândia, Brazil
  2. School of Mathematics, Quality of Life Research Group, School of Medicine, Federal University of Uberlândia, Uberlândia, Brazil
  3. School of Medicine, Quality of Life Research Group, Federal University of Uberlândia, Uberlândia, Brazil
  4. Pediatrics Department and Post-graduate Program in Health Sciences, Quality of Life Research Group, School of Medicine, Federal University of Uberlândia, Uberlândia, Brazil
