
An Exploratory Evaluation Framework for e-Clinical Data Management Performance

  • HyunJu Lee
  • Sangwon Lee
Clinical Trials

Abstract

Electronic data management is becoming important for reducing the overall cost and run-time of clinical trials while enhancing data quality. It is also imperative to meet regulatory guidelines for the overall quality and safety of electronic clinical trials. The purpose of this paper is to develop an exploratory performance evaluation framework for e-clinical data management. The study uses a Delphi survey conducted over three iterative rounds to develop an exploratory framework based on key informants’ knowledge. Four key metrics are proposed in the areas of infrastructure, intellectual preparation, study implementation, and study completion, covering the major aspects of clinical trial processes. The performance measures evaluate the extent of regulatory compliance, data quality, cost, and efficiency of the electronic data management process, and measurement indicators are provided for each evaluation item. Based on the key metrics, the performance evaluation framework is developed for the three major parties involved in clinical data management: the clinical site, monitoring, and the data coordinating center. Building on this initial attempt to evaluate electronic data management in clinical trials through a Delphi survey, further empirical studies are planned and recommended.
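The Delphi rounds described above depend on measuring how strongly the key informants agree before evaluation items are retained or revised. As an illustration only (the abstract does not specify which consensus statistic the authors used, and the panel data below are hypothetical), the following minimal Python sketch computes Kendall's coefficient of concordance W, a nonparametric measure commonly used to gauge panel agreement across Delphi rounds.

    import numpy as np
    from scipy.stats import rankdata

    def kendalls_w(ratings):
        """Kendall's W for an (n_raters x n_items) score matrix; 1.0 = full agreement.

        Ties are rank-averaged; the small-sample tie correction is omitted.
        """
        ranks = np.apply_along_axis(rankdata, 1, np.asarray(ratings, dtype=float))
        m, n = ranks.shape                    # m raters, n evaluation items
        rank_sums = ranks.sum(axis=0)         # per-item rank totals across raters
        s = ((rank_sums - rank_sums.mean()) ** 2).sum()
        return 12.0 * s / (m ** 2 * (n ** 3 - n))

    # Hypothetical round of ratings: 3 panel members scoring 4 candidate items.
    panel = [
        [4, 5, 3, 2],
        [4, 4, 3, 1],
        [5, 5, 2, 2],
    ]
    print(round(kendalls_w(panel), 2))  # ~0.86; values near 1 suggest consensus

In a Delphi setting, a statistic like W would typically be tracked from round to round and the survey stopped once agreement stabilizes above a pre-agreed threshold.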

Keywords

clinical trials, data management, electronic data capture system, performance metrics

Copyright information

© Drug Information Association, Inc 2012

Authors and Affiliations

  1. The Catholic University of Korea, Seoul, Korea
  2. Wonkwang University, Iksan, South Korea
