Journal of Gastrointestinal Surgery, Volume 23, Issue 2, pp 367–376

Do Diagnostic and Procedure Codes Within Population-Based, Administrative Datasets Accurately Identify Patients with Rectal Cancer?

  • Reilly P. Musselman
  • Tara Gomes
  • Deanna M. Rothwell
  • Rebecca C. Auer
  • Husein Moloo
  • Robin P. Boushey
  • Carl van Walraven
Original Article



Background

Procedural and diagnostic codes may inaccurately identify specific patient populations within administrative datasets.


Objective

To measure the accuracy of previously used coding algorithms that identify patients with rectal cancer resections (RCR) within administrative data.


Methods

Using a previously published coding algorithm, we re-created an RCR cohort within administrative databases, limiting the search to a single institution. The accuracy of this cohort was determined against a gold standard reference population. A systematic review of the literature was then performed to identify studies that used similar coding methods to identify RCR cohorts, and to determine whether they reported on accuracy.


Results

Over the course of the study period, there were 664,075 hospitalizations at our institution. Previously used coding algorithms identified 1131 RCRs (administrative data incidence 1.70 per 1000 hospitalizations). The gold standard reference population contained 821 RCRs over the same period (1.24 per 1000 hospitalizations). Administrative data methods yielded an RCR cohort with moderate accuracy (sensitivity 89.5%, specificity 99.9%) but poor positive predictive value (64.9%). The literature search identified 18 studies that utilized similar coding methods to derive an RCR cohort; only 1 of 18 (5.6%) reported on the accuracy of its cohort.
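The reported sensitivity, specificity, and positive predictive value all follow from a standard 2×2 confusion matrix. As a minimal sketch (the cell counts tp, fp, and fn below are approximate reconstructions from the figures above, not values taken from the paper), the relationship can be written as:

```python
# Sketch only (assumed counts, not the authors' code): mapping the reported
# accuracy metrics onto a 2x2 confusion matrix. tp, fp, and fn are approximate
# reconstructions from the cohort sizes and metrics reported above.
tp = 734                      # algorithm hits confirmed by the gold standard
fp = 397                      # algorithm hits absent from the gold standard
fn = 87                       # gold-standard RCRs the algorithm missed
tn = 664_075 - tp - fp - fn   # all remaining hospitalizations

sensitivity = tp / (tp + fn)  # share of true RCRs the algorithm captured
specificity = tn / (tn + fp)  # share of non-RCRs correctly excluded
ppv = tp / (tp + fp)          # share of flagged cases that are true RCRs

print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f} ppv={ppv:.3f}")
```

The sketch also illustrates the base-rate effect behind the findings: because true RCRs are rare (about 1.24 per 1000 hospitalizations), even a specificity of 99.9% leaves a few hundred false positives among the ~663,000 non-case hospitalizations, which is enough to pull the positive predictive value down to roughly 65%.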


Conclusions

The use of diagnostic and procedure codes to identify RCRs within administrative datasets may be subject to misclassification bias because of low positive predictive value. This underscores the importance of reporting the accuracy of RCR cohorts derived from population-based datasets.


Keywords: Rectal cancer; Administrative data


Authors’ Contributions

All authors contributed to the concept and design of the study, as well as drafting and revising of the manuscript. All authors gave final approval for the manuscript. Authors Musselman, Gomes, Rothwell, Auer, and van Walraven were directly involved in data collection and analysis.


Funding

This study was supported by the Institute for Clinical Evaluative Sciences (ICES), which is funded by an annual grant from the Ontario Ministry of Health and Long-Term Care (MOHLTC). The opinions, results, and conclusions reported in this paper are those of the authors and are independent from the funding sources. No endorsement by ICES or the Ontario MOHLTC is intended or should be inferred. Parts of this material are based on data and information compiled and provided by the Canadian Institute for Health Information (CIHI). However, the analyses, conclusions, opinions, and statements expressed herein are those of the authors, and not necessarily those of CIHI.

Compliance with Ethical Standards

Conflict of Interest

The authors declare that they have no conflicts of interest.



Copyright information

© The Society for Surgery of the Alimentary Tract 2018

Authors and Affiliations

  • Reilly P. Musselman (1)
  • Tara Gomes (2)
  • Deanna M. Rothwell (3, 4)
  • Rebecca C. Auer (1)
  • Husein Moloo (1)
  • Robin P. Boushey (1)
  • Carl van Walraven (2, 3)

  1. Division of General Surgery, University of Ottawa, Ottawa, Canada
  2. Institute for Clinical Evaluative Sciences, Toronto, Canada
  3. Ottawa Hospital Research Institute, Ottawa, Canada
  4. The Ottawa Hospital, Ottawa, Canada
