Optimal healthcare decision making under multiple mathematical models: application in prostate cancer screening


Important decisions related to human health, such as screening strategies for cancer, need to be made without a satisfactory understanding of the underlying biological and other processes. Rather, they are often informed by mathematical models that approximate reality. Often multiple models have been developed to study the same phenomenon, which may lead to conflicting decisions. It is natural to seek a decision making process that identifies decisions that all models find to be effective, and we propose such a framework in this work. We apply the framework in prostate cancer screening to identify prostate-specific antigen (PSA)-based strategies that perform well under all considered models. We use heuristic search to identify strategies that trade off between optimizing the average across all models’ assessments and being “conservative” by optimizing the most pessimistic model assessment. We identified three recently published mathematical models that can estimate quality-adjusted life expectancy (QALE) of PSA-based screening strategies and identified 64 strategies that trade off between maximizing the average and the most pessimistic model assessments. All prescribe PSA thresholds that increase with age, and 57 involve biennial screening. Strategies with higher assessments under the pessimistic model start screening later, stop screening earlier, and use higher PSA thresholds at earlier ages. The 64 strategies outperform 22 previously published expert-generated strategies. The 41 most “conservative” ones remained better than no screening under all models in extensive sensitivity analyses. We augment current comparative modeling approaches by identifying strategies that perform well under all models, for various degrees of decision makers’ conservativeness.




1. Albertsen P, Hanley J, Fine J (2005) 20-year outcomes following conservative management of clinically localized prostate cancer. JAMA 293(17):2095–2101
2. Andriole GL, Crawford ED, Grubb RL, Buys SS, Chia D, Church TR, Fouad MN, Gelmann EP, Kvale PA, Reding DJ, Weissfeld JL, Yokochi LA, O’Brien B, Clapp JD, Rathmell JM, Riley TL, Hayes RB, Kramer BS, Izmirlian G, Miller AB, Pinsky PF, Prorok PC, Gohagan JK, Berg CD (2009) Mortality results from a randomized prostate-cancer screening trial. N Engl J Med 360(13):1310–1319
3. Arias E (2010) United States life tables, 2006. Natl Vital Stat Rep 58(21):1–40
4. Aus G, Robinson D, Rosell J, Sandblom G, Varenhorst E (2005) Survival in prostate carcinoma—outcomes from a prospective, population-based cohort of 8887 men with up to 15 years of follow-up. Cancer 103(5):943–951
5. Bertsimas D, Tsitsiklis J (1997) Introduction to linear optimization. Athena Scientific
6. Bojke L, Claxton K, Sculpher M, Palmer S (2009) Characterizing structural uncertainty in decision analytic models: a review and application of methods. Value Health 12(5):739–749
7. Briggs AH, Weinstein MC, Fenwick EA, Karnon J, Sculpher MJ, Paltiel AD (2012) Model parameter estimation and uncertainty analysis: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force Working Group-6. Med Decis Making 32(5):722–732
8. Bubendorf L, Schöpfer A, Wagner U, Sauter G, Moch H, Willi N, Gasser T, Mihatsch M (2000) Metastatic patterns of prostate cancer: an autopsy study of 1,589 patients. Hum Pathol 31(5):578–583
9. U.S. Cancer Statistics Working Group (2016) United States Cancer Statistics: 1999–2013 Incidence and Mortality Web-based Report. http://www.cdc.gov/uscs. Accessed July 31, 2016
10. Cuzick J, Thorat MA, Andriole G, Brawley OW, Brown PH, Culig Z, Eeles RA, Ford LG, Hamdy FC, Holmberg L, Ilic D, Key TJ, La Vecchia C, Lilja H, Marberger M, Meyskens FL, Minasian LM, Parker C, Parnes HL, Perner S, Rittenhouse H, Schalken J, Schmid HP, Schmitz-Dräger BJ, Schröder FH, Stenzl A, Tombal B, Wilt TJ, Wolk A (2014) Prevention and early detection of prostate cancer. Lancet Oncol 15(11):E484–E492
11. Dowdy DW, Houben R, Cohen T, Pai M, Cobelens F, Vassall A, Menzies NA, Gomez GB, Langley I, Squire SB, White R (2014) Impact and cost-effectiveness of current and future tuberculosis diagnostics: the contribution of modelling. Int J Tuberc Lung Dis 18(9):1012–1018
12. Draisma G, Etzioni R, Tsodikov A, Mariotto A, Wever E, Gulati R, Feuer E, de Koning H (2009) Lead time and overdiagnosis in prostate-specific antigen screening: importance of methods and context. J Natl Cancer Inst 101(6):374–383
13. Draper D (1995) Assessment and propagation of model uncertainty. J R Stat Soc Series B Stat Methodol 57(1):45–97
14. Eaton JW, Menzies NA, Stover J, Cambiano V, Chindelevitch L, Cori A, Hontelez JA, Humair S, Kerr CC, Klein DJ, Mishra S, Mitchell KM, Nichols BE, Vickerman P, Bakker R, Bärnighausen T, Bershteyn A, Bloom DE, Boily MC, Chang ST, Cohen T, Dodd PJ, Fraser C, Gopalappa C, Lundgren J, Martin NK, Mikkelsen E, Mountain E, Pham QD, Pickles M, Phillips A, Platt L, Pretorius C, Prudden HJ, Salomon JA, van de Vijver DA, de Vlas SJ, Wagner BG, White RG, Wilson DP, Zhang L, Blandford J, Meyer-Rath G, Remme M, Revill P, Sangrujee N, Terris-Prestholt F, Doherty M, Shaffer N, Easterbrook PJ, Hirnschall G, Hallett TB (2014) Health benefits, costs, and cost-effectiveness of earlier eligibility for adult antiretroviral therapy and expanded treatment coverage: a combined analysis of 12 mathematical models. Lancet Glob Health 2(1):e23–34
15. Eddy DM (1980) Screening for cancer: theory, analysis, and design. Prentice Hall
16. Eddy DM, Hollingworth W, Caro JJ, Tsevat J, McDonald KM, Wong JB (2012) Model transparency and validation: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force-7. Value Health 15(6):843–850
17. Etzioni R, Gulati R (2013) Response: reading between the lines of cancer screening trials: using modeling to understand the evidence. Med Care 51(4):304–306
18. Etzioni R, Tsodikov A, Mariotto A, Szabo A, Falcon S, Wegelin J, diTommaso D, Karnofski K, Gulati R, Penson DF, Feuer E (2008) Quantifying the role of PSA screening in the US prostate cancer mortality decline. Cancer Causes Control 19(2):175–181
19. Ferlay J, Soerjomataram I, Ervik M, Dikshit R, Eser S, Mathers C, Rebelo M, Parkin DM, Forman D, Bray F (2014) GLOBOCAN 2012 v1.1, Cancer Incidence and Mortality Worldwide: IARC CancerBase No. 11. International Agency for Research on Cancer, Lyon, France. http://globocan.iarc.fr. Accessed July 31, 2016
20. Greco S, Ehrgott M, Figueira JR (eds) (2014) Multiple criteria decision analysis: state of the art surveys. Springer
21. Gilboa I, Schmeidler D (1989) Maxmin expected utility with a non-unique prior. J Math Econom 18(2):141–153
22. Ghani KR, Grigor K, Tulloch DN, Bollina PR, McNeill SA (2005) Trends in reporting Gleason score 1991 to 2001: changes in the pathologist’s practice. Eur Urol 47(2):196–201
23. Ghirardato P, Maccheroni F, Marinacci M (2004) Differentiating ambiguity and ambiguity attitude. J Econ Theory 118(2):133–173
24. Gulati R, Gore JL, Etzioni R (2013) Comparative effectiveness of alternative PSA-based prostate cancer screening strategies. Ann Intern Med 158(3):145–153
25. Gulati R, Tsodikov A, Wever EM, Mariotto AB, Heijnsdijk EAM, Katcher J, de Koning HJ, Etzioni R (2012) The impact of PLCO control arm contamination on perceived PSA screening efficacy. Cancer Causes Control 23(6):827–835
26. Haas GP, Delongchamps NB, Jones RF, Chandan V, Serio AM, Vickers AJ, Jumbelic M, Threatte G, Korets R, Lilja H, de la Roza G (2007) Needle biopsies on autopsy prostates: sensitivity of cancer detection based on true prevalence. J Natl Cancer Inst 99(19):1484–1489
27. Habbema JDF, Schechter CB, Cronin KA, Clarke LD, Feuer EJ (2006) Modeling cancer natural history, epidemiology, and control: reflections on the CISNET breast group experience. J Natl Cancer Inst Monogr 2006(36):122–126
28. Center for the Evaluation of Value and Risk in Health (2015) The Cost-Effectiveness Analysis Registry. Institute for Clinical Research and Health Policy Studies, Tufts Medical Center, Boston, MA. www.cearegistry.org. Accessed October 3, 2015
29. Heijnsdijk EA, Wever EM, Auvinen A, Hugosson J, Ciatto S, Nelen V, Kwiatkowski M, Villers A, Páez A, Moss SM, Zappa M, Tammela TL, Mäkinen T, Carlsson S, Korfage IJ, Essink-Bot ML, Otto SJ, Draisma G, Bangma CH, Roobol MJ, Schröder FH, de Koning HJ (2012) Quality-of-life effects of prostate-specific antigen screening. N Engl J Med 367(7):595–605
30. Ilic D, Neuberger MM, Djulbegovic M, Dahm P (2013) Screening for prostate cancer. Cochrane Database Syst Rev (1):CD004720
31. Kjellman A, Akre O, Norming U, Törnblom M, Gustafsson O (2009) 15-year followup of a population based prostate cancer screening study. J Urol 181(4):1615–1621
32. Kobayashi T, Goto R, Ito K, Mitsumori K (2007) Prostate cancer screening strategies with re-screening interval determined by individual baseline prostate-specific antigen values are cost-effective. Eur J Surg Oncol 33(6):783–789
33. Kong CY, Kroep S, Curtius K, Hazelton WD, Jeon J, Meza R, Heberle CR, Miller MC, Choi SE, Lansdorp-Vogelaar I, van Ballegooijen M, Feuer EJ, Inadomi JM, Hur C, Luebeck EG (2014) Exploring the recent trend in esophageal adenocarcinoma incidence and mortality using comparative simulation modeling. Cancer Epidemiol Biomarkers Prev 23(6):997–1006
34. de Koning HJ, Meza R, Plevritis SK, ten Haaf K, Munshi VN, Jeon J, Erdogan SA, Kong CY, Han SS, van Rosmalen J, Choi SE, Pinsky PF, de Gonzalez AB, Berg CD, Black WC, Tammemägi MC, Hazelton WD, Feuer EJ, McMahon PM (2014) Benefits and harms of computed tomography lung cancer screening strategies: a comparative modeling study for the U.S. Preventive Services Task Force. Ann Intern Med 160(5):311–320
35. Krahn MD, Mahoney JE, Eckman MH, Trachtenberg J, Pauker SG, Detsky AS (1994) Screening for prostate cancer: a decision analytic view. JAMA 272(10):773–780
36. Kuntz KM, Lansdorp-Vogelaar I, Rutter CM, Knudsen AB, van Ballegooijen M, Savarino JE, Feuer EJ, Zauber AG (2011) A systematic comparison of microsimulation models of colorectal cancer: the role of assumptions about adenoma progression. Med Decis Making 31(4):530–539
37. Labrie F, Candas B, Cusan L, Gomez JL, Bélanger A, Brousseau G, Chevrette E, Lévesque J (2004) Screening decreases prostate cancer mortality: 11-year follow-up of the 1988 Quebec prospective randomized controlled trial. Prostate 59(3):311–318
38. Lee SJ, Zelen M (2002) Statistical models for screening: planning public health programs. In: Beam C (ed) Biostatistical applications in cancer research. Springer, US, pp 19–36
39. Mandelblatt JS, Cronin KA, Bailey S, Berry DA, de Koning HJ, Draisma G, Huang H, Lee SJ, Munsell M, Plevritis SK, Ravdin P, Schechter CB, Sigal B, Stoto MA, Stout NK, van Ravesteyn NT, Venier J, Zelen M, Feuer EJ (2009) Effects of mammography screening under different screening schedules: model estimates of potential benefits and harms. Ann Intern Med 151(10):738–747
40. Messing EM, Manola J, Yao J, Kiernan M, Crawford D, Wilding G, di’SantAgnese PA, Trump D (2006) Immediate versus deferred androgen deprivation treatment in patients with node-positive prostate cancer after radical prostatectomy and pelvic lymphadenectomy. Lancet Oncol 7(6):472–479
41. National Academies of Science (2012) Assessing the reliability of complex models: mathematical and statistical foundations of verification, validation and uncertainty quantification. http://www.nap.edu/catalog/13395/assessing-the-reliability-of-complex-models-mathematical-and-statisticalfoundations
42. National Cancer Institute (2008) Surveillance, Epidemiology, and End Results. http://seer.cancer.gov
43. Oesterling JE, Jacobsen SJ, Chute CG, Guess HA, Girman CJ, Panser LA, Lieber MM (1993) Serum prostate-specific antigen in a community-based population of healthy men. Establishment of age-specific reference ranges. JAMA 270(7):860–864
44. Ross KS, Carter HB, Pearson JD, Guess HA (2000) Comparative efficiency of prostate-specific antigen screening strategies for prostate cancer detection. JAMA 284(11):1399–1405
45. Sandblom G, Varenhorst E, Rosell J, Löfman O, Carlsson P (2011) Randomised prostate cancer screening trial: 20 year follow-up. BMJ 342:d1539
46. Scardino PT, Beck JR, Miles BJ (1994) Conservative management of prostate cancer. N Engl J Med 330(25):1831
47. Schröder FH, Hugosson J, Roobol MJ, Tammela TL, Ciatto S, Nelen V, Kwiatkowski M, Lujan M, Lilja H, Zappa M, Denis LJ, Recker F, Berenguer A, Määttänen L, Bangma CH, Aus G, Villers A, Rebillard X, van der Kwast T, Blijenberg BG, Moss SM, de Koning HJ, Auvinen A (2009) Screening and prostate-cancer mortality in a randomized European study. N Engl J Med 360(13):1320–1328
48. Tsodikov A, Szabo A, Wegelin J (2006) A population model of prostate cancer incidence. Stat Med 25(16):2846–2866
49. Underwood DJ, Zhang J, Denton BT, Shah ND, Inman BA (2012) Simulation optimization of PSA-threshold based prostate cancer screening policies. Health Care Manag Sci 15(4):293–309
50. Weinstein MC, O’Brien B, Hornberger J, Jackson J, Johannesson M, McCabe C, Luce BR (2003) Principles of good practice for decision analytic modeling in health-care evaluation: report of the ISPOR Task Force on Good Research Practices—Modeling Studies. Value Health 6(1):9–17
51. Zauber AG, Lansdorp-Vogelaar I, Knudsen AB, Wilschut J, van Ballegooijen M, Kuntz KM (2008) Evaluating test strategies for colorectal cancer screening: a decision analysis for the U.S. Preventive Services Task Force. Ann Intern Med 149(9):659–669
52. Zelen M, Feinleib M (1969) On the theory of screening for chronic diseases. Biometrika 56(3):601–614
53. Zhang J, Denton B, Balasubramanian H, Shah N, Inman B (2012) Optimization of PSA screening policies: a comparison of the patient and societal perspectives. Med Decis Making 32(2):337–349


Author information



Corresponding author

Correspondence to John Silberholz.


Appendix A: Literature Review

We searched PubMed (January 1, 2010, through October 3, 2015) using the following query:


We also searched the Tufts Cost-Effectiveness Analysis Registry [28] (from inception to October 3, 2015) for the term “prostate”. Two citations were retrieved in full text, but were also identified in the PubMed searches. Fig. 5 shows the results of searches and reasons for exclusion.

Fig. 5

Literature identification

One modification was required to use Model Z [53] to assess an arbitrary PSA-based screening strategy. Given a patient’s cancer status, Model Z assigns a probability that the patient will have a PSA value in the ranges [0,1), [1,2.5), [2.5,4), [4,7), [7,10), and [10,∞). We assume all PSA values in a range are equally likely to occur, and we limit the highest range to PSA values between 10 ng/mL and 20 ng/mL.
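Under this assumption, drawing a PSA value for a patient might be implemented as follows (an illustrative sketch, not the authors’ code; the range endpoints mirror the list above, with the top range truncated at 20 ng/mL):

```python
import random

# PSA ranges used by Model Z (ng/mL); the open-ended top range
# [10, inf) is truncated to [10, 20] per the assumption above.
PSA_RANGES = [(0.0, 1.0), (1.0, 2.5), (2.5, 4.0),
              (4.0, 7.0), (7.0, 10.0), (10.0, 20.0)]

def sample_psa(range_index, rng=random):
    """Draw a PSA value uniformly at random from the given range."""
    lo, hi = PSA_RANGES[range_index]
    return rng.uniform(lo, hi)
```

In practice the range index would itself be drawn from Model Z’s cancer-status-conditional distribution; only the within-range sampling is shown here.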

Appendix B: Sensitivity Analysis

We varied parameters based on sensitivity ranges used in the papers describing each model. The variable names are taken from the respective papers.

B.1: Model G [24], parameters governing the course of the disease

We varied each of the following six parameters to the maximum and minimum value in the 100 sets of sensitivity parameters used in Gulati et al. [24].

  • grade.onset.rate: A rate controlling how quickly patients experience prostate cancer onset.

  • grade.metastasis.rate: A rate controlling how quickly patients with undetected prostate cancer experience metastasis.

  • grade.clinical.rate.baseline: A rate controlling how quickly a patient’s cancer is clinically detected.

  • grade.clinical.rate.distant: A rate controlling how quickly a patient’s cancer is clinically detected after metastasis.

  • grade.clinical.rate.high: A rate controlling how quickly a patient’s high-grade cancer is clinically detected.

  • low.grade.slope: A parameter controlling the likelihood that a patient who developed cancer has a low-grade cancer.

B.2: Model U [49], parameters governing the course of the disease

We varied parameters using sensitivity ranges from [49].

  • d_t: The rate of other-cause (non-prostate cancer) mortality at age t was varied ±20% from the base-case parameter values from [3, 42].

  • w_t: The prostate cancer incidence rate for a man at age t was varied using sensitivity ranges from [8]: [0.00020, 0.00501] for patients aged 40–49; [0.00151, 0.00491] for ages 50–59; [0.00243, 0.00852] for ages 60–69; [0.00522, 0.01510] for ages 70–79; and [0.00712, 0.01100] for ages 80 and older.

  • b_t: The annual probability of metastasis among patients with detected cancer treated with radical prostatectomy was varied ±20% from the base-case parameter value of 0.006, derived from the Mayo Clinic Radical Prostatectomy Repository.

  • e_t: The annual probability of metastasis among patients with undetected cancer was varied ±20% from the base-case parameter value of 0.069 from [22, 46].

  • z_t: The annual probability of dying from prostate cancer among men aged t with metastatic disease was varied using the sensitivity range [0.07, 0.37] from [4, 40] around the base-case values of 0.074 for patients aged 40–64 and 0.070 for patients aged 75 and older [42].

  • f: The probability of a biopsy detecting cancer in a patient with prostate cancer was varied ±20% from its base-case value of 0.8 from [26].

B.3: Model Z [53], parameters governing the course of the disease

The sensitivity analyses in [53] did not vary any parameters governing the course of the disease; they pertained only to costs and literature-derived quality-of-life decrements. Because of the similarities with model U, we used the sensitivity ranges from model U for model Z’s d_t, w_t, and f parameters, additionally varying the following parameters:

  • b_t: The annual probability of a man of age t with detected prostate cancer treated with radical prostatectomy dying of the disease was varied ±20% from its base-case value of 0.0067 for men aged 40–64 and 0.0092 for men aged 65 and older [42].

  • e_t: The annual probability of a man of age t with undetected prostate cancer dying of the disease was varied ±20% from its base-case value of 0.033 from [1].

B.4: All models, quality-of-life decrements

Each model in this work uses the literature-based quality-of-life decrements (utility weights) from the re-analysis of the ERSPC study in [29]. The sensitivity analysis ranges used in that work are as follows:

  • Screening attendance: The utility estimate for the week following screening was varied in range [0.99, 1.00] from base estimate 0.99.

  • Biopsy: The utility estimate for the three weeks following biopsy was varied in range [0.87, 0.94] from base estimate 0.90.

  • Cancer diagnosis: The utility estimate for the month following cancer diagnosis was varied in range [0.75, 0.85] from base estimate 0.80.

  • Radiation therapy: The utility estimate for the first two months after radiation therapy was varied in range [0.71, 0.91] from base estimate 0.73, and the utility estimate for the next 10 months after radiation therapy was varied in range [0.61, 0.88] from base estimate 0.78.

  • Radical prostatectomy: The utility estimate for the first two months after radical prostatectomy was varied in range [0.56, 0.90] from base estimate 0.67, and the utility estimate for the next 10 months after radical prostatectomy was varied in range [0.70, 0.91] from base estimate 0.77.

  • Active surveillance: The utility estimate for the first seven years of active surveillance was varied in range [0.85, 1.00] from base estimate 0.97.

  • Postrecovery period: The utility estimate for years 1–10 following radical prostatectomy or radiation therapy was varied in range [0.93, 1.00] from base estimate 0.95.

  • Palliative therapy: The utility estimate during 30 months of palliative therapy was varied in range [0.24, 0.86] from base estimate 0.60.

  • Terminal illness: The utility estimate during six months of terminal illness was varied in range [0.24, 0.40] from base estimate 0.40.
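To illustrate how such utility weights enter a QALE computation, consider a toy calculation over a hypothetical event sequence using the base-case weights listed above (the sequence and durations here are illustrative, not taken from any of the models):

```python
# (utility weight, duration in years) for a hypothetical patient history,
# using base-case weights from the list above.
events = [
    (0.99, 1 / 52),   # screening attendance: 1 week
    (0.90, 3 / 52),   # biopsy: 3 weeks
    (0.80, 1 / 12),   # cancer diagnosis: 1 month
    (0.67, 2 / 12),   # radical prostatectomy: first 2 months
    (0.77, 10 / 12),  # prostatectomy recovery: next 10 months
    (0.95, 10.0),     # postrecovery period: 10 years
]

# QALE contribution is the utility-weighted sum of time in each state.
qale = sum(utility * years for utility, years in events)
```

A sensitivity analysis then replaces each base-case weight with the endpoints of its range and recomputes this sum.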

Appendix C: Building an Efficient Frontier of Screening Strategies

Given a screening strategy s, let A(s) be the average assessment of the strategy across all mathematical models and let P(s) be the most pessimistic assessment across all mathematical models. To construct an efficient frontier of strategies trading off the average and most pessimistic assessments, we use mathematical optimization via an iterated local search heuristic to maximize the objective function λA(s) + (1−λ)P(s) for λ ∈ {0, 0.1, 0.2, …, 1.0}, separately over annual and biennial screening strategies, for a total of 22 optimizations. From the set of all screening strategies encountered during the optimization process (not just the final values identified through optimization), we construct an efficient frontier trading off the average and pessimistic assessments. A strategy encountered while optimizing with a given λ may not be optimal for that λ yet may still lie on the frontier, so the final efficient frontier may contain more than 22 efficient strategies.
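A minimal sketch of the scalarized objective and the dominance check used to extract the frontier (illustrative function names; each strategy’s per-model QALE gains over no screening are assumed precomputed):

```python
def objective(assessments, lam):
    """Scalarized objective lam*A(s) + (1-lam)*P(s) for one strategy,
    where assessments holds each model's QALE gain over no screening."""
    avg = sum(assessments) / len(assessments)
    pessimistic = min(assessments)
    return lam * avg + (1 - lam) * pessimistic

def efficient_frontier(strategies):
    """Keep strategies not dominated in (average, pessimistic) space.
    strategies: dict mapping strategy id -> list of model assessments."""
    points = {s: (sum(a) / len(a), min(a)) for s, a in strategies.items()}
    frontier = []
    for s, (avg, pes) in points.items():
        dominated = any(
            a2 >= avg and p2 >= pes and (a2 > avg or p2 > pes)
            for t, (a2, p2) in points.items() if t != s
        )
        if not dominated:
            frontier.append(s)
    return frontier
```

The real procedure applies this dominance filter to every strategy encountered across all 22 optimization runs.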

The key step in constructing the efficient frontier is solving max_{s∈S} λA(s) + (1−λ)P(s), where S is the set of all feasible screening strategies. We consider strategies with age-specific PSA cutoffs limited to 0.5, 1.0, 1.5, …, 6.0 ng/mL, fixed cutoffs within 5-year age ranges, and cutoffs that are non-decreasing in a patient’s age. We consider screening from ages 40 through 99, so there are 10.4 million possible screening strategies; as a result, it would be time-consuming to identify the strategy with the highest average incremental QALE over no screening by enumeration. Instead, we use constrained iterated local search to identify a locally optimal strategy that cannot be improved by changing a single age-specific PSA threshold.
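One way to see where the 10.4 million figure comes from (an assumption-checking sketch): a strategy chooses a contiguous block of k of the twelve 5-year age ranges in which to screen and a non-decreasing sequence of k thresholds drawn from the 12 candidate cutoffs, summed over all window lengths and positions:

```python
from math import comb

N_RANGES = 12      # five-year age ranges covering ages 40-99
N_THRESHOLDS = 12  # PSA cutoffs 0.5, 1.0, ..., 6.0 ng/mL

# For a screening window of length k there are (N_RANGES - k + 1) positions
# and comb(N_THRESHOLDS + k - 1, k) non-decreasing threshold sequences
# (multisets of size k from 12 values).
total = sum(
    (N_RANGES - k + 1) * comb(N_THRESHOLDS + k - 1, k)
    for k in range(1, N_RANGES + 1)
)
# total is the strategy count per screening frequency (~5.2 million);
# doubling for annual vs. biennial recovers ~10.4 million.
```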

The central step in the iterated local search is the local search itself, which takes as input a screening strategy s and a single age range r and evaluates a small number of strategies similar to s. For each possible PSA threshold (0.5, 1.0, …, 6.0 ng/mL), the local search procedure constructs a new strategy by modifying s to use that threshold in age range r, additionally making the smallest possible changes to the remaining PSA thresholds so that thresholds remain non-decreasing in age. Each of these 12 screening strategies is evaluated, and if any improves on s, the one with the best objective value replaces s. When r is either the first or last age range in which s screens, the procedure also considers a no-screening option for age range r.

As an example, consider a screening strategy in which annual screening is performed for ages 45–69, with cutoff 2.0 ng/mL for ages 45–49, 3.0 ng/mL for ages 50–54, and 5.0 ng/mL for ages 55–59, 60–64, and 65–69. We can write this screening strategy compactly as (2, 3, 5, 5, 5), with each value in the vector representing the cutoff for a 5-year period. If we apply local search to the cutoff for ages 55–59, then we consider changing the cutoff for that age range to each value in {0.5, 1.0, 1.5, …, 6.0} ng/mL, adjusting other cutoffs by the smallest amount needed to keep all cutoffs non-decreasing in age. For instance, if the cutoff for ages 55–59 were set to 2.5 ng/mL, then the cutoff for ages 50–54 would also need to be decreased to 2.5 ng/mL to maintain non-decreasing cutoffs, yielding the final screening strategy (2, 2.5, 2.5, 5, 5). The set of all possible screening strategies considered by a local search on age range 55–59 is:

  • (0.5, 0.5, 0.5, 5, 5)

  • (1, 1, 1, 5, 5)

  • (1.5, 1.5, 1.5, 5, 5)

  • (2, 2, 2, 5, 5)

  • (2, 2.5, 2.5, 5, 5)

  • (2, 3, 3, 5, 5)

  • (2, 3, 3.5, 5, 5)

  • (2, 3, 4, 5, 5)

  • (2, 3, 4.5, 5, 5)

  • (2, 3, 5, 5, 5)

  • (2, 3, 5.5, 5.5, 5.5)

  • (2, 3, 6, 6, 6)

Among these strategies, the one with the largest objective value λA(s) + (1−λ)P(s) is selected by the local search.
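The neighbor-generation step can be sketched as follows (a simplified illustration: strategies are tuples of per-age-range cutoffs, and the no-screening options at the boundary age ranges are omitted):

```python
THRESHOLDS = [0.5 * k for k in range(1, 13)]  # 0.5, 1.0, ..., 6.0 ng/mL

def neighbors(strategy, r):
    """All strategies obtained by setting age range r to each candidate
    threshold, minimally adjusting the other ranges so that cutoffs
    remain non-decreasing in age."""
    result = []
    for t in THRESHOLDS:
        s = list(strategy)
        s[r] = t
        # Earlier ranges may not exceed t; later ranges may not fall below t.
        for i in range(r):
            s[i] = min(s[i], t)
        for i in range(r + 1, len(s)):
            s[i] = max(s[i], t)
        result.append(tuple(s))
    return result
```

Applied to the worked example above, `neighbors((2, 3, 5, 5, 5), 2)` reproduces the 12 listed strategies, including (2, 2.5, 2.5, 5, 5).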

The iterated local search begins with a strategy of never screening for prostate cancer. The procedure repeatedly loops through a random permutation of the age ranges, performing local search on an age range if it is within 5 years of an age range in which the current strategy screens with PSA. The procedure terminates when the current screening strategy cannot be improved by applying local search to any valid age range.
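Under the same simplifications, the outer loop might look like the sketch below. The paper’s version starts from never screening and restricts local search to age ranges near those currently screened; this sketch omits those details and takes the evaluation and neighbor functions as arguments (hypothetical names):

```python
import random

def iterated_local_search(start, evaluate, neighbors, n_ranges, seed=0):
    """Loop over random permutations of the age ranges, moving to the best
    improving neighbor for each range; terminate when a full pass over all
    ranges yields no improvement (i.e., at a local optimum)."""
    rng = random.Random(seed)
    current, best = start, evaluate(start)
    improved = True
    while improved:
        improved = False
        order = list(range(n_ranges))
        rng.shuffle(order)
        for r in order:
            candidates = neighbors(current, r)
            vals = [evaluate(c) for c in candidates]
            v = max(vals)
            if v > best:
                current, best = candidates[vals.index(v)], v
                improved = True
    return current, best
```

At termination, no single-age-range change in the neighborhood improves the objective, matching the local-optimality condition described above.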

Appendix D: Details of Optimizing Screening Strategies

The iterated local search procedure was implemented in Python. The C source code for model G and the C++ source code for model U were provided by the authors of those works; model U was re-implemented in Python to improve the efficiency of the procedure. Model Z was implemented in Python based on the published description of that model. All procedures were tested on a Dell Precision T7600 with 128 GB RAM and two Intel Xeon E5-2687W processors, each with 8 cores and a clock speed of 3.1 GHz.

The runtime of the iterated local search procedure for each objective function is provided in Table 4.

Table 4 Computation time required for iterated local search procedure

To validate the performance of the local search optimization approach, we computed the exact optimal solution for models Z and U by evaluating all 5.2 million feasible biennial strategies and all 5.2 million feasible annual strategies with each model, a process that required 60.1 CPU-hours for model Z and 369.1 CPU-hours for model U. The local search heuristic had identified the globally optimal solution for both models. Given the heavy computational burden of evaluating strategies with model G, we did not compute exact optimal solutions for model G or for any of the objectives used to compute the efficient frontier.

Appendix E: Sensitivity Analysis: Model Averaging of Normalized Assessments

As a sensitivity analysis, we reproduced the efficient frontier using a normalized version of the objective function. For each model, we normalized the QALE change compared to not screening to have a maximum value of 1, ensuring that models with systematically more optimistic assessments of screening strategies are not weighted more heavily than others in the model averaging objective.
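This rescaling can be sketched as follows (hypothetical data layout: each model maps strategies to QALE gains over no screening):

```python
def normalize(assessments_by_model):
    """Rescale each model's QALE gains so its maximum is 1, so that
    systematically optimistic models do not dominate the model average."""
    normalized = {}
    for model, gains in assessments_by_model.items():
        peak = max(gains.values())
        normalized[model] = {s: g / peak for s, g in gains.items()}
    return normalized
```

After normalization, each strategy’s average and most pessimistic assessments are recomputed on the rescaled values and the frontier extraction proceeds as before.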

We computed an efficient frontier as before, trading off the average and most pessimistic assessment of the normalized objective function. The efficient frontier, single-model solutions, and expert strategies are plotted in Fig. 6.

Fig. 6

Average and most pessimistic assessments of identified and expert-generated screening strategies. The six strategies on the efficient frontier are shown as black circles. The optimal strategies according to models G (G-best), U (U-best), and Z (Z-best) are shown as red circles. The 22 expert-generated strategies are shown as gray circles. Normalized assessments of QALE gain over no screening with each model are shown in parentheses for some strategies; for example, strategy EF-1 achieved 0.8 of the maximum attainable QALE improvement under each of models G, U, and Z.

The efficient frontier with the normalized objective function is qualitatively different from the one with the non-normalized objective. No screening strategy optimized with a single model falls on the efficient frontier, and all strategies on the frontier dominate the single-model and expert-generated strategies in both the most pessimistic and the average model assessments. The frontier is smaller, comprising only six screening strategies, and those strategies are more homogeneous. The strategy optimizing the most pessimistic assessment prescribes biennial screening with threshold 0.5 ng/mL for ages 40–54, 1.5 ng/mL for ages 55–64, 4.0 ng/mL for ages 65–69, and 5.0 ng/mL for ages 70–74. The strategy optimizing the average assessment is similar, prescribing biennial screening with threshold 0.5 ng/mL for ages 40–49, 1.0 ng/mL for ages 50–59, 1.5 ng/mL for ages 60–64, 2.0 ng/mL for ages 65–69, and 6.0 ng/mL for ages 70–79. For all six strategies on the efficient frontier, model U gave the most pessimistic normalized assessment.

Cite this article

Bertsimas, D., Silberholz, J. & Trikalinos, T. Optimal healthcare decision making under multiple mathematical models: application in prostate cancer screening. Health Care Manag Sci 21, 105–118 (2018). https://doi.org/10.1007/s10729-016-9381-3



Keywords

  • Comparative modeling
  • Decision analysis
  • Sensitivity analysis
  • Model averaging
  • Optimization
  • Prostate cancer screening
  • Simulation modeling