
Judgmental Bootstrapping: Inferring Experts’ Rules for Forecasting

  • Chapter
Principles of Forecasting

Part of the book series: International Series in Operations Research & Management Science (ISOR, volume 30)


Abstract

Judgmental bootstrapping is a type of expert system. It translates an expert’s rules into a quantitative model by regressing the expert’s forecasts against the information the expert used. Bootstrapping models apply an expert’s rules consistently, and many studies have shown that decisions and predictions from bootstrapping models are similar to those from the experts. Three studies showed that bootstrapping improved the quality of production decisions in companies. To date, research on forecasting with judgmental bootstrapping has been restricted primarily to cross-sectional data, not time-series data. Studies from psychology, education, personnel, marketing, and finance showed that bootstrapping forecasts were more accurate than forecasts made by experts using unaided judgment: more accurate in eight of eleven comparisons, less accurate in one, with two ties. The gains in accuracy were generally substantial. Bootstrapping can be useful when historical data on the variable to be forecast are lacking or of poor quality; otherwise, econometric models should be used. Bootstrapping is most appropriate for complex situations in which judgments are unreliable but experts’ judgments have some validity. When many forecasts are needed, bootstrapping is cost-effective. If experts differ greatly in expertise, bootstrapping can draw upon the forecasts made by the best experts. Bootstrapping aids learning; it can help to identify biases in the way experts make predictions, and it can reveal how the best experts make predictions. Finally, judgmental bootstrapping offers the possibility of conducting “experiments” when the historical data for causal variables have not varied over time. Thus, it can serve as a supplement to econometric models.
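
Because the chapter contains no code, the following is a minimal sketch of the basic procedure the abstract describes: regress an expert's forecasts on the cues the expert reports using, then apply the fitted model consistently to new cases. The data, cue names, and use of scikit-learn are illustrative assumptions, not material from the chapter.

```python
# A minimal sketch of judgmental bootstrapping on synthetic data.
# Everything here (cue set, weights, library choice) is hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Cues the expert claims to use (e.g., standardized income, debt ratio, tenure).
n_cases = 200
cues = rng.normal(size=(n_cases, 3))

# Simulate an expert whose implicit rule is roughly linear but applied inconsistently.
true_weights = np.array([0.6, -0.4, 0.2])
expert_forecasts = cues @ true_weights + rng.normal(scale=0.5, size=n_cases)

# Judgmental bootstrapping: regress the expert's own forecasts on the cues.
model = LinearRegression().fit(cues, expert_forecasts)
print("Inferred weights:", model.coef_)  # an explicit version of the expert's rules

# The fitted model applies those rules consistently to a new case.
new_case = np.array([[1.0, 0.5, -0.2]])
print("Model-of-the-expert forecast:", model.predict(new_case)[0])

# "Experiment": vary one causal cue that never varied historically
# and inspect the implied change in the forecast.
varied = new_case.copy()
varied[0, 1] += 1.0
print("Implied effect of raising cue 2 by one unit:",
      model.predict(varied)[0] - model.predict(new_case)[0])
```

Comparing such a model's out-of-sample forecasts with the expert's own judgments would reproduce, in miniature, the accuracy comparisons summarized above.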





Copyright information

© 2001 Springer Science+Business Media New York

About this chapter

Cite this chapter

Armstrong, J.S. (2001). Judgmental Bootstrapping: Inferring Experts’ Rules for Forecasting. In: Armstrong, J.S. (eds) Principles of Forecasting. International Series in Operations Research & Management Science, vol 30. Springer, Boston, MA. https://doi.org/10.1007/978-0-306-47630-3_9


  • DOI: https://doi.org/10.1007/978-0-306-47630-3_9

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-0-7923-7401-5

  • Online ISBN: 978-0-306-47630-3

