Expert Systems for Forecasting

  • Fred Collopy
  • Monica Adya
  • J. Scott Armstrong
Part of the International Series in Operations Research & Management Science book series (ISOR, volume 30)

Abstract

Expert systems use rules to represent experts’ reasoning in solving problems. The rules are based on knowledge about methods and the problem domain. To acquire knowledge for an expert system, one should rely on a variety of sources, such as textbooks, research papers, interviews, surveys, and protocol analyses. Protocol analyses are especially useful if the area to be modeled is complex or if experts lack an awareness of their processes. Expert systems should be easy to use, incorporate the best available knowledge, and reveal the reasoning behind the recommendations they make. In forecasting, the most promising applications of expert systems are to replace unaided judgment in cases requiring many forecasts, to model complex problems where data on the dependent variable are of poor quality, and to handle semi-structured problems. We found 15 comparisons of forecast validity involving expert systems. As expected, expert systems were more accurate than unaided judgment: they won six comparisons, lost one, and tied one. Expert systems were less accurate than judgmental bootstrapping in two comparisons, with two ties. There was little evidence with which to compare expert systems and econometric models; expert systems were better in one study and tied in two.
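To make the idea of a rule-based (production) system concrete, the sketch below shows a toy forecasting expert system in Python. The rules, thresholds, and function names are hypothetical illustrations, not rules from the chapter; the point is only the structure: a base forecast is revised by IF–THEN rules fired in order, and the system records which rules fired so that, as recommended above, the reasoning behind its recommendation can be inspected.

```python
# Toy production-system forecaster. All rules and weights are hypothetical.

def linear_trend(series):
    """Average first difference of the series (a crude trend estimate)."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    return sum(diffs) / len(diffs)

def rule_dampen_unstable_trend(series, forecast):
    """IF recent differences change sign, THEN remove half the trend."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    if any(d1 * d2 < 0 for d1, d2 in zip(diffs, diffs[1:])):
        return forecast - 0.5 * linear_trend(series), "dampened unstable trend"
    return forecast, None

def rule_cap_extreme_step(series, forecast):
    """IF the forecast jumps more than 20% from the last value, THEN cap it."""
    last = series[-1]
    if abs(forecast - last) > 0.2 * abs(last):
        capped = last + 0.2 * abs(last) * (1 if forecast > last else -1)
        return capped, "capped extreme step"
    return forecast, None

def expert_forecast(series):
    """Naive-plus-trend forecast, revised by firing each rule in order.
    Returns the forecast and a trace of fired rules, so the system can
    reveal the reasoning behind its recommendation."""
    forecast = series[-1] + linear_trend(series)
    trace = []
    for rule in (rule_dampen_unstable_trend, rule_cap_extreme_step):
        forecast, fired = rule(series, forecast)
        if fired:
            trace.append(fired)
    return forecast, trace
```

For example, `expert_forecast([100, 110, 105, 115])` starts from the naive-plus-trend value 120, then the instability rule fires and dampens it to 117.5, with the trace reporting which rule applied. Real systems of the kind surveyed in the chapter encode far more domain knowledge, but they share this condition–action architecture.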

Keywords

Inductive techniques, judgmental bootstrapping, knowledge acquisition, production systems, protocol analysis, retrospective process tracing


Copyright information

© Springer Science+Business Media New York 2001

Authors and Affiliations

  • Fred Collopy (1)
  • Monica Adya (2)
  • J. Scott Armstrong (3)
  1. The Weatherhead School of Management, Case Western Reserve University, USA
  2. Department of Management, DePaul University, USA
  3. The Wharton School, University of Pennsylvania, USA
