Evaluation of Different Heuristics for Accommodating Asymmetric Loss Functions in Regression

  • Conference paper
Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10558)

Abstract

Most machine learning methods used for regression explicitly or implicitly assume a symmetric loss function. However, an increasing number of problem domains require loss functions that are asymmetric in the sense that the costs for over- or under-predicting the target value may differ. This paper discusses the theoretical foundations of handling asymmetric loss functions, and describes and evaluates simple methods that might be used to offset the effects of asymmetric losses. While these methods are applicable to any problem where an asymmetric loss is used, our work derives its motivation from the area of predictive maintenance, which is often characterized by a small number of training samples (in the case of failure prediction) or monetary cost-based, mostly non-convex, loss functions.
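To make the notion of asymmetry concrete, the following is a minimal illustrative sketch, not the method evaluated in the paper: a piecewise-linear loss in which the per-unit costs of over- and under-prediction differ. The parameter names `c_over` and `c_under` and the cost values are hypothetical, chosen only for illustration.

```python
def asymmetric_linear_loss(y_true, y_pred, c_over=1.0, c_under=3.0):
    """Piecewise-linear asymmetric loss.

    Charges c_over per unit of over-prediction and c_under per unit
    of under-prediction; the symmetric (absolute-error) case is the
    special case c_over == c_under.
    """
    error = y_pred - y_true
    return c_over * error if error >= 0 else c_under * (-error)

# Under-predicting by 2 units costs three times as much as
# over-predicting by the same amount:
print(asymmetric_linear_loss(10.0, 12.0))  # over-prediction:  2.0
print(asymmetric_linear_loss(10.0, 8.0))   # under-prediction: 6.0
```

A regressor trained under a symmetric loss minimizes neither of these costs optimally; the heuristics the paper evaluates aim to offset exactly this mismatch.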


Notes

  1.

    The datasets used are auto93.arff, autoMpg.arff, autoPrice.arff, cloud.arff, cpu.arff, echoMonths.arff, elevators.arff, housing.arff, meta.arff, pyrim.arff, strike.arff, triazines.arff, and veteran.arff.

Acknowledgements

This work was supported by the German Federal Ministry of Education and Research (BMBF) under the “An Optic’s Life” project (no. 16KIS0025). Thanks to the reviewers of this and a previous version of the paper, in particular for the pointers to prior work.

Author information

Corresponding author

Correspondence to Andrei Tolstikov.

Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Tolstikov, A., Janssen, F., Fürnkranz, J. (2017). Evaluation of Different Heuristics for Accommodating Asymmetric Loss Functions in Regression. In: Yamamoto, A., Kida, T., Uno, T., Kuboyama, T. (eds) Discovery Science. DS 2017. Lecture Notes in Computer Science, vol. 10558. Springer, Cham. https://doi.org/10.1007/978-3-319-67786-6_5

  • DOI: https://doi.org/10.1007/978-3-319-67786-6_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-67785-9

  • Online ISBN: 978-3-319-67786-6

  • eBook Packages: Computer Science (R0)
