
Ensembles of Ensembles: Combining the Predictions from Multiple Machine Learning Methods

  • David J. Lieske
  • Moritz S. Schmid
  • Matthew Mahoney
Chapter

Abstract

The rapid growth of machine learning (ML) has produced an almost overwhelmingly large number of modelling techniques, demanding better elucidation of their strengths and weaknesses in applied contexts. Tree-based methods such as Random Forests (RF) and Boosted Regression Trees (BRT) are powerful ML approaches that make no assumptions about the functional form of relationships between the response and predictors, handle missing data flexibly, and readily capture complex, non-linear interactions. As with many ML methods, however, RF and BRT are vulnerable to overfitting and a consequent loss of generalizability.
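
To make the tree-ensemble idea concrete, the sketch below (a toy illustration, not the chapter's actual analysis) fits many one-split regression "stumps" to bootstrap resamples and averages their predictions, the bagging mechanism underlying RF that damps the variance of any single tree. The data, tree count, and seed are hypothetical; real RF and BRT implementations do far more.

```python
import random

def fit_stump(xs, ys):
    """Fit a one-split regression stump minimizing squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((y - ml) ** 2 for y in left)
               + sum((y - mr) ** 2 for y in right))
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    if best is None:                       # degenerate sample: constant fit
        m = sum(ys) / len(ys)
        return lambda x: m
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

def bagged_predict(xs, ys, x_new, n_trees=25, seed=0):
    """Average the predictions of n_trees stumps, each fit to a bootstrap resample."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(xs)) for _ in xs]   # bootstrap sample
        stump = fit_stump([xs[i] for i in idx], [ys[i] for i in idx])
        preds.append(stump(x_new))
    return sum(preds) / n_trees                      # average over trees
```

Averaging over resampled trees smooths the hard step function of any single stump, which is one intuition for why ensembles generalize better than their members.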

The combination of predictions from multiple modelling methods, often referred to as the generation of “ensemble” or “consensus” predictions, is well established in fields such as meteorology. Recent ecological research suggests that combining the predictions from multiple methods is straightforward to implement, and can result in higher predictive accuracy than the results from single algorithms alone.

Using an example dataset involving satellite-derived fishing-vessel traffic information, we iteratively constructed 500 RF, BRT, and ensemble (ENS) models and assessed the resulting mean squared prediction error (MSE) using cross-validated testing data. Performance depended on the region of parameter space examined, with RF, BRT, and ENS approaches each performing best under certain conditions. In general, variation in the number of trees used to train RF models seemed unimportant, but for this particular dataset, cross-validated error was lowest when tree depth was set to low values. BRT models were most sensitive to the size of the predictor subset used to train the model when faster "burn in" periods were selected (i.e., a higher shrinkage value). At slower "burn in" periods (i.e., shrinkage of 0.001) BRT tended to outperform both RF and ENS models except at the lowest values of predictor subset size. ENS models tended to exhibit lower prediction error with less variance under most conditions, and for faster "burn in" settings, yielded the lowest prediction errors of all.
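
The evaluation loop described above can be sketched in a few lines: repeatedly split the data into training and testing portions, fit a model on the training portion, and record the MSE on the held-out portion. A trivial mean-value predictor stands in for the RF/BRT fits here; the data, split fraction, and iteration count are hypothetical placeholders.

```python
import random

def mse(observed, predicted):
    """Mean squared prediction error over paired observations."""
    return sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed)

def fit_mean_predictor(train_y):
    """Placeholder 'model': predicts the training-set mean everywhere."""
    mean = sum(train_y) / len(train_y)
    return lambda n: [mean] * n

def repeated_cv_mse(y, n_iter=500, test_frac=0.3, seed=1):
    """Average held-out MSE over n_iter random train/test splits."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_iter):
        idx = list(range(len(y)))
        rng.shuffle(idx)
        cut = max(1, int(len(y) * test_frac))
        test = [y[i] for i in idx[:cut]]
        train = [y[i] for i in idx[cut:]]
        model = fit_mean_predictor(train)
        scores.append(mse(test, model(len(test))))
    return sum(scores) / len(scores)
```

Repeating the split many times (500 in the study above) gives a distribution of test errors rather than a single score, so both the level and the variance of prediction error can be compared across methods.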

We discuss how an approach like “committee averaging” can be used to determine a combined prediction from multiple methods, potentially improving predictive accuracy while also allowing a wider range of potentially useful algorithms to be employed.
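
Committee averaging itself reduces to a (possibly weighted) average across methods at each prediction location. A minimal sketch, with hypothetical method names, predictions, and weights:

```python
def committee_average(predictions, weights=None):
    """Combine per-location predictions from several methods.

    predictions: dict mapping method name -> list of predictions
    weights:     optional dict mapping method name -> weight
                 (equal weights when omitted)
    """
    methods = list(predictions)
    if weights is None:
        weights = {m: 1.0 for m in methods}
    total = sum(weights[m] for m in methods)
    n = len(next(iter(predictions.values())))
    return [sum(weights[m] * predictions[m][i] for m in methods) / total
            for i in range(n)]

# Hypothetical predictions from two methods at three locations
preds = {"RF": [0.2, 0.8, 0.5], "BRT": [0.4, 0.6, 0.7]}
ens = committee_average(preds)                         # equal weights, ≈ [0.3, 0.7, 0.6]
ens_w = committee_average(preds, {"RF": 2.0, "BRT": 1.0})  # weight RF more heavily
```

Weights could, for example, reflect each method's cross-validated performance, so that better-performing algorithms contribute more to the consensus prediction.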

Keywords

Ensemble models · Fishing traffic · Model comparison · Boosted regression trees · Random forests · Committee averaging


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • David J. Lieske (1)
  • Moritz S. Schmid (2)
  • Matthew Mahoney (1)
  1. Department of Geography and Environment, Mount Allison University, Sackville, Canada
  2. Hatfield Marine Science Center, Oregon State University, Newport, USA
