Gradient Boosting Machine

Chapter in Pro Machine Learning Algorithms

Abstract

So far, we've considered decision trees and random forest algorithms. We saw that random forest is a bagging (bootstrap aggregating) algorithm: it combines the outputs of multiple decision trees into a single prediction. In a bagging algorithm, trees are typically grown in parallel, each tree is built on a bootstrap sample of the original data, and the final prediction is the average across all trees.
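The bagging procedure described above can be sketched in a few lines. This is a minimal illustration, not the book's implementation: it uses one-split regression stumps as a stand-in for full decision trees, and the function names (`fit_stump`, `bagging_fit`, `bagging_predict`) are hypothetical. Each base learner is fit on a bootstrap sample (drawn with replacement), and the ensemble prediction is the average over all learners.

```python
import numpy as np

def fit_stump(X, y):
    # One-feature, one-threshold regression stump: pick the split
    # that minimizes the total within-leaf sum of squared errors.
    best = (np.inf, 0, 0.0, y.mean(), y.mean())
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            if len(left) == 0 or len(right) == 0:
                continue
            sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
            if sse < best[0]:
                best = (sse, j, t, left.mean(), right.mean())
    return best[1:]  # (feature, threshold, left_value, right_value)

def predict_stump(stump, X):
    j, t, left_value, right_value = stump
    return np.where(X[:, j] <= t, left_value, right_value)

def bagging_fit(X, y, n_trees=25, seed=0):
    # Grow each tree on a bootstrap sample: n rows drawn with replacement.
    rng = np.random.default_rng(seed)
    stumps = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(X), size=len(X))
        stumps.append(fit_stump(X[idx], y[idx]))
    return stumps

def bagging_predict(stumps, X):
    # Bagging's aggregation step: average the predictions of all trees.
    return np.mean([predict_stump(s, X) for s in stumps], axis=0)
```

With a real library you would swap the stump for a deep decision tree (e.g. scikit-learn's `DecisionTreeRegressor`); the bootstrap-then-average structure is what makes it bagging, and it is exactly the part that gradient boosting replaces with sequential, error-correcting fitting.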



Copyright information

© 2018 V Kishore Ayyadevara


Cite this chapter

Ayyadevara, V.K. (2018). Gradient Boosting Machine. In: Pro Machine Learning Algorithms. Apress, Berkeley, CA. https://doi.org/10.1007/978-1-4842-3564-5_6
