Gradient Boosting
Gradient boosting (derived from the term gradient boosting machines) is a popular supervised machine learning technique for regression and classification problems that aggregates an ensemble of weak individual models to obtain a more accurate final model.
Gradient boosting is a unique ensemble method since it involves identifying the shortcomings of weak models and incrementally or sequentially building a final ensemble model using a loss function that is optimized with gradient descent. Decision trees are typically the weak learners in gradient boosting and, consequently, the technique is sometimes referred to as gradient tree boosting. A minimal sketch of this sequential process is shown below.
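The following is a minimal from-scratch sketch, not a production implementation, assuming squared-error loss for a regression problem; under that loss the negative gradient is simply the residual, so each shallow decision tree is fit to the residuals of the ensemble built so far. Function names and parameter values are illustrative.

```python
# Minimal gradient tree boosting for regression under squared-error loss.
# Each round fits a shallow tree to the residuals (the negative gradient)
# and adds a small, shrunken step to the running prediction.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gradient_boosting(X, y, n_trees=100, learning_rate=0.1, max_depth=2):
    # Start from a constant prediction (the mean minimizes squared error).
    base_prediction = float(np.mean(y))
    prediction = np.full(len(y), base_prediction)
    trees = []
    for _ in range(n_trees):
        # Negative gradient of squared-error loss = residuals.
        residuals = y - prediction
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)
        # Take a small step in the direction that reduces the loss.
        prediction += learning_rate * tree.predict(X)
        trees.append(tree)
    return base_prediction, trees

def predict_gradient_boosting(X, base_prediction, trees, learning_rate=0.1):
    prediction = np.full(X.shape[0], base_prediction)
    for tree in trees:
        prediction += learning_rate * tree.predict(X)
    return prediction

# Example usage on synthetic data:
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)
base, trees = fit_gradient_boosting(X, y)
y_hat = predict_gradient_boosting(X, base, trees)
print("training MSE:", np.mean((y - y_hat) ** 2))
```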
XGBoost is a very popular gradient boosting framework that is fast, uses some clever tricks to obtain more accurate results, and is easy to parallelize.
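Below is a quick sketch using XGBoost's scikit-learn-style interface; the hyperparameter values are illustrative defaults for experimentation, not tuned recommendations.

```python
# Train an XGBoost regressor on synthetic data and report test R^2.
import xgboost as xgb
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=1000, n_features=10, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = xgb.XGBRegressor(
    n_estimators=200,    # number of boosted trees
    learning_rate=0.1,   # shrinkage applied to each tree's contribution
    max_depth=4,         # depth of each weak learner
    n_jobs=-1,           # use all available cores for training
)
model.fit(X_train, y_train)
print("test R^2:", model.score(X_test, y_test))
```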