Boosting Algorithms in Machine Learning, Part II: Gradient Boosting


Uncovering a simple yet powerful, award-winning machine learning algorithm


In this article, we will learn about gradient boosting, a machine learning algorithm that forms the foundation of popular frameworks like XGBoost and LightGBM, which have powered winning solutions in many machine learning competitions.

This is a great article for anyone thinking about using ensembles in machine learning, and a handy refresher for seasoned practitioners who want a break from .fit() and .predict() calls and a look under the hood!

We’ll cover the basics of ensemble learning and walk through how the gradient boosting algorithm makes predictions with a step-by-step example. We’ll also explore whether there is any real connection between gradient descent and gradient boosting. Let’s get started!

Ensemble learning is the process of training and combining multiple models, often weak learners, to create a single strong learner with higher predictive power. Two common ways to do this are bagging and boosting.
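To see the two approaches side by side before we dig into the details, here is a minimal sketch, assuming scikit-learn is available; the synthetic dataset and hyperparameters below are illustrative choices, not taken from the article.

```python
# A minimal, illustrative sketch: comparing a bagging ensemble and a gradient
# boosting ensemble from scikit-learn on a synthetic regression problem.
from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor, GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Toy data: 1,000 samples, 10 features, with some noise.
X, y = make_regression(n_samples=1000, n_features=10, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Bagging: full-depth trees (the default base learner) trained independently on
# bootstrap samples; their predictions are averaged.
bagging = BaggingRegressor(n_estimators=100, random_state=42)

# Boosting: shallow trees (weak learners) trained sequentially, each one trying
# to correct the errors made by the ensemble built so far.
boosting = GradientBoostingRegressor(
    n_estimators=100, max_depth=3, learning_rate=0.1, random_state=42
)

for name, model in [("bagging", bagging), ("gradient boosting", boosting)]:
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name:>17}: test MSE = {mse:.1f}")
```

Both ensembles combine 100 decision trees, but bagging trains them independently while boosting trains them one after another, each new tree correcting the previous ones; that sequential, error-correcting structure is what the rest of this article unpacks.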

1. Bagging
