
203.7.7 Boosting

Concepts behind Boosting.

In the previous section, we studied Random Forests.

In this post we will cover how boosting works and the types of boosting algorithms.

Boosting

  • Boosting is another famous ensemble method.
  • Boosting uses a slightly different technique from bagging.
  • Boosting is a well-proven technique that works really well on many machine learning problems, such as speech recognition.
  • If bagging is the wisdom of crowds, then boosting is the wisdom of crowds where each individual is given a weight based on their expertise.
  • Boosting in general decreases the bias error and builds strong predictive models.
  • Boosting is an iterative technique: we adjust the weight of each observation based on the previous classification (see the sketch after this list).
  • If an observation was classified incorrectly, boosting increases its weight in the next iteration, and vice versa.
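Before getting into the theory, here is a minimal sketch of boosting in action, using scikit-learn's AdaBoostClassifier on a synthetic dataset. The dataset and parameter choices are illustrative assumptions, not from this post:

```python
# A minimal sketch of boosting in practice (illustrative data and parameters).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Each iteration re-weights the observations: misclassified records get more weight.
model = AdaBoostClassifier(n_estimators=50, random_state=1)
model.fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))
```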

Boosting: Main Idea

Final classifier: \(C = \sum_i \alpha_i c_i\), a weighted combination of the individual weak classifiers \(c_i\), where each weight \(\alpha_i\) reflects that classifier's accuracy.
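As a toy illustration of this weighted vote (with made-up \(\alpha\) values), a single accurate classifier can outvote two weaker ones:

```python
# Toy check of the weighted vote C = sign(sum_i alpha_i * c_i(x)).
import numpy as np

alphas = np.array([0.9, 0.4, 0.3])   # hypothetical accuracy weights
votes = np.array([+1, -1, -1])       # each weak classifier's -1/+1 prediction for one record
print(int(np.sign(alphas @ votes)))  # 0.9 - 0.4 - 0.3 = 0.2 > 0, so the final class is +1
```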

(Figure: how weighted samples are taken in each iteration.)

Boosting Illustration

Below is the training data with class labels. Take note of the record numbers; they will help us with the weighted sampling later.

(Figure: training data table for the boosting illustration.)

Theory behind the Boosting Algorithm

  • Take the dataset and initialize equal weights for all records.
  • Build a classifier \(C_m\) on the weighted data and find its misclassifications.
  • Calculate the error rate of the classifier:
    • \(\epsilon_m = \frac{\sum_i w_i \, I(y_i \neq C_m(x_i))}{\sum_i w_i}\) = sum of misclassified weights / sum of all sample weights
  • Calculate an intermediate factor \(\alpha\). It is analogous to the accuracy rate of the model and will be used later for updating the weights. It is derived from the error rate:
    • \(\alpha_m = \log\left(\frac{1 - \epsilon_m}{\epsilon_m}\right)\)
  • Update the weight of each record in the sample using the \(\alpha\) factor. The indicator function makes sure that only the misclassified records are given more weight:
    • For \(i = 1, 2, \ldots, N\)
      • \(w_i \leftarrow w_i \, e^{\alpha_m I(y_i \neq C_m(x_i))}\)
      • Renormalize so that the weights sum to 1
  • Repeat this model-building and weight-update process until there are no misclassifications, or until a preset number of models has been built (the full loop is sketched in code after this list)
  • The final collation is done by voting from all the models. While taking the votes, each model is weighted by its accuracy factor \(\alpha\):
    • \(C(x) = \operatorname{sign}\left(\sum_m \alpha_m C_m(x)\right)\)
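The loop above can be written out directly. Below is a minimal from-scratch sketch, assuming the class labels are coded as -1/+1 and using decision stumps (DecisionTreeClassifier with max_depth=1) as the weak learners; the number of rounds M is an illustrative choice:

```python
# From-scratch sketch of the AdaBoost loop described above (assumes y is -1/+1).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, M=20):
    n = len(y)
    w = np.ones(n) / n                       # start with equal weights summing to 1
    models, alphas = [], []
    for m in range(M):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)     # build classifier C_m on weighted data
        pred = stump.predict(X)
        miss = (pred != y).astype(float)     # indicator I(y_i != C_m(x_i))
        eps = np.sum(w * miss) / np.sum(w)   # misclassified weight / total weight
        if eps == 0:                         # perfect classifier: keep it and stop
            models.append(stump)
            alphas.append(1.0)               # any positive weight works here
            break
        alpha = np.log((1 - eps) / eps)      # accuracy factor alpha_m
        w = w * np.exp(alpha * miss)         # up-weight the misclassified records
        w = w / np.sum(w)                    # renormalize so weights sum to 1
        models.append(stump)
        alphas.append(alpha)
    return models, alphas

def adaboost_predict(models, alphas, X):
    # Weighted vote: C(x) = sign(sum_m alpha_m * C_m(x))
    score = sum(a * m.predict(X) for m, a in zip(models, alphas))
    return np.sign(score)
```

In practice you would also stop when \(\epsilon_m \geq 0.5\) (the weak learner is no better than chance, so \(\alpha_m\) would go negative); scikit-learn's AdaBoostClassifier handles such details for you.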

Gradient Boosting

  • AdaBoost (Adaptive Boosting)
    • Till now we have discussed the AdaBoost technique, in which we give higher weight to misclassified records.
  • Gradient Boosting
    • Similar to the AdaBoost algorithm.
    • The overall approach is the same, but there are slight modifications during the re-weighted sampling.
    • We update the weights based on the misclassification rate and the gradient of the loss function (see the sketch after this list).
    • Gradient boosting works better for some classes of problems, such as regression.
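For intuition, here is a minimal sketch of gradient boosting for regression under squared-error loss, where the negative gradient is simply the residual \(y - F(x)\); the tree depth, learning rate, and number of rounds are illustrative assumptions:

```python
# Minimal sketch of gradient boosting for regression with squared-error loss.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, M=100, lr=0.1):
    f0 = np.mean(y)                  # initial prediction: the mean of the target
    pred = np.full(len(y), f0)
    trees = []
    for m in range(M):
        residual = y - pred          # negative gradient of squared error
        tree = DecisionTreeRegressor(max_depth=2)
        tree.fit(X, residual)        # each tree fits what is still unexplained
        pred += lr * tree.predict(X)
        trees.append(tree)
    return f0, trees

def gradient_boost_predict(f0, trees, X, lr=0.1):
    return f0 + lr * sum(t.predict(X) for t in trees)
```

Each new tree corrects the errors left by the ensemble so far; scikit-learn's GradientBoostingRegressor implements a production version of this idea.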

The next post is a Practice Session on Boosting.
