AdaBoost (Adaptive Boosting) is a machine-learning meta-algorithm. It performs several rounds of training, in each of which the best weak classifier is selected. At the end of each round, the still-misclassified training samples are given a higher weight, so that the next round of weak-classifier selection focuses more on them.
Questions tagged [adaboost]
255 questions
32
votes
2 answers
Using GridSearchCV with AdaBoost and DecisionTreeClassifier
I am attempting to tune an AdaBoost Classifier ("ABT") using a DecisionTreeClassifier ("DTC") as the base_estimator. I would like to tune both ABT and DTC parameters simultaneously, but am not sure how to accomplish this - pipeline shouldn't work,…

GPB
- 2,395
- 8
- 26
- 36
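For the GridSearchCV question above, a minimal sketch of how nested parameters can be searched, assuming scikit-learn's double-underscore convention (in versions before 1.2 the parameter is called base_estimator, so the prefix would be base_estimator__ instead):

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)

# The DTC is handed to the AdaBoost classifier ("ABT") as its base estimator.
abt = AdaBoostClassifier(estimator=DecisionTreeClassifier(), random_state=0)

# Parameters of the inner DTC are addressed with the "estimator__" prefix,
# so both levels can be tuned in a single grid search.
param_grid = {
    "n_estimators": [50, 100],
    "learning_rate": [0.1, 1.0],
    "estimator__max_depth": [1, 2, 3],
    "estimator__min_samples_leaf": [1, 5],
}

search = GridSearchCV(abt, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_)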
19
votes
3 answers
AdaBoostClassifier with different base learners
I am trying to use AdaBoostClassifier with a base learner other than DecisionTree. I have tried SVM and KNeighborsClassifier but I get errors. What are the classifiers that can be used with AdaBoostClassifier?

vdesai
- 813
- 1
- 6
- 15
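For the base-learner question above, the key requirement is that the base learner's fit() accepts sample_weight (KNeighborsClassifier does not, which explains one of the errors); SVC works once probability estimates are enabled. A small sketch, assuming scikit-learn:

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, random_state=0)

# SVC supports sample_weight in fit(); probability=True additionally provides
# predict_proba, which older scikit-learn versions need for the default
# SAMME.R algorithm.
clf = AdaBoostClassifier(
    estimator=SVC(kernel="linear", probability=True),
    n_estimators=10,
    random_state=0,
)
clf.fit(X, y)
print(clf.score(X, y))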
17
votes
3 answers
How to boost a Keras based neural network using AdaBoost?
Assuming I fit the following neural network for a binary classification problem:
model = Sequential()
model.add(Dense(21, input_dim=19, init='uniform', activation='relu'))
model.add(Dense(80, init='uniform', activation='relu'))
model.add(Dense(80,…

ishido
- 4,065
- 9
- 32
- 42
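One route that is often suggested for the Keras question above is to wrap the network in a scikit-learn-compatible estimator and hand that to AdaBoostClassifier, which needs a base learner whose fit() accepts sample_weight. A sketch under the assumption that the scikeras wrapper is installed (older Keras releases shipped a similar keras.wrappers.scikit_learn.KerasClassifier):

import numpy as np
from scikeras.wrappers import KerasClassifier  # assumption: scikeras is available
from sklearn.ensemble import AdaBoostClassifier
from tensorflow import keras


def build_model():
    # Roughly the architecture from the question, written with the current
    # Keras API (kernel initializers left at their defaults).
    return keras.Sequential([
        keras.Input(shape=(19,)),
        keras.layers.Dense(21, activation="relu"),
        keras.layers.Dense(80, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])


# The wrapper exposes fit(X, y, sample_weight=...) and get_params(), which is
# what AdaBoostClassifier expects from its base estimator.
net = KerasClassifier(model=build_model, loss="binary_crossentropy",
                      optimizer="adam", epochs=20, verbose=0)

X = np.random.rand(200, 19)
y = np.random.randint(0, 2, size=200)

boosted = AdaBoostClassifier(estimator=net, n_estimators=5)
boosted.fit(X, y)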
15
votes
2 answers
Weak Classifier
I am trying to implement an application that uses the AdaBoost algorithm. I know that AdaBoost uses a set of weak classifiers, but I don't know what these weak classifiers are. Can you explain it to me with an example and tell me if I have to create my…

gadzix90
- 744
- 2
- 13
- 28
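To make the "weak classifier" idea from the question above concrete: the classic weak learner used with AdaBoost is a decision stump, a one-level decision tree that thresholds a single feature. A minimal illustrative implementation (labels assumed to be in {-1, +1}):

import numpy as np


class DecisionStump:
    """A weak classifier: thresholds one feature and predicts +1 or -1."""

    def fit(self, X, y, sample_weight):
        n_samples, n_features = X.shape
        best_err = np.inf
        # Try every feature, threshold and polarity; keep the split with the
        # lowest weighted error.  Doing only slightly better than chance is
        # all AdaBoost asks of a weak classifier.
        for j in range(n_features):
            for t in np.unique(X[:, j]):
                for polarity in (1, -1):
                    pred = np.where(polarity * (X[:, j] - t) >= 0, 1, -1)
                    err = sample_weight[pred != y].sum()
                    if err < best_err:
                        best_err = err
                        self.feature, self.threshold, self.polarity = j, t, polarity
        return self

    def predict(self, X):
        return np.where(self.polarity * (X[:, self.feature] - self.threshold) >= 0, 1, -1)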
14
votes
2 answers
How to use weights when training a weak learner for AdaBoost
The following is the AdaBoost algorithm:
It mentions "using weights wi on the training data" at part 3.1.
I am not very clear about how to use the weights. Should I resample the training data?

tidy
- 4,747
- 9
- 49
- 89
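For the weighting question above, there are two standard options: pass the weights directly to a learner whose fit() supports them, or resample the training set with probability proportional to the weights and train on the resampled data. A sketch of both, assuming numpy and scikit-learn:

import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = rng.integers(0, 2, size=100)
w = rng.random(100)
w /= w.sum()          # AdaBoost keeps the weights normalised to sum to 1

# Option (a): weighted fitting - the weighted training error is minimised directly.
stump = DecisionTreeClassifier(max_depth=1)
stump.fit(X, y, sample_weight=w)

# Option (b): resampling - draw a bootstrap sample in which example i appears
# with probability w[i], then train an unweighted learner on that sample.
idx = rng.choice(len(X), size=len(X), replace=True, p=w)
stump_resampled = DecisionTreeClassifier(max_depth=1).fit(X[idx], y[idx])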
13
votes
3 answers
Explaining the AdaBoost Algorithms to non-technical people
I've been trying to understand the AdaBoost algorithm without much success. I'm struggling to understand the Viola-Jones paper on face detection as an example.
Can you explain AdaBoost in layman's terms and present good examples of when it's…

dole doug
- 34,070
- 20
- 68
- 87
12
votes
4 answers
Using adaboost within R's caret package
I've been using the ada R package for a while, and more recently, caret. According to the documentation, caret's train() function should have an option that uses ada. But, caret is puking at me when I use the same syntax that sits within my ada()…

Bryan
- 5,999
- 9
- 29
- 50
10
votes
1 answer
Custom learner function for Adaboost
I am using Adaboost to fit a classification problem. We can do the following:
ens = fitensemble(X, Y, 'AdaBoostM1', 100, 'Tree')
Now 'Tree' is the learner and we can change this to 'Discriminant' or 'KNN'. Each learner uses a certain Template…

WJA
- 6,676
- 16
- 85
- 152
9
votes
1 answer
Parameter selection in Adaboost
After using OpenCV for boosting I'm trying to implement my own version of the Adaboost algorithm (check here, here and the original paper for some references).
By reading all the material I've come up with some questions regarding the implementation…

Matteo
- 7,924
- 24
- 84
- 129
9
votes
4 answers
How to calculate alpha if error rate is zero (Adaboost)
I have been wondering what the value of alpha (weight of a weak classifier) should be when it has an error rate(perfect classification) since the algorithm for alpha is
(0.5) * Math.log(((1 - errorRate) / errorRate))
Thank you.

James Euangel E. Limpiado
- 121
- 1
- 4
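For the zero-error question above: the expression 0.5 * log((1 - e) / e) diverges as the error e approaches 0, so a common practical fix is to clamp the error into (eps, 1 - eps) before computing alpha. A small sketch of that guard:

import math

def alpha(error_rate, eps=1e-10):
    # Clamp the error away from 0 and 1 so the log stays finite.
    e = min(max(error_rate, eps), 1.0 - eps)
    return 0.5 * math.log((1.0 - e) / e)

print(alpha(0.0))   # large but finite weight for a perfect weak classifier
print(alpha(0.3))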
9
votes
1 answer
Advantages of SVM over decision trees and AdaBoost algorithm
I am working on binary classification of data and I want to know the advantages and disadvantages of using Support Vector Machines over decision trees and Adaptive Boosting algorithms.

Akshay Kekre
- 254
- 2
- 10
8
votes
3 answers
Multilabel AdaBoost for MATLAB
I am currently looking for a multilabel AdaBoost implementation for MATLAB or a technique for efficiently using a two-label implementation for the multilabel case. Any help in that matter would be appreciated.

smichak
- 4,716
- 3
- 35
- 47
8
votes
1 answer
How to normalize an image color?
In their paper describing the Viola-Jones object detection framework (Robust Real-Time Face Detection by Viola and Jones), the authors state:
All example sub-windows used for training were variance normalized to
minimize the effect of different lighting…

Koji Ikehara
- 117
- 2
- 9
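For the normalization question above, variance-normalizing a sub-window simply means shifting it to zero mean and scaling it to unit variance; a small numpy sketch (the epsilon guard for flat patches is an addition, not part of the paper):

import numpy as np

def variance_normalize(window, eps=1e-8):
    # Zero-mean, unit-variance scaling of a grayscale sub-window, in the spirit
    # of the training-time normalization described by Viola and Jones.
    window = window.astype(np.float64)
    return (window - window.mean()) / (window.std() + eps)

patch = np.random.randint(0, 256, size=(24, 24))   # e.g. a 24x24 sub-window
norm = variance_normalize(patch)
print(norm.mean(), norm.std())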
6
votes
5 answers
State of the art of classification algorithms
We know there are something like a thousand classifiers; recently I was told that some people consider AdaBoost the off-the-shelf one.
Are there better algorithms (with that voting idea)?
What is the state of the art in classifiers? Do you…

edgarmtze
- 24,683
- 80
- 235
- 386
6
votes
1 answer
Adaboost with neural networks
I implemented AdaBoost for a project, but I'm not sure if I've understood it correctly. Here's what I implemented; please let me know if it is a correct interpretation.
My weak classifiers are 8 different neural networks. Each of these…

sanjeev mk
- 4,276
- 6
- 44
- 69