I'm using AdaBoost and have a question about weak learners. In the AdaBoost algorithm below, in step (2), can I use a different algorithm in each round? For example, when i = 1 I use k-NN, when i = 2 an SVM, and when i = 3 a decision tree? Or should I use a single algorithm in all k iterations of the for loop?
(1) initialize the weight of each tuple in D to 1/d;
(2) for i = 1 to k do // for each round:
(3) sample D with replacement according to the tuple weights to obtain Di;
(4) use training set Di to derive a model, Mi;
(5) compute error(Mi), the error rate of Mi (Eq. 8.34);
(6) if error(Mi) > 0.5 then
(7) go back to step 3 and try again;
(8) endif
(9) for each tuple in Di that was correctly classified do
(10) multiply the weight of the tuple by error(Mi)/(1 − error(Mi)); // update weights
(11) normalize the weight of each tuple;
(12) endfor
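For concreteness, here is a rough sketch of the loop I have in mind, with a different base learner per round. The learner choices, the toy dataset, and the scikit-learn usage are my own assumptions for illustration, not something the book prescribes:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, random_state=0)  # toy binary data
d = len(y)
w = np.full(d, 1.0 / d)                          # step (1): weights 1/d

# One round per base learner: k-NN, then SVM, then decision tree.
learners = [KNeighborsClassifier(), SVC(), DecisionTreeClassifier(max_depth=3)]
models, alphas = [], []

for base in learners:                            # step (2): k rounds
    idx = rng.choice(d, size=d, replace=True, p=w)  # step (3): weighted sample Di
    m = base.fit(X[idx], y[idx])                    # step (4): derive model Mi
    pred = m.predict(X)
    err = float(w[pred != y].sum())                 # step (5): weighted error rate
    if err > 0.5:                                   # steps (6)-(8): reject this round
        continue
    err = max(err, 1e-10)                           # guard against a perfect round
    beta = err / (1.0 - err)
    w[pred == y] *= beta                            # step (10): shrink correct tuples
    w /= w.sum()                                    # step (11): normalize
    models.append(m)
    alphas.append(np.log(1.0 / beta))               # this model's voting weight

# Combine the surviving models by an alpha-weighted vote (labels 0/1).
score = sum((a * (2 * m.predict(X) - 1) for a, m in zip(alphas, models)),
            np.zeros(d))
y_hat = (score > 0).astype(int)
```

The weighted vote at the end is the standard AdaBoost combination; I'm assuming it still applies unchanged when the Mi come from different algorithms.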