
I want to plot the result of `libsvmtrain_ova` from this link: 10 fold cross-validation in one-against-all SVM (using LibSVM).

I used this code inside `libsvmtrain_ova`, but I don't think it is working properly.

hold off
figure();
for j = 1:numLabels
   %# recover the primal weights and bias of the j-th binary model
   %# (only meaningful for a linear kernel)
   w = models{j}.SVs' * models{j}.sv_coef;
   b = -models{j}.rho;
   c1 = find(labels == 1);
   c2 = find(labels == 2);
   c3 = find(labels == 3);
   plot(X(c1,1), X(c1,2), 'ko', 'MarkerFaceColor', 'b'); hold on;
   plot(X(c2,1), X(c2,2), 'ko', 'MarkerFaceColor', 'g');
   plot(X(c3,1), X(c3,2), 'ko', 'MarkerFaceColor', 'r');
   %# Plot the decision boundary of the j-th model
   plot_x = linspace(min(X(:,1)), max(X(:,1)), 30);
   plot_y = (-1/w(2)) * (w(1)*plot_x + b);
   plot(plot_x, plot_y, 'k-', 'LineWidth', 2)
end
title(sprintf('SVM Linear Classifier with C = %g', 1), 'FontSize', 12)
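For reference, the `plot_y` expression above comes from setting the linear decision function to zero and solving for the second coordinate:

```latex
% Linear SVM decision boundary in two dimensions
w_1 x_1 + w_2 x_2 + b = 0
\quad\Longrightarrow\quad
x_2 = -\frac{1}{w_2}\left(w_1 x_1 + b\right)
```

This also shows why the formula breaks down when `w(2)` is close to zero, and why it is only valid when the model was trained with a linear kernel.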
  • Can you tell us the error you get? – Zahra E Dec 29 '12 at 18:10
  • The plotted support vectors are very few, and I think the result is not drawn correctly. Did you test this code? – Maryam Bagheri Dec 29 '12 at 18:17
  • I couldn't get any result. I suggest that you edit your question and provide more information about your problem. – Zahra E Dec 29 '12 at 18:29
  • I want to plot the result of this link, 10 fold cross-validation in one-against-all SVM (using LibSVM), for training. – Maryam Bagheri Dec 29 '12 at 18:32
  • I myself couldn't find any solution for plotting one vs all using LibSVM. [This](http://cmp.felk.cvut.cz/cmp/software/stprtool/) toolbox provides good plotting features for one vs all SVM, but it has nothing to do with LibSVM. If you don't want to use LibSVM, I highly recommend stprtool. – Zahra E Dec 29 '12 at 18:48
  • No, I want to use LibSVM. I could plot the result with this code inside libsvmtrain_ova, but the result is not in the format I want. Please copy this code into libsvmtrain_ova before this line: mdl = struct('models',{models}, 'labels',labels); and check the plotted result. – Maryam Bagheri Dec 29 '12 at 18:53

1 Answer


Your code isn't close to working; there are several conceptual issues you seem to be missing. I'm going to assume that you understand that:

  • The iris data (see the linked question) is 4-dimensional. A linear classifier in this space is a hyperplane in 4 dimensions, and you can't plot a 4-dimensional function in a 2-d plane.
  • Plotting a one-vs-all classifier for 3 classes yields three hyperplanes, one per class.
  • It makes no sense to plot the result of 10-fold cross-validation per se, since there is no single plottable result; you could plot every intermediate result, but you're far from being able to accomplish that.
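To spell out the second point: one-vs-all trains one binary SVM per class, and each of those models contributes one separating hyperplane. A sketch of the standard formulation (not specific to the linked code):

```latex
% One-vs-all with K classes: K binary decision functions
f_j(x) = w_j^{\top} x + b_j, \qquad j = 1, \dots, K
% Predicted class: the most confident binary model
\hat{y}(x) = \arg\max_{j} f_j(x)
```

Each set $\{x : f_j(x) = 0\}$ is one of the three hyperplanes.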

Still, I think there is a real question here. I'm going to take two dimensions of the iris data and plot the separating hyperplanes (lines, in this case). Once you have the linked code, all you need to do is the following:

  • Select two dimensions; in my case I selected dimensions 3 and 4 of the iris data.
  • Split the data in two: one part to train on and one part to test on.
  • Do a little bit of math and plot the points and lines.

This is the code:

S = load('fisheriris');
data = zscore(S.meas);          %# standardize the features
data = data(:,3:4);             %# keep only dimensions 3 and 4
labels = grp2idx(S.species);

opts = '-s 0 -t 0 -c 1';        %# libsvm training options: linear kernel,
                                %# required for the w,b line formula below

indices = crossvalind('Kfold', labels, 2);   %# 2-fold split (Bioinformatics Toolbox)
testIdx = (indices == 1); trainIdx = ~testIdx;
mdl = libsvmtrain_ova(labels(trainIdx), data(trainIdx,:), opts);

figure(1);
numlabels = numel(unique(labels));
testlabels = labels(testIdx);
testdata = data(testIdx,:);
style = {'b+','r+','g+'};
stylel = {'b-','r-','g-'};
for i = 1:numlabels
    plot(testdata(testlabels==i,1), testdata(testlabels==i,2), style{i});
    hold on;
    %# recover the primal weights and bias of the i-th binary model
    w = mdl.models{i}.SVs' * mdl.models{i}.sv_coef;
    b = -mdl.models{i}.rho;
    x = -2:0.1:2;
    y = -(w(1)/w(2))*x - (b/w(2));  %# the line satisfying w'*[x;y] + b = 0
    plot(x, y, stylel{i});
end

grid on;
hold off;

and this is the plot:

(Plot: the three colored one-vs-rest decision lines overlaid on the test points.)

Each colored line should split points of its color from all other colors. Observe that the lines were obtained by training, while the points come from the test data, on which we did not train.
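If you prefer filled decision regions to single lines, a common alternative is to predict the class over a dense grid and draw a filled contour of the result. This is only a sketch: it assumes the `libsvmpredict_ova` helper from the linked question is on the path, and that it accepts (labels, data, model) in that order; the variables `mdl`, `testdata`, `testlabels`, and `numlabels` are from the code above.

%# Sketch: shade one-vs-all decision regions over a grid
[x1, x2] = meshgrid(linspace(-2, 2, 200), linspace(-2, 2, 200));
gridPts = [x1(:) x2(:)];
dummy = zeros(size(gridPts, 1), 1);        %# labels are unused when plotting
pred = libsvmpredict_ova(dummy, gridPts, mdl);
contourf(x1, x2, reshape(pred, size(x1)), numlabels - 1);
hold on;
plot(testdata(:,1), testdata(:,2), 'k+');  %# overlay the test points
hold off;

Unlike the straight-line plot, this approach also works when the binary models use a non-linear kernel, since it never needs an explicit w and b.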

carlosdc