I am having a hard time computing the soft-SVM solution for this problem. In particular, I can't see how to handle the penalties for mistakes: for example, $\langle w, (0,0)\rangle = 0$ for any $w$, so the point at the origin will always incur a hinge-loss penalty of 1 (unless the bias absorbs it), and every line that tries to separate the data will classify it very badly. I am trying to solve $\arg\min_{w,\xi}\left\{\|w\|^2 + \tfrac{1}{m}\sum_i \xi_i\right\}$, but I can't think of a way to compute it. In the actual question the data points are $x_1=(0,0)$, $x_2=(0,z)$, $x_3=(t,0)$, $x_4=(t,z)$ with labels $y_1=y_4=-1$ and $y_2=y_3=+1$, and I want to find the soft-SVM solution (i.e. $w$ and $b$) for this problem.
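To sanity-check whatever solution is derived by hand, one can solve the same soft-margin problem numerically for concrete values. Below is a minimal sketch, assuming the placeholder values t = z = 1 and using scikit-learn's `SVC`, whose `C` parameter plays the role of the trade-off in front of the slack sum:

```python
import numpy as np
from sklearn.svm import SVC

# Data points from the question, with placeholder values t = z = 1 (an assumption)
t, z = 1.0, 1.0
X = np.array([[0.0, 0.0], [0.0, z], [t, 0.0], [t, z]])
y = np.array([-1, 1, 1, -1])

# Soft-margin linear SVM; C weights the slack term (analogue of the 1/m factor)
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

w = clf.coef_[0]        # learned weight vector
b = clf.intercept_[0]   # learned bias
print("w =", w, "b =", b)

# Slack / hinge losses xi_i = max(0, 1 - y_i * (<w, x_i> + b))
xi = np.maximum(0.0, 1.0 - y * (X @ w + b))
print("xi =", xi)
print("objective ||w||^2 + C * sum(xi) =", w @ w + clf.C * xi.sum())
```

Because the labelling is XOR-like, for symmetric choices of t and z you should expect a degenerate result where all four points end up as support vectors, w collapses toward zero, and the slack variables absorb the loss, since no linear separator does better than simply paying the hinge penalties.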
