As far as I know, we optimize our model by changing the weight parameters over the iterations. The aim is to minimize the loss and maximize the accuracy.

I don't understand why we use loss as a parameter when we already have accuracy as a parameter.

Can we use only accuracy and drop loss from our model? Can we also change the model weights using accuracy?

    We optimize loss; accuracy is just a side effect. SGD (the optimization mechanism behind neural networks) needs a differentiable function. Loss functions are differentiable; accuracy is not. – Marat Dec 25 '21 at 02:19
    This is not a programming question; these kinds of questions are not appropriate on Stack Overflow. This is also covered in a basic machine learning course. – Dr. Snoopy Dec 25 '21 at 16:09

1 Answer


In short, training a neural network is all about minimizing the difference between the intended result and the actual result. This difference is known as the cost/loss. So the smaller the cost/loss, the closer the output is to the intended value, and the higher the accuracy.
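To make the point from the comments concrete: gradient descent needs a training signal whose derivative with respect to the weights is informative. Below is a minimal sketch (assuming PyTorch; the tensor values are made up for illustration) showing that a cross-entropy loss produces usable gradients, while accuracy, being a step function of the predictions, gives the optimizer no gradient to follow:

    import torch

    # Logits for 3 samples and their binary labels (illustrative values).
    logits = torch.tensor([0.8, -0.3, 1.2], requires_grad=True)
    labels = torch.tensor([1.0, 0.0, 1.0])

    # Binary cross-entropy is smooth, so backprop yields non-zero gradients.
    loss = torch.nn.functional.binary_cross_entropy_with_logits(logits, labels)
    loss.backward()
    print(logits.grad)  # non-zero -> the optimizer knows which way to move

    # Accuracy only changes when a prediction crosses the 0.5 threshold,
    # so its gradient is zero (or undefined) almost everywhere; gradient
    # descent gets no signal from it.
    preds = (torch.sigmoid(logits) > 0.5).float()
    accuracy = (preds == labels).float().mean()
    print(accuracy)  # tensor(1.) here, but not differentiable w.r.t. logits

This is why frameworks minimize a differentiable loss during training and only report accuracy as a monitoring metric.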

I suggest you watch 3Blue1Brown's video series on neural networks on YouTube.
