
When training a model, you usually specify both a loss function and an accuracy metric.

Is there any negative effect of using a second loss function as an accuracy metric (e.g. mean_absolute_error)?

Stormsson
  • Loss and metrics are somewhat similar concepts, but not interchangeable. The loss is a number that becomes smaller the closer your model is to reality. Metrics are numbers that tell you specific information about how good your model is. In a regression problem, MAE/MSE/RMSE are all possibly valid as both loss and metric, but in a classification problem, while you can use e.g. MSE as the loss (even if it is uncommon), you shouldn't use it as a metric, since it doesn't really tell you what to expect from the model. – jdehesa May 29 '18 at 09:25
  • In short, "loss" is the only thing that affects your model. Metrics are extra information just to keep you informed. – Daniel Möller May 29 '18 at 11:31
  • Possibly useful: https://stackoverflow.com/questions/47817424/loss-accuracy-are-these-reasonable-learning-curves/47819022#47819022 – desertnaut May 29 '18 at 12:16

1 Answer

If you use a second loss function as an accuracy metric, the network will be trained exactly as before, since you keep your original loss function. The only thing that changes is the accuracy metric, which gives you an idea of how well the network performs. Your accuracy metric might therefore show no improvement whatsoever, simply because the network knows nothing about the error you are looking at.
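As a minimal sketch in Keras (the model architecture and dummy data are made up purely for illustration): MSE is the loss the optimizer actually minimizes, while MAE is a second loss function used only as a metric, so it is computed and logged but never influences the gradients.

```python
import numpy as np
from tensorflow import keras

# Tiny regression model, just to make the sketch runnable.
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1),
])

model.compile(
    optimizer="adam",
    loss="mean_squared_error",        # the only thing the optimizer minimizes
    metrics=["mean_absolute_error"],  # reported per epoch, never optimized
)

# Random dummy data for demonstration only.
x = np.random.rand(64, 4).astype("float32")
y = np.random.rand(64, 1).astype("float32")
model.fit(x, y, epochs=2, verbose=1)
```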

Generally speaking, it is a good idea to look at multiple accuracy metrics, since you get a more complete picture of what your network is or is not learning. Just always keep in mind which loss function you actually deploy for training your network.

Another scenario is that you want to keep track of an accuracy metric which is not differentiable, e.g. the IoU of bounding boxes. You would then use it as a second accuracy metric, while training with a differentiable loss function.
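For illustration (not from the original answer), here is a hypothetical IoU helper for axis-aligned boxes; the (x1, y1, x2, y2) coordinate format is an assumption. You would evaluate this on predicted vs. ground-truth boxes after each epoch, while the network itself trains on a differentiable loss such as smooth L1.

```python
# Hypothetical IoU for two axis-aligned boxes in (x1, y1, x2, y2) format.
# Used for evaluation only; training would use a differentiable loss instead.
def box_iou(box_a, box_b):
    # Coordinates of the intersection rectangle.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

print(box_iou((0, 0, 2, 2), (1, 1, 3, 3)))  # intersection 1, union 7 -> ~0.143
```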

mrk