So, I know that the error rate of a hypothesis h is the proportion of examples (x, y) for which h(x) ≠ y, but I'm not sure why that is, or how it relates to the loss.
- The loss function is simply 1 on an error and 0 otherwise. – Attersson May 13 '18 at 21:22
- Unless you meant expected loss, which is the expected value of the aforementioned loss function and is therefore the error rate. – Attersson May 13 '18 at 21:24
- Have a look at [this answer](https://stackoverflow.com/questions/47817424/loss-accuracy-are-these-reasonable-learning-curves/47819022#47819022) (where the error rate is referred to as accuracy). – desertnaut May 15 '18 at 01:38