The documentation for IBM's SPSS Modeler defines neural network quality as:
For a continuous target, this is 1 minus the ratio of the mean absolute error in prediction (the average of the absolute values of the predicted values minus the observed values) to the range of predicted values (the maximum predicted value minus the minimum predicted value).
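If I'm reading that definition correctly, with $y_i$ the observed values and $\hat{y}_i$ the predicted values, the formula would be:

$$ \text{quality} = 1 - \frac{\frac{1}{n}\sum_{i=1}^{n} \lvert \hat{y}_i - y_i \rvert}{\max_i \hat{y}_i - \min_i \hat{y}_i} $$

And here is a minimal sketch of how I think that would be computed. This is just my reading of the documentation, not code taken from SPSS Modeler, and the function name is my own:

```python
import numpy as np

def spss_quality(observed, predicted):
    """My interpretation of the SPSS Modeler 'quality' for a continuous target:
    1 minus the ratio of the mean absolute error to the range of *predicted* values."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    mae = np.mean(np.abs(predicted - observed))        # mean absolute error
    pred_range = predicted.max() - predicted.min()     # range of predicted values
    return 1.0 - mae / pred_range

# Example: errors that are small relative to the prediction range give a value near 1.
obs = [10.0, 12.0, 15.0, 20.0]
pred = [11.0, 12.5, 14.0, 19.0]
print(spss_quality(obs, pred))  # approx. 0.89 with these numbers
```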
Is this calculation standard?
I'm having trouble understanding how quality is derived from this.