
I am trying to participate in my first Kaggle competition, where RMSLE is given as the required loss function. Since I found nothing on how to implement this loss function, I tried to settle for RMSE. I know this was part of Keras in the past; is there any way to use it in the latest version, maybe with a custom function via the backend?

This is the NN I designed:

from keras.models import Sequential
from keras.layers.core import Dense , Dropout
from keras import regularizers

model = Sequential()
model.add(Dense(units = 128, kernel_initializer = "uniform", activation = "relu", input_dim = 28, activity_regularizer = regularizers.l2(0.01)))
model.add(Dropout(rate = 0.2))
model.add(Dense(units = 128, kernel_initializer = "uniform", activation = "relu"))
model.add(Dropout(rate = 0.2))
model.add(Dense(units = 1, kernel_initializer = "uniform", activation = "relu"))
model.compile(optimizer = "rmsprop", loss = "root_mean_squared_error")#, metrics =["accuracy"])

model.fit(train_set, label_log, batch_size = 32, epochs = 50, validation_split = 0.15)

I tried a custom root_mean_squared_error function I found on GitHub, but as far as I can tell the syntax is not what is required. I think y_true and y_pred would have to be defined before being passed to the return, but I have no idea how exactly; I just started programming in Python and I am really not that good at math...

from keras import backend as K

def root_mean_squared_error(y_true, y_pred):
        return K.sqrt(K.mean(K.square(y_pred - y_true), axis=-1)) 

I receive the following error with this function:

ValueError: ('Unknown loss function', ':root_mean_squared_error')

Thanks for your ideas, I appreciate every help!

dennis

6 Answers


When you use a custom loss, you need to pass it without quotes: you are passing the function object itself, not a string:

import tensorflow.keras.backend as K

def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(y_pred - y_true)))

model.compile(optimizer = "rmsprop", loss = root_mean_squared_error, 
              metrics =["accuracy"])
Dr. Snoopy
  • Works perfectly fine, thank you very much for pointing out that mistake. I really did not think about it that way as I am kind of new to programming. You would not know by any chance how to edit this custom function so that it computes the root mean square LOGARITHMIC error, would you? – dennis May 09 '17 at 07:52
  • It gives me Unknown loss function:root_mean_squared_error – Jitesh Sep 13 '17 at 12:41
  • @Jitesh Please do not make such comments, make your own question with source code. – Dr. Snoopy Sep 13 '17 at 12:42
  • @Jitesh You're probably putting quotes around the function's name. You need to pass the function object to the compile function, not its name. – carllacan May 12 '18 at 14:50
  • you mean `metrics=['mse']`? – muon Oct 14 '18 at 19:06
  • This code gives the same value as MAE, not RMSE (see the answer below). – Jo.Hen May 05 '20 at 20:31
  • I just updated the answer, by setting axis=None (the default), it will take the mean over all dimensions. – Dr. Snoopy May 05 '20 at 20:40
  • @muon mse stands for Mean Square Error. The difference is taken and then squared, followed by taking the mean. This is different than RMSE (Root Mean Squared Error) because the square root is taken of the whole operation of the Mean Square Error. – zipline86 May 27 '20 at 13:11
  • You should always add the import `import tensorflow.keras.backend as K` (I added it to the answer) – Bersan Mar 24 '21 at 14:37

The accepted answer originally contained an error (reducing over axis=-1), which caused the RMSE to actually be the MAE, as per the following issue:

https://github.com/keras-team/keras/issues/10706

The correct definition should be

from keras import backend as K

def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(y_pred - y_true)))
Germán Sanchis
  • Thank you very much for this comment! I spent so much time trying to figure out why my RMSE results (using the code above) were the same as MAE. – Jo.Hen May 05 '20 at 20:30
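To see why the old axis=-1 version degrades to MAE for a single-output model, here is a small sketch (NumPy used as a stand-in for the Keras backend ops; the array values are made up for illustration):

```python
import numpy as np

# With a single output column, reducing over axis=-1 averages exactly one
# element, so sqrt(mean(square(d), axis=-1)) collapses to |d| per sample.
# Keras then averages the per-sample losses, yielding MAE instead of RMSE.
y_true = np.array([[1.0], [2.0], [4.0]])
y_pred = np.array([[2.0], [2.0], [1.0]])

per_sample = np.sqrt(np.mean(np.square(y_pred - y_true), axis=-1))
buggy_loss = per_sample.mean()                        # what axis=-1 reports
mae = np.mean(np.abs(y_pred - y_true))                # identical value
rmse = np.sqrt(np.mean(np.square(y_pred - y_true)))   # the intended metric

print(buggy_loss, mae, rmse)
```

With these values the axis=-1 loss and the MAE coincide, while the true RMSE is larger.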

If you are using a recent TensorFlow nightly build, there is a tf.keras.metrics.RootMeanSquaredError() in the source code, even though RMSE does not appear in the documentation.

sample usage:

model.compile(tf.compat.v1.train.GradientDescentOptimizer(learning_rate),
              loss=tf.keras.losses.mean_squared_error,
              metrics=[tf.keras.metrics.RootMeanSquaredError(name='rmse')])
Richard Xue
  • I get an error when I try to use it as a loss function: `AttributeError: 'RootMeanSquaredError' object has no attribute '__name__'` even though I used the name parameter. – rjurney Nov 10 '20 at 20:52
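For intuition about what a stateful metric like RootMeanSquaredError tracks across batches, here is a hedged pure-NumPy sketch (the class name and values are illustrative, not the TensorFlow implementation): it keeps running sums of squared error and sample count, then reports the square root of their ratio.

```python
import numpy as np

# Illustrative stand-in for a stateful RMSE metric: update_state()
# accumulates squared error and count, result() reports the running RMSE.
class RunningRMSE:
    def __init__(self):
        self.sq_sum = 0.0
        self.count = 0

    def update_state(self, y_true, y_pred):
        diff = np.asarray(y_pred, dtype=float) - np.asarray(y_true, dtype=float)
        self.sq_sum += float(np.sum(np.square(diff)))
        self.count += diff.size

    def result(self):
        return (self.sq_sum / self.count) ** 0.5

m = RunningRMSE()
m.update_state([1.0, 2.0], [2.0, 2.0])   # batch 1
m.update_state([4.0], [1.0])             # batch 2
print(m.result())                        # sqrt((1 + 0 + 9) / 3)
```

This accumulation over batches is why such objects are used under metrics= rather than as a plain per-batch loss function.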

I prefer to reuse part of the work Keras has already done:

from keras import backend as K
from keras.losses import mean_squared_error

def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(mean_squared_error(y_true, y_pred))

model.compile(optimizer = "rmsprop", loss = root_mean_squared_error,
              metrics = ["accuracy"])
George C
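The identity this answer relies on is simply RMSE = sqrt(MSE). A quick NumPy check with made-up values:

```python
import numpy as np

# RMSE is the square root of MSE, so wrapping Keras' mean_squared_error
# in K.sqrt() gives the right result without reimplementing anything.
y_true = np.array([1.0, 2.0, 4.0])
y_pred = np.array([2.0, 2.0, 1.0])

mse = np.mean(np.square(y_pred - y_true))   # (1 + 0 + 9) / 3
rmse = np.sqrt(mse)
print(mse, rmse)
```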

You can do RMSLE the same way RMSE is shown in the other answers; you just also need to incorporate the log function:

from tensorflow.keras import backend as K

def root_mean_squared_log_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(K.log(1+y_pred) - K.log(1+y_true))))
rbarden
  • 66
  • 3
  • note that y_pred and y_true need to be float values -> `K.sqrt(K.mean(K.square(K.log(float(y_pred+1)) - K.log(float(y_true+1)))))` – fogx Jan 24 '22 at 12:44
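As a sanity check (a NumPy sketch, not Keras code), the RMSLE formula above is just RMSE applied to log(1 + x)-transformed values, which is also why training on log-transformed labels with a plain RMSE loss, as the question does with label_log, is a common workaround. The sample values are made up for illustration:

```python
import numpy as np

def rmse(a, b):
    return np.sqrt(np.mean(np.square(a - b)))

def rmsle(y_true, y_pred):
    # np.log1p(x) computes log(1 + x), numerically stable for small x
    return rmse(np.log1p(y_true), np.log1p(y_pred))

y_true = np.array([1.0, 9.0, 99.0])
y_pred = np.array([0.0, 19.0, 99.0])

# RMSLE penalizes relative (ratio) errors: being off by a factor of two
# costs the same whether the target is 1 or 9.
print(rmsle(y_true, y_pred))
```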

Just like before, but a more simplified (direct) version for RMSLE using the Keras backend:

import tensorflow as tf
import tensorflow.keras.backend as K

def root_mean_squared_log_error(y_true, y_pred):
    msle = tf.keras.losses.MeanSquaredLogarithmicError()
    return K.sqrt(msle(y_true, y_pred)) 
Difagama