
I want to make predictions in real time (one timestep at a time), but I want to verify that the results are the same as when the whole sequence is given as input at once. This is the code:

import numpy as np
from keras.layers import Input, Dense, SimpleRNN
from keras.models import Model

#MODEL 1, for training
x = Input(shape=(None, 1))
h = SimpleRNN(30, return_sequences=True)(x)
out = Dense(1)(h)
model = Model(x, out)
model.compile(loss='mse', optimizer='adam')

X_train = np.random.rand(100, 50, 1)
y_train = np.random.rand(100, 50, 1)
model.fit(X_train, y_train, verbose=False)

#MODEL 1, for predictions in real time
x = Input(batch_shape=(1, None, 1))
h = SimpleRNN(30, stateful=True, return_sequences=True)(x)
out = Dense(1)(h)
predict_model = Model(x, out)
predict_model.set_weights(model.get_weights())

X = np.random.rand(2, 2, 1)
predictions = model.predict(X, verbose=False)
for sim in range(len(predictions)):
    for i in range(len(predictions[0])):
        pred = predict_model.predict(X[sim:(sim + 1), i:(i + 1), :], verbose=False)
        print(pred[0][0])           # prediction in real time
        print(predictions[sim][i])  # prediction with MODEL 1
        print()
    predict_model.reset_states()

It prints this:

[0.09156141]
[0.09156139]

[-0.38076958]
[-0.38076955]

[0.12214336]
[0.12214339]

[-0.52013564]
[-0.5201356]

The results should be exactly the same, since both models have the same weights. What is happening?
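Since float32 results computed along different code paths are generally not bitwise equal, the usual check is a tolerance comparison rather than exact equality. A minimal sketch, using hypothetical stand-in arrays holding the printed values above (in the real script these would be collected from `predict_model` and `model`):

```python
import numpy as np

# Hypothetical stand-ins for the two sets of predictions printed above.
pred_stepwise = np.float32([0.09156141, -0.38076958, 0.12214336, -0.52013564])
pred_full = np.float32([0.09156139, -0.38076955, 0.12214339, -0.5201356])

# Differences of ~1e-7 are at the float32 precision limit, so compare
# with a tolerance instead of with ==.
print(np.allclose(pred_stepwise, pred_full, atol=1e-6))  # True
```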

  • It may be a floating point error. https://stackoverflow.com/questions/249467/what-is-a-simple-example-of-floating-point-rounding-error – imM4TT Jan 23 '23 at 16:26
  • But if the difference were just a floating point error, both predictions should still round to the same value. – Jose Gonzalez B Jan 23 '23 at 16:38
  • Yeah I think you are right. Is it possibly some randomness in the SimpleRNN Keras model then? – imM4TT Jan 23 '23 at 16:46
  • You could restart your program to check whether the results differ from the previous run or not – imM4TT Jan 23 '23 at 16:48
  • Once the model is trained, randomness does not intervene. – Jose Gonzalez B Jan 23 '23 at 17:42
  • Yes, if I restart the program I get different results because the input changes (I am not using the same seed). But the problem persists. Sometimes both values match – Jose Gonzalez B Jan 23 '23 at 17:47
  • I have tested the code with the same seed and the problem persists. If I write this: import tensorflow as tf; np.random.seed(0); tf.random.set_seed(0), I always get this: [0.09520281] [0.09520283] [0.33807808] [0.33807802] [0.0579582] [0.05795821] [0.18572299] [0.18572299] – Jose Gonzalez B Jan 24 '23 at 09:46
  • Right. There are layers that include randomness, such as dropout, that make predictions differ across runs of the same model, but in your case I don't think that's the cause (SimpleRNN has no dropout by default). – imM4TT Jan 24 '23 at 13:00
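The floating-point explanation in the comments can be made concrete: float32 addition is not associative, so computing the same recurrent sums with different batch shapes and operation orders (full sequence at once vs. one timestep at a time) can legitimately change the last bits of the result. A minimal sketch, unrelated to Keras, just illustrating float32 behavior:

```python
import numpy as np

# float32 addition is not associative: summing the same terms in a
# different order can change the result.
a = np.float32(1e-8)
b = np.float32(1.0)
c = np.float32(-1.0)

left = (a + b) + c   # a is absorbed into b (below half an ulp of 1.0), then b and c cancel
right = a + (b + c)  # b and c cancel first, so a survives

print(left == right)  # False: left is 0.0, right is 1e-8
```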

0 Answers