
Here I create an LSTM model from my data and then predict values with it.

What I want to do next is add new input to the trained model and then, based on that new input, predict the value one hour ahead with the trained LSTM model.

But I don't know how to do it. Does anyone know how?

Here is my code for training the model.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(16, return_sequences=True, input_shape=(None, x_train_n.shape[2])))  # returns a sequence of 16-dimensional vectors
model.add(LSTM(16, return_sequences=True))  # returns a sequence of 16-dimensional vectors
model.add(LSTM(8))  # returns a single 8-dimensional vector
model.add(Dense(1))
batchsize = 32
model.compile(loss="mean_squared_error", optimizer="adam")
history = model.fit(x_train_n, y_train_n, batch_size=batchsize, epochs=30,
                    validation_data=(x_test_n, y_test_n), shuffle=True)

model.reset_states()
pred = model.predict(x_test_n)

That is the LSTM neural network model I built from the earlier data.

Now I want to feed a new input into the model and predict the next hour's value of x1:

date                       x1       x2   x3   x4
2019/8/23 06:30:00         20        0   0    0

So how do I get the predicted value of x1 at t+1 from this new row?
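Is this roughly how the single next-hour prediction should look with the Sequential model above? (I am assuming here that the data were scaled with a scikit-learn MinMaxScaler; the names scaler_x, scaler_y and recent_rows are placeholders, not variables from my code.)

import numpy as np

# Assumed: `recent_rows` holds the last `num_time_steps` rows of [x1, x2, x3, x4]
# (ending with the new 2019/8/23 06:30:00 row) as a (num_time_steps, 4) array.
window = scaler_x.transform(recent_rows)                       # same feature scaler fitted on the training data
window = window.reshape(1, window.shape[0], window.shape[1])   # (batch, timesteps, features)

next_x1_scaled = model.predict(window)                         # shape (1, 1), still in scaled units
next_x1 = scaler_y.inverse_transform(next_x1_scaled)[0, 0]     # back to the original x1 units
print("predicted x1 at t+1:", next_x1)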

Following @thushv89's suggestion, I wrote this code:

from tensorflow.keras import models, layers
import numpy as np

n_chars = 1
timesteps = num_time_steps

# Training model: LSTM with a TimeDistributed Dense head
inp = layers.Input(shape=(timesteps, x_train_n.shape[2]))
lstm = layers.LSTM(100, return_sequences=True)
out1 = lstm(inp)
dense = layers.Dense(n_chars, activation='sigmoid')
out2 = layers.TimeDistributed(dense)(out1)
model = models.Model(inp, out2)
model.summary()

# Inference model: one timestep at a time, with the LSTM states fed back in explicitly
inp_infer = layers.Input(shape=(1, x_train_n.shape[2]))
h_inp_infer = layers.Input(shape=(100,))
c_inp_infer = layers.Input(shape=(100,))
# We need return_state=True, so we create a new LSTM layer for inference
lstm_infer = layers.LSTM(100, return_state=True, return_sequences=True)
out1_infer, h, c = lstm_infer(inp_infer, initial_state=[h_inp_infer, c_inp_infer])
out2_infer = layers.TimeDistributed(dense)(out1_infer)

model_infer = models.Model([inp_infer, h_inp_infer, c_inp_infer], [out2_infer, h, c])
lstm_infer.set_weights(lstm.get_weights())  # copy the trained weights into the inference LSTM
model_infer.summary()

# Generate step by step, feeding each prediction (and the LSTM states) back in
x = np.random.randint(0, 2, size=(1, 1, x_train_n.shape[2]))
h = np.zeros(shape=(1, 100))
c = np.zeros(shape=(1, 100))
seq_len = 10
for _ in range(seq_len):
    print(x)
    y_pred, h, c = model_infer.predict([x, h, c])
    y_pred = y_pred[:, -1, :]
    y_onehot = np.zeros(shape=(x.shape[0], n_chars))
    y_onehot[np.arange(x.shape[0]), np.argmax(y_pred, axis=1)] = 1.0
    # Note: feeding y_onehot back in only matches if n_chars equals the number of input features
    x = np.expand_dims(y_onehot, axis=1)

The output came out like this:

[[[0 0 1 1 1 1 1 0]]]

So it only contains 0s and 1s.

About this code I have the following doubts:

  1. The output is only 0s and 1s, so how can I invert these values to get the real values back?
  2. How do I feed a new csv file in, to get the predicted value of the x1 column?
  3. How does this code help predict the future value?

Can you explain these questions?
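For doubt 2, this is roughly how I picture reading a new csv file and turning it into input for the first (Sequential) model. Is this the right idea? (The file name new_data.csv and the scaler_x variable are placeholders I made up; I am assuming the csv has the same date, x1, x2, x3, x4 columns as the table above.)

import pandas as pd

# Hypothetical file name; assumed to have the same columns as shown above
new_df = pd.read_csv('new_data.csv', parse_dates=['date'])
new_df = new_df.set_index('date')                 # keep the dates as the index for plotting later
features = new_df[['x1', 'x2', 'x3', 'x4']].values

# Scale with the scaler fitted on the training data (assumed to exist as `scaler_x`)
features_scaled = scaler_x.transform(features)

# Use the last `num_time_steps` rows as one prediction window
window = features_scaled[-num_time_steps:].reshape(1, num_time_steps, features.shape[1])
next_x1_scaled = model.predict(window)            # next-hour x1, still in scaled units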

  • I am assuming you need to use your previous input to predict a new one and use that predicted value to predict the next and so on right? I provided an answer to a similar question earlier. See if [this](https://stackoverflow.com/questions/58931024/keras-sequence-models-how-to-generate-data-during-test-generation/58932920#58932920) post helps. – thushv89 Nov 28 '19 at 05:11
  • @thushv89 yes, you are correct – team Nov 28 '19 at 05:17
  • @thushv89 yes, I will look at it. Thank you very much. – team Nov 28 '19 at 05:17
  • @thushv89 Can you explain something in your code: when you create the second model, why did you put 1 in the shape, i.e. `inp_infer = layers.Input(shape=(1, n_chars))`? – team Nov 28 '19 at 05:24
  • So, the LSTM only expects a 3D tensor (i.e. (batch size, timesteps, input_dim)) even when you have a single input. So you have to "fool" the LSTM into thinking your single input is a time series input by adding 1 for the time dimension. – thushv89 Nov 28 '19 at 05:25
  • @thushv89 So then if I have four inputs, I have to put (4, n_chars) (output), isn't it? – team Nov 28 '19 at 05:27
  • Not exactly, so your `x1, x2, x3, x4`, at the time of generation you have to consider them as individual inputs. Because really, when predicting using `x1` you shouldn't know about `x2` (because that's future). So your input size should be `(1, x1.shape[2])` – thushv89 Nov 28 '19 at 05:30
  • @thushv89 Oh great, got it. Great explanation. Let me try your code; if I get any errors I will let you know, and I hope you will help me. One more thing I want to know: I have a date column, so how should I handle it for the training model, e.g. set it as the index or drop it? When I try to plot the actual and predicted values against the date (time), it doesn't plot. Why does that happen? – team Nov 28 '19 at 05:36
  • I'm not sure if I follow you. Do you want to drop the date? – thushv89 Nov 28 '19 at 11:09
  • @thushv89 yes, I want to drop the date. After writing my code, I will add a comment tagging your name. – team Nov 29 '19 at 04:25
  • @thushv89 Hi, I ran your code and got some values, but I have no idea how this works, or how it will give the future predicted value of x1 from a new csv file. I have some doubts, and I wrote the questions and the code above. Can you look at it? It would be very helpful to me. – team Dec 06 '19 at 09:29
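
Regarding the shape question @thushv89 answered in the comments above (an LSTM always expects a 3D tensor of (batch_size, timesteps, input_dim)), this is how I understand the reshaping of a single row of the four inputs; new_row is just an illustrative name, not a variable from my code:

import numpy as np

new_row = np.array([20, 0, 0, 0], dtype=float)   # one observation of x1, x2, x3, x4
x_single = new_row.reshape(1, 1, -1)             # add batch and time dimensions
print(x_single.shape)                            # (1, 1, 4)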
