```
# Assumed imports for this snippet:
# import matplotlib.pyplot as plt
# from sklearn.preprocessing import MinMaxScaler
# from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator
# from tensorflow.keras.models import Sequential
# from tensorflow.keras.layers import SimpleRNN, Dense, Activation
# from tensorflow.keras.optimizers import Adam

# Hold out the last `length` rows for testing
split = self.data['sold'].size - self.length

if self.showGraph:
    self.data.plot()
    plt.show()

self.train = self.data[:split]
self.test = self.data[split:]

# Fit the scaler on the training set only, then apply the same
# transform to the test set to avoid leakage
self.scaler = MinMaxScaler()
self.trainS = self.scaler.fit_transform(self.train)
self.testS = self.scaler.transform(self.test)

self.batch_size = 1
self.trainGen = TimeseriesGenerator(
    self.trainS, targets=self.trainS,
    length=self.length, batch_size=self.batch_size)
# Note: this uses length - 1, so the test windows are one step shorter
# than the training windows the model's input_shape was built for
self.testGen = TimeseriesGenerator(
    self.testS, targets=self.testS,
    length=self.length - 1, batch_size=self.batch_size)

print(self.data.isnull().sum())  # sanity check for missing values

self.input_shape = (self.length, self.train.columns.size)
self.batch_input_shape = (self.batch_size, self.length, self.train.columns.size)

self.model = Sequential()
self.model.add(SimpleRNN(12, return_sequences=False, input_shape=self.input_shape))
self.model.add(Activation('relu'))
self.model.add(Dense(1))
self.model.add(Activation('relu'))

# 'accuracy' is not a meaningful metric for a regression target;
# 'mae' would be more informative here
self.model.compile(optimizer=Adam(learning_rate=0.001), loss='mse',
                   metrics=['accuracy'])

print(self.model.summary())
```
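For clarity, here is a minimal NumPy sketch of the sliding-window batching that `TimeseriesGenerator` performs above (the helper name and the 20-row toy series are illustrative, not from the original code):

```python
import numpy as np

def make_windows(series, length):
    """Build (samples, length, features) windows and next-step targets,
    mirroring what TimeseriesGenerator yields with batch_size=1."""
    X, y = [], []
    for i in range(len(series) - length):
        X.append(series[i:i + length])   # window of `length` past steps
        y.append(series[i + length])     # the value right after the window
    return np.array(X), np.array(y)

data = np.arange(20, dtype=float).reshape(-1, 1)  # 20 rows, 1 feature
X, y = make_windows(data, length=9)
print(X.shape, y.shape)  # (11, 9, 1) (11, 1)
```

This is why the `input_shape=(self.length, n_features)` passed to `SimpleRNN` must match the generator's `length`; a test generator built with `length - 1` produces windows one step too short.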

[Graph: predictions in orange, real results in blue, yearly view]

I have tried LSTM and SimpleRNN, and many different variants of both. I have tried between 1024 and 6 units, dropout, running without the scaler, changing the batch size, L1 and L2 regularizers during fitting, resampling the data daily, weekly, monthly, and yearly, a duplication technique to get about 10,000 data rows, a plain ANN, `batch_input_shape`, and much more, but I can't find a solution at the moment.

My data count:

Daily   = 1895
Yearly  = 1895 / 365
Monthly = 1895 / 12
Weekly  = 1895 / 7

Duplicated = 20100, and I have tried shuffling.
  • Please help me. I can send my loss and val_loss graphs as well. – Marcus Rose Apr 04 '21 at 13:38
  • Try feeding differences rather than the signal itself. See https://stackoverflow.com/questions/65205506/lstm-autoencoder-problems – Gulzar Apr 11 '21 at 18:00
  • I tried it, but I'm stuck between using LSTM or SimpleRNN. SimpleRNN gives better results than a straight line, but LSTM gives a straight line; it's kind of upside down for vanishing-gradient techniques. – Marcus Rose Apr 14 '21 at 09:39
  • `model.add(LSTM(100, return_sequences=True, input_shape=(9,1)))` `model.add(Dropout(0.5))` `model.add(Activation('elu'))` # Am I doing something wrong here? – Marcus Rose Apr 14 '21 at 09:46
  • Try to make the LSTM learn the difference from a running exponential mean – Gulzar Apr 14 '21 at 10:50
  • I have a SimpleRNN solution, but it has a drawback: after a certain number of epochs it starts to fit and then the gradient vanishes. Is there a way around this? – Marcus Rose Apr 16 '21 at 08:15
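One way to act on the "learn the difference from a running exponential mean" suggestion is sketched below; the `ema` helper and the `alpha` value are illustrative assumptions, not part of the original code:

```python
import numpy as np

def ema(x, alpha=0.2):
    """Running exponential mean of a 1-D series."""
    out = np.empty_like(x, dtype=float)
    out[0] = x[0]
    for i in range(1, len(x)):
        out[i] = alpha * x[i] + (1 - alpha) * out[i - 1]
    return out

sold = np.array([10., 12., 11., 15., 14., 18., 17., 20.])
baseline = ema(sold)
residual = sold - baseline   # train the network on this, not on `sold` itself
# At prediction time, add the baseline back to recover the original scale:
reconstructed = residual + baseline
assert np.allclose(reconstructed, sold)
```

The idea is that the residual is closer to stationary than the raw signal, which tends to be easier for an RNN to model than the trend itself.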

1 Answer


The solution involved the `length` parameter and a lot of vanishing-gradient problems. LSTM only slows down vanishing gradients rather than eliminating them, so I have switched to classification, as the data is too noisy for regression.
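The answer doesn't say which classes were used; one common way to reformulate a noisy regression target as classification is to predict the direction of change, sketched here as an assumption:

```python
import numpy as np

sold = np.array([10., 12., 11., 15., 14., 18.])
# Label each step 1 ("up") or 0 ("down/flat") relative to the previous value;
# the network then predicts a class instead of an exact noisy value
labels = (np.diff(sold) > 0).astype(int)
print(labels)  # [1 0 1 0 1]
```

With a binary target like this, the final `Dense(1)` would use a `sigmoid` activation and `binary_crossentropy` loss, and `accuracy` becomes a meaningful metric.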