Hi,
I have a sequence that looks like this (plus many more zeros):
[0, 0.66, 0, 0.66, 0, 0, 0, 0.55, 0, 0, 0, 3.18, 0, 0, 2, 0.6, 0]
I have the following Python code, modeled on the answer to "PyBrain time series prediction using LSTM recurrent nets":
from sys import stdout
from itertools import cycle

from pybrain.datasets import SequentialDataSet
from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure.modules import LSTMLayer
from pybrain.supervised import RPropMinusTrainer

# The sequence above (my real data has more zeros appended).
train = [0, 0.66, 0, 0.66, 0, 0, 0, 0.55, 0, 0, 0, 3.18, 0, 0, 2, 0.6, 0]

# Build a sequential dataset of (sample, next_sample) pairs.
ds = SequentialDataSet(1, 1)
for sample, next_sample in zip(train, cycle(train[1:])):
    ds.addSample(sample, next_sample)

# Recurrent network with a single LSTM hidden layer of 5 units.
net = buildNetwork(1, 5, 1, hiddenclass=LSTMLayer, outputbias=False, recurrent=True)
trainer = RPropMinusTrainer(net, dataset=ds)

train_errors = []
EPOCHS_PER_CYCLE = 5
CYCLES = 50
EPOCHS = EPOCHS_PER_CYCLE * CYCLES
for i in range(CYCLES):
    trainer.trainEpochs(EPOCHS_PER_CYCLE)
    train_errors.append(trainer.testOnData())
    epoch = (i + 1) * EPOCHS_PER_CYCLE
    print("\r epoch {}/{}".format(epoch, EPOCHS), end="")
    stdout.flush()
Getting the predictions on the training set:

res = []
for sample, target in ds.getSequenceIterator(0):
    r = net.activate(sample)
    res.append(r)
What I notice is that the network never predicts zeros; it always outputs something around 0.10. How should I tune my network to get better results?
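One thing I checked (sketched below; the number of trailing zeros is a stand-in, since I haven't pasted my full data): the constant value the net keeps predicting is suspiciously close to the mean of the training sequence, which is what a model minimizing squared error converges to when it can't fit the pattern.

# Sanity check: with many extra zeros appended, the mean of the
# sequence is close to the ~0.10 the network keeps predicting.
# The 60 extra zeros here are an assumption, not my real count.
seq = [0, 0.66, 0, 0.66, 0, 0, 0, 0.55, 0, 0, 0, 3.18, 0, 0, 2, 0.6, 0]
padded = seq + [0.0] * 60
mean = sum(padded) / len(padded)
print(round(mean, 3))  # prints 0.099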
Thank you