
I am fairly new to machine learning, but I have put together an LSTM network for educational purposes that seems to be working fairly well.

I have not been able to fully understand the numerical ranges for input and output variables. I normalized my input and training data so all variables are centered at 0 with a standard deviation of 1. When I test the network, all of my predictions fall between 0 and 1; there are never any negative values, even though the training data contained negative values.
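For reference, a minimal sketch of the standardization step described above (center to mean 0, scale to standard deviation 1), using the same toy values as the example below:

```python
import numpy as np

# Standardize a feature column to mean 0 and standard deviation 1,
# matching the preprocessing described in the question.
data = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
scaled = (data - data.mean()) / data.std()

print(scaled.mean())  # ~0.0
print(scaled.std())   # ~1.0
```

Note that standardized data like this naturally contains negative values, which is why a model that can only emit non-negative outputs will look wrong at prediction time.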

I have worked around this by creating one output for positive numbers and another for negative in my training data. For example:

Original training data:

data
-1.0
-0.5
0.0
0.5
1.0

becomes:

pos_data   neg_data
0.0        1.0
0.0        0.5
0.0        0.0
0.5        0.0
1.0        0.0

After I run the model, I convert the pos_data and neg_data back to a single column with positive and negative values. This seems to work, but feels like it should be unnecessary.
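The split-and-recombine workaround can be sketched in a few lines of NumPy (the model training itself is elided, since any regressor with non-negative outputs fits in the middle):

```python
import numpy as np

# Sketch of the workaround: split a signed series into two non-negative
# columns, then recombine the two predicted columns into one signed column.
data = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])

pos_data = np.clip(data, 0, None)   # positive part, zeros elsewhere
neg_data = np.clip(-data, 0, None)  # magnitude of the negative part

# ... train a model to predict pos_data and neg_data here ...

recombined = pos_data - neg_data
print(np.allclose(recombined, data))  # True
```

As the question notes, this round-trips correctly, but it doubles the output dimension to work around what turns out to be an activation-function issue.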

Does Keras allow negative values in the input or training data? If so, does anyone have any ideas why I would only be getting positive predictions when the model was trained with both positive and negative values?

Thank you!

1 Answer


Does Keras allow negative values in the input or training data?

Yes. A good example is BERT word embeddings in natural language processing, which contain both positive and negative values. Some data scalers also map features onto the interval [-1, 1].

If so, does anyone have any ideas why I would only be getting positive predictions when the model was trained with both positive and negative values?

If your model is a single LSTM cell, its output is run through a softmax before you are given your prediction.

[Image: diagram of an LSTM cell, provided by Rosand Liu]

The range of the softmax function is between 0 and 1, which would explain why you are getting only positive values!
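The same bounding applies to sigmoid, which (per the comment below) was the actual culprit here. A quick NumPy sketch of why the choice of final activation controls the sign of the output, with the fix being `tanh` (or `linear`) on the last Dense layer, e.g. `Dense(1, activation='tanh')` in Keras:

```python
import numpy as np

# Compare output ranges of two common activations on the same inputs:
# sigmoid maps everything into (0, 1); tanh maps into (-1, 1).
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-5, 5, 11)

print(sigmoid(x).min() > 0)  # True: sigmoid never produces negatives
print(np.tanh(x).min() < 0)  # True: tanh can produce negatives
```

So if the targets are standardized to mean 0 / std 1, a sigmoid (or softmax) on the output layer can never reach the negative half of the target range; `tanh` covers (-1, 1), and `linear` leaves the output unbounded.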

raceee
  • I had two LSTM layers, but was specifying 'sigmoid' as the activation function in the dense layer. I switched that over to 'tanh' and am getting the output I expected. Thank you! – Andrew Winter Feb 20 '20 at 21:52