
I want the y_pred output to be either +1 or -1 only. It should not contain intermediate real values, not even zero.

from keras.models import Sequential
from keras.layers import Dense

classifier = Sequential()

# Adding layers
# Adding the input layer and the first hidden layer
classifier.add(Dense(units = 6, kernel_initializer = 'uniform', activation = 'relu', input_shape = (22,)))
# Adding the second hidden layer
classifier.add(Dense(units = 6, kernel_initializer = 'uniform', activation = 'relu'))
# Adding the output layer
classifier.add(Dense(units = 1, kernel_initializer = 'uniform', activation = 'tanh'))

# Compiling Neural Network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])

# Fitting our model 
classifier.fit(x_train, y_train, batch_size = 10, epochs = 100)

# Predicting the Test set results
y_pred = classifier.predict(x_test)

The output values of y_pred are in the range [-1, 1], but I expected the values to be either 1 or -1.

  • Set a threshold, say 0, anything above zero is 1 and below it is -1 – Courage Jan 03 '19 at 07:22
  • @Oswald is there any modification I can do to the loss or activation parameters instead of modifying y_pred with y_pred[y_pred > 0] = 1; y_pred[y_pred <= 0] = -1? – Umesh Desai Jan 03 '19 at 07:51

2 Answers


To train properly, neural networks need activation functions that produce continuous (non-integer) values. If you need strictly discrete output, you have to translate the output values yourself.
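
For example, a minimal post-processing sketch along those lines, assuming y_pred is the NumPy array returned by classifier.predict(x_test) and using 0 as an illustrative cut-off for the tanh output:

import numpy as np

# Translate the continuous tanh outputs into hard labels:
# values above 0 become +1, everything else becomes -1.
y_pred = classifier.predict(x_test)
y_pred = np.where(y_pred > 0, 1, -1)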

Sami Hult

When you train with the binary_crossentropy loss, Keras expects the output to be a probability in [0, 1] and, when computing the accuracy metric, thresholds it at 0.5: anything above 0.5 counts as 1 and anything below as 0. The values returned by predict() are never rounded for you, and there is no easy way to change that 0.5 threshold in Keras, so you will have to write your own metric or threshold the predictions yourself.

Here is a Stack Overflow link that will guide you in doing that.
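
The linked post is not reproduced here, but as a rough illustration of "writing your own", a custom accuracy metric with a non-default threshold might look like the sketch below. The function name and the 0.7 threshold are made up for the example, and it assumes 0/1 labels with a sigmoid output rather than the tanh / -1 setup in the question:

import keras.backend as K

def binary_accuracy_with_threshold(y_true, y_pred):
    # Count a prediction as class 1 only when it exceeds 0.7 instead of the default 0.5.
    threshold = 0.7
    y_pred_labels = K.cast(K.greater(y_pred, threshold), K.floatx())
    return K.mean(K.cast(K.equal(y_true, y_pred_labels), K.floatx()), axis=-1)

classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy',
                   metrics = [binary_accuracy_with_threshold])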