
When I train the model, I can't get the accuracy.

model.fit(train, train_label,
          batch_size=64,
          epochs=12,
          verbose=1,
          validation_data=(test, test_label))

When I train on the data, the output looks like this:

Epoch 1/12
8000/8000 [==============================] - 166s 21ms/step - loss: 0.4743 - val_loss: 0.2727

It shows the loss values, but no accuracy.

When I evaluate,

score = model.evaluate(test, test_label, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])

it tells me that:

 IndexError: invalid index to scalar variable.

I don't know why the score is a scalar variable.

How can I get my accuracy?


1 Answer


When you compile a Keras model, you specify the metrics you want to monitor for that model. From the documentation:

model.compile(loss='mean_squared_error',
              optimizer='sgd',
              metrics=['mae', 'acc'])

Here we have specified that we would like the model to output Mean Absolute Error (mae) and Accuracy (acc).
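
To make that concrete, here is a hypothetical toy example (the model, data, and names `toy`, `x`, `y` are made up purely for illustration): each metric you compile adds one entry to the list returned by `evaluate`, and `model.metrics_names` tells you the order.

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Hypothetical toy model, only used to inspect the shape of evaluate()'s output.
# ('acc' is meaningless for a regression toy like this; it just mirrors the docs snippet.)
toy = Sequential([Dense(1, input_shape=(4,))])
toy.compile(loss='mean_squared_error',
            optimizer='sgd',
            metrics=['mae', 'acc'])

x = np.random.rand(10, 4)
y = np.random.rand(10, 1)

# One entry per tracked quantity: loss first, then each compiled metric
print(toy.metrics_names)              # e.g. ['loss', 'mean_absolute_error', 'acc']
print(toy.evaluate(x, y, verbose=0))  # a list of three numbers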

By default your model tracks only the loss. Since that is the only value reported, the result of `.evaluate` is a single number (a scalar), which doesn't support indexing, hence the error.
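
Applied to the question's code, a minimal sketch of the fix (assuming a binary classification task; `'binary_crossentropy'` and `'adam'` below are placeholders, keep whatever loss and optimizer you already use and just add the `metrics` argument):

# Recompile with an accuracy metric before training
# (loss and optimizer here are assumed placeholders; keep your own)
model.compile(loss='binary_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])

model.fit(train, train_label,
          batch_size=64,
          epochs=12,
          verbose=1,
          validation_data=(test, test_label))

# With the loss plus one metric, evaluate() now returns a list: [loss, accuracy]
score = model.evaluate(test, test_label, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])

The training log will then also show an accuracy column alongside the loss (named acc / val_acc or accuracy / val_accuracy, depending on the Keras version).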

Good job (+1), but notice that in regression settings (i.e. `mse` or `mae` loss) we normally don't use accuracy, which is a classification setting & meaningless in regression; see [What function defines accuracy in Keras when the loss is mean squared error (MSE)?](https://stackoverflow.com/questions/48775305/what-function-defines-accuracy-in-keras-when-the-loss-is-mean-squared-error-mse/48788577#48788577). Didn't know though that this is how it is suggested in Keras docs - I'll open an issue at Github... – desertnaut Mar 24 '19 at 10:19