
I ran my sample code using Keras.

from keras.models import Sequential
from keras.layers import BatchNormalization, Flatten, Dense
from keras.optimizers import Adam

model = Sequential([
    BatchNormalization(axis=1, input_shape=(3, 224, 224)),
    Flatten(),
    Dense(10, activation='softmax')])

model.compile(Adam(lr=1e-4), loss='categorical_crossentropy', metrics=['accuracy'])
model.fit_generator(batches, batches.nb_sample, nb_epoch=2,
                    validation_data=test_batches, nb_val_samples=test_batches.nb_sample)

It gave this output:

None
Epoch 1/2
500/500 [==============================] - 147s - loss: 2.2464 - acc: 0.3520 - val_loss: 6.4765 - val_acc: 0.1100
Epoch 2/2
500/500 [==============================] - 140s - loss: 0.8074 - acc: 0.7880 - val_loss: 3.8807 - val_acc: 0.1450

I'm not able to find the meaning of loss, acc, val_loss and val_acc. Any explanation or a link to the docs would be helpful.

This is the closest to what I'm looking for. In the above code I'm fitting the model, but it also reports a validation accuracy. From which dataset is this validation accuracy calculated?


2 Answers


Loss is the objective function that you are minimizing to train the neural network. The loss value reported during training is the mean of the loss function over the batches of the training set seen so far in the epoch. Accuracy (acc) is the corresponding mean accuracy over those training batches; accuracy itself is simply the fraction of samples in the dataset that the model classifies correctly.

The val_ metrics (val_loss, val_acc), on the other hand, are computed on the full validation set, i.e. the dataset you passed via the validation_data parameter. They are reported at the end of each epoch so you can check for overfitting during training.
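
To make the accuracy part concrete, here is a small sketch (the arrays are made up, not the question's data) showing that categorical accuracy is just the fraction of samples whose predicted class matches the label:

import numpy as np

# Made-up softmax outputs and one-hot labels for 4 samples and 3 classes
preds = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.3, 0.3, 0.4],
                  [0.6, 0.3, 0.1]])
labels = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [1, 0, 0],
                   [0, 0, 1]])

# Categorical accuracy: compare the argmax of predictions and labels
acc = np.mean(np.argmax(preds, axis=1) == np.argmax(labels, axis=1))
print(acc)  # 0.5, i.e. 2 of the 4 samples are classified correctly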

Dr. Snoopy
  1. Regarding your first question: I respectfully recommend you familiarize yourself with the basic mechanics of a neural network or look into one of the many MOOCs, e.g. this excellent one from fast.ai. This is also beyond the scope of this forum, since it doesn't seem to be about programming.

  2. Your validation accuracy is calculated from the data you provide via the validation_data parameter of your model.fit_generator() call. In your case you have set it to test_batches, which is methodologically questionable. You need to split your data into three sets: one for training, one for validation (to monitor your training progress on unseen data and to get useful information for tuning your hyperparameters), and one for testing (to evaluate the final score of your model); see the sketch below.
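
As a rough sketch of such a three-way split (the arrays and split sizes here are made up; sklearn's train_test_split is just one way to do it):

import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.rand(100, 10)              # placeholder features
y = np.random.randint(0, 10, size=100)   # placeholder labels

# First hold out a test set, then split the remainder into train and validation
X_trainval, X_test, y_trainval, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(X_trainval, y_trainval, test_size=0.25, random_state=42)
# Result: 60% train, 20% validation, 20% test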

One more thing: nb_val_samples is not a parameter of fit_generator anymore. See documentation here.
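
For reference, a rough Keras 2 equivalent of the call in the question would look like the sketch below (assuming the generators expose .samples and .batch_size as Keras 2 directory iterators do; note that an epoch is now measured in batches, not samples):

# Keras 2-style call; batches / test_batches are the generators from the question
model.fit_generator(batches,
                    steps_per_epoch=batches.samples // batches.batch_size,
                    epochs=2,
                    validation_data=test_batches,
                    validation_steps=test_batches.samples // test_batches.batch_size)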

petezurich
  • Thanks, I'm following fast.ai :) I took the code from https://github.com/fastai/courses/blob/master/deeplearning1/nbs/statefarm-sample.ipynb . I'm using Keras 1.2.2, which does have nb_val_samples: https://faroit.github.io/keras-docs/1.2.2/models/sequential/#fit_generator – Netro Sep 27 '17 at 07:27