
I trained FCN32 from scratch on my data, but unfortunately I am getting a black image as output. Here is the loss curve: [training loss curve image]. I am not sure whether this training loss curve is normal or whether I have done something wrong.

I would really appreciate experts' ideas on the following:

  1. Why is the output a black image?
  2. Is the network overfitting?
  3. Should I change the lr_mult value in the Deconvolution layer from 0 to some other value? Thanks a lot.

Edit: I changed the lr_mult value in the Deconvolution layer from 0 to 3; the following shows the solver:

test_interval: 1000 #1000000 
display: 100
average_loss: 100
lr_policy: "step"
stepsize: 100000    
gamma: 0.1
base_lr: 1e-7
momentum: 0.99
iter_size: 1
max_iter: 500000
weight_decay: 0.0005

I got the following training-loss curve, and again I am getting a black image: [training loss curve image]. I do not know what the mistake is or why the network is behaving like this; could someone please share some ideas? Thanks.

S.EB
    What exactly do you mean by "black image"? Are you certain all values are exactly zero? How many labels do you have in your model? – Shai Mar 06 '17 at 21:18

2 Answers


There is an easy way to check whether you are overfitting on the training data or just did something wrong in the algorithm: predict on the training data and look at the output. If it is very similar or equal to the desired output, you are overfitting and will probably have to apply dropout and weight regularization.

If the output is also black on the training data, your labels or your optimization metric are probably wrong.
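For example, with pycaffe you can run the net on one of your training images and inspect the raw prediction. A minimal sketch, assuming a deploy prototxt, a saved snapshot, and an output blob named "score" (the usual name in the reference FCN nets); the file names and mean values are placeholders you need to adapt:

import numpy as np
import caffe

caffe.set_mode_cpu()
# placeholders: point these at your own deploy/snapshot files
net = caffe.Net('deploy.prototxt', 'snapshot.caffemodel', caffe.TEST)

# load one *training* image and preprocess it exactly as during training
im = caffe.io.load_image('train_image.png')    # H x W x 3, RGB, values in [0, 1]
im = im[:, :, ::-1] * 255.0                    # to BGR, [0, 255]
im -= np.array([104.0, 117.0, 123.0])          # mean subtraction (example values)
im = im.transpose(2, 0, 1)                     # to C x H x W

net.blobs['data'].reshape(1, *im.shape)
net.blobs['data'].data[...] = im
net.forward()

score = net.blobs['score'].data[0]   # (num_classes, H, W) class scores
pred = score.argmax(axis=0)          # per-pixel predicted label

# if this prints a single label, the net predicts one class everywhere,
# which is rendered as a black image
print(np.unique(pred, return_counts=True))

Comparing pred against the ground-truth label of the same training image tells you which case you are in: close to the ground truth means overfitting; a single class everywhere points at the labels or the loss.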

Thomas Pinetz

Should I change the lr_mult value in the Deconvolution layer from 0 to some other value?

lr_mult: 0 means that this layer does not learn (source, source 2). If you want that layer to learn, you should set it to a positive value. Depending on your initialization, this might very well be the reason why the image is black.
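As a sketch, a Deconvolution layer that is allowed to learn could look like this in the train prototxt. The layer/blob names follow the reference FCN nets and may differ in yours, and num_output has to match your number of classes; treat these values as assumptions, not your actual net:

layer {
  name: "upscore"
  type: "Deconvolution"
  bottom: "score_fr"
  top: "upscore"
  param { lr_mult: 1 }                    # 0 would freeze the layer
  convolution_param {
    num_output: 21                        # set to your number of classes
    kernel_size: 64                       # 32x upsampling for FCN32
    stride: 32
    bias_term: false
    weight_filler { type: "bilinear" }    # sensible upsampling initialization
  }
}

With a bilinear weight_filler the layer starts out as plain bilinear upsampling and can then adapt during training.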

Martin Thoma