
Caffe accuracy layer error when I modify the accuracy computation for my regression project. Here is the code I changed in the accuracy layer:

for (int i = 0; i < outer_num_; ++i) {
  for (int j = 0; j < inner_num_; ++j) {
    const Dtype diff = bottom_data[i * dim + j] - bottom_label[i * inner_num_ + j];
    const Dtype distance = sqrt(diff * diff);  // absolute prediction error
    if (distance <= 10) {
      ++accuracy;
    }
  }
}

but the result is:

I1008 22:14:37.701171 102764 caffe.cpp:286] Loss: 70993.9
I1008 22:14:37.701171 102764 caffe.cpp:298] accuracy = -1.#IND

here is my net.prototxt:

    layer {
        name: "framert"
        type: "HDF5Data"
        top: "data"
        top: "label"
        include {
            phase: TRAIN
        }
        hdf5_data_param {
            source: "G:/lab-zhang/caffe-windows/data/csv/train_data_list.txt"
            batch_size: 10
        }
    }
    layer {
        name: "inner1"
        type: "InnerProduct"
        bottom: "data"
        top: "inner1"
        param {
            lr_mult: 1
            decay_mult: 1.5
        }
        param {
            lr_mult: 2
            decay_mult: 0
        }
        inner_product_param {
            num_output: 50
            weight_filler {
                type: "xavier"
            }
            bias_filler {
                type: "constant"
                value: 0.1
            }
        }
    }
    layer {
        name: "inner2"
        type: "InnerProduct"
        bottom: "inner1"
        top: "inner2"
        param {
            lr_mult: 1
            decay_mult: 1.0
        }
        param {
            lr_mult: 2
            decay_mult: 0
        }
        inner_product_param {
            num_output: 1   
            weight_filler {
                type: "xavier"
            }
            bias_filler {
                type: "constant"
                value: 0.1
            }
        }
    }
    layer {
        name: "relu1"
        type: "ReLU"
        bottom: "inner2"
        top: "inner2"
        relu_param {
            engine: CAFFE
        }
    }
    layer {
        name: "accuracy"
        type: "Accuracy"
        bottom: "inner2"
        bottom: "label"
        top: "accuracy"
        include {
            phase: TEST
        }
    }
    layer {
        name: "loss"
        type: "EuclideanLoss"
        bottom: "inner2"
        bottom: "label"
        top: "loss"
    }

What is the reason for the wrong result, accuracy = -1.#IND?

1 Answer


The accuracy you get, -1.#IND, is how MSVC prints a value that is not a number (NaN).
Why you get NaN is not clear from the code you posted. I suspect you changed too much of the accuracy layer code and introduced a bug that leads to NaN.
Make sure you do not forget to update count, and that you write the computed accuracy back to top[0]->mutable_cpu_data()[0].


In general, it is best not to override existing layers, but rather to write new ones with the desired functionality.
When writing a new layer, please follow the guidelines in the Caffe wiki and in this git issue. In particular, write a test for your layer!

Shai
    Thanks for your help! I updated 'top[0]->mutable_cpu_data()[0] = accuracy / count;', and I will try to create my own accuracy layer. – Xiaoxiangding Oct 13 '16 at 02:56