Hi guys, I created an HDF5 dataset from RGB images using MATLAB, following the examples from Caffe. Before creating the dataset I convert the images to grayscale.
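For reference, here is a simplified sketch of how I build the dataset, based on the Caffe MATLAB HDF5 demo (the file names, resize step and scaling below are placeholders, not my exact script). Note that MATLAB stores arrays column-major, so an array written as W x H x C x N from MATLAB is seen by Caffe as N x C x H x W.

% simplified sketch of the dataset creation; file names are placeholders
files = dir('images/*.png');
N = numel(files);
data = zeros(32, 32, 1, N, 'single');        % W x H x C x N in MATLAB order
for i = 1:N
    img = imread(fullfile('images', files(i).name));
    img = rgb2gray(img);                      % convert RGB to grayscale
    img = imresize(img, [32 32]);
    data(:, :, 1, i) = single(img) / 255;     % scale pixel values to [0, 1]
end
labels = data;                                % reconstruction target = the input image
h5create('train.h5', '/data',  size(data),   'Datatype', 'single');
h5create('train.h5', '/label', size(labels), 'Datatype', 'single');
h5write('train.h5', '/data',  data);
h5write('train.h5', '/label', labels);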
The dataset is created without problems, but when I start training I get this error:
I0501 08:54:43.176820 14655 net.cpp:106] Creating Layer conv3
I0501 08:54:43.176826 14655 net.cpp:454] conv3 <- conv2
I0501 08:54:43.176831 14655 net.cpp:411] conv3 -> conv3
I0501 08:54:43.177955 14655 net.cpp:150] Setting up conv3
I0501 08:54:43.177970 14655 net.cpp:157] Top shape: 5 16 114 114 (1039680)
I0501 08:54:43.177975 14655 net.cpp:165] Memory required for data: 51416320
I0501 08:54:43.177983 14655 layer_factory.hpp:76] Creating layer loss
I0501 08:54:43.177991 14655 net.cpp:106] Creating Layer loss
I0501 08:54:43.177995 14655 net.cpp:454] loss <- conv3
I0501 08:54:43.178000 14655 net.cpp:454] loss <- label
I0501 08:54:43.178006 14655 net.cpp:411] loss -> loss
F0501 08:54:43.178031 14655 euclidean_loss_layer.cpp:12] Check failed: bottom[0]->count(1) == bottom[1]->count(1) (207936 vs. 1024) Input
*** Check failure stack trace: ***
If I change the bottoms of the EuclideanLoss layer so that both point to conv3, training runs, but that is wrong, right? As far as I understand, the check is comparing the per-sample element counts: conv3 produces 16 x 114 x 114 = 207936 values per sample, while each label has only 1 x 32 x 32 = 1024.
By the way, I am using this network for image reconstruction, and my data is stored as 4D arrays: data = [12342, 1, 32, 32] and labels = [12342, 1, 32, 32]. I tried changing the dimensions of the label array, but that does not work either.
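To double-check what is actually stored in the file, I inspect the dataset shapes like this (assuming the file is called train.h5):

% print the name and size of every dataset in the HDF5 file
info = h5info('train.h5');
for k = 1:numel(info.Datasets)
    fprintf('%s: %s\n', info.Datasets(k).Name, mat2str(info.Datasets(k).Dataspace.Size));
end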
Does anyone have any idea?
Thanks.