I have my training data and labels saved in data.mat (200 training samples with 6000 features each; the labels are -1/+1).
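For clarity, this is roughly what I mean by the layout of data.mat (a sketch with random stand-in values; the variable names and shapes are inferred from the conversion code below):

% hypothetical contents of data.mat (shapes inferred from the reshape calls below)
new_train_x = randn(200, 6000, 'single');   % training features
label_train = sign(randn(200, 1));          % training labels in {-1, +1}
test_x      = randn(77, 6000, 'single');    % test features
label_test  = sign(randn(77, 1));           % test labels in {-1, +1}
save('data.mat', 'new_train_x', 'label_train', 'test_x', 'label_test', '-v7.3');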
I am trying to convert the data (train and test) to HDF5 and run Caffe with:
load input.mat
% reshape to [num, channels, height, width] and reverse the dimension order with
% permute(..., [4:-1:1]): MATLAB is column-major, so the datasets end up as
% num x channels x height x width on the Caffe side
hdf5write('my_data.h5', '/new_train_x', single( permute(reshape(new_train_x, [200, 6000, 1, 1]), [4:-1:1]) ));
hdf5write('my_data.h5', '/label_train', single( permute(reshape(label_train, [200, 1, 1, 1]), [4:-1:1]) ), 'WriteMode', 'append');
hdf5write('my_data_test.h5', '/test_x', single( permute(reshape(test_x, [77, 6000, 1, 1]), [4:-1:1]) ));
hdf5write('my_data_test.h5', '/label_test', single( permute(reshape(label_test, [77, 1, 1, 1]), [4:-1:1]) ), 'WriteMode', 'append');
(See this thread regarding converting mat-files to hdf5 in Matlab).
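As a sanity check I can list the datasets that were written, and file.txt / file_test.txt simply list the HDF5 file paths, one per line (a sketch; the /path/to/ prefixes are placeholders for my actual paths):

% sanity check: list the datasets Caffe will look for
h5disp('my_data.h5');        % should show /new_train_x and /label_train
h5disp('my_data_test.h5');   % should show /test_x and /label_test

% file.txt contains one HDF5 file path per line, e.g.:
%   /path/to/my_data.h5
% and file_test.txt:
%   /path/to/my_data_test.h5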
My train_val.prototxt
is:
layer {
  type: "HDF5Data"
  name: "data"
  top: "new_train_x"   # note: same name as in HDF5
  top: "label_train"   #
  hdf5_data_param {
    source: "file.txt"
    batch_size: 20
  }
  include { phase: TRAIN }
}
layer {
  type: "HDF5Data"
  name: "data"
  top: "test_x"        # note: same name as in HDF5
  top: "label_test"    #
  hdf5_data_param {
    source: "file_test.txt"
    batch_size: 20
  }
  include { phase: TEST }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "new_train_x"
  top: "ip1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 30
    weight_filler {
      type: "gaussian"   # initialize the filters from a Gaussian
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "tanh1"
  type: "TanH"
  bottom: "ip1"
  top: "tanh1"
}
layer {
  name: "ip2"
  type: "InnerProduct"
  bottom: "tanh1"
  top: "ip2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 1
    weight_filler {
      type: "gaussian"   # initialize the filters from a Gaussian
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "loss"
  type: "TanH"
  bottom: "ip2"
  bottom: "label_train"
  top: "loss"
}
But I have a problem. It seems it cannot read my input data:
I1227 10:27:21.880826 7186 layer_factory.hpp:76] Creating layer data
I1227 10:27:21.880851 7186 net.cpp:110] Creating Layer data
I1227 10:27:21.880866 7186 net.cpp:433] data -> new_train_x
I1227 10:27:21.880893 7186 net.cpp:433] data -> label_train
I1227 10:27:21.880915 7186 hdf5_data_layer.cpp:81] Loading list of HDF5 filenames from: file.txt
I1227 10:27:21.880965 7186 hdf5_data_layer.cpp:95] Number of HDF5 files: 1
I1227 10:27:21.962596 7186 net.cpp:155] Setting up data
I1227 10:27:21.962702 7186 net.cpp:163] Top shape: 20 6000 1 1 (120000)
I1227 10:27:21.962738 7186 net.cpp:163] Top shape: 20 1 1 1 (20)
I1227 10:27:21.962772 7186 layer_factory.hpp:76] Creating layer ip1
I1227 10:27:21.962838 7186 net.cpp:110] Creating Layer ip1
I1227 10:27:21.962873 7186 net.cpp:477] ip1 <- new_train_x
I1227 10:27:21.962918 7186 net.cpp:433] ip1 -> ip1
I1227 10:27:21.979375 7186 net.cpp:155] Setting up ip1
I1227 10:27:21.979434 7186 net.cpp:163] Top shape: 20 30 (600)
I1227 10:27:21.979478 7186 layer_factory.hpp:76] Creating layer tanh1
I1227 10:27:21.979529 7186 net.cpp:110] Creating Layer tanh1
I1227 10:27:21.979557 7186 net.cpp:477] tanh1 <- ip1
I1227 10:27:21.979583 7186 net.cpp:433] tanh1 -> tanh1
I1227 10:27:21.979620 7186 net.cpp:155] Setting up tanh1
I1227 10:27:21.979650 7186 net.cpp:163] Top shape: 20 30 (600)
I1227 10:27:21.979670 7186 layer_factory.hpp:76] Creating layer ip2
I1227 10:27:21.979696 7186 net.cpp:110] Creating Layer ip2
I1227 10:27:21.979720 7186 net.cpp:477] ip2 <- tanh1
I1227 10:27:21.979746 7186 net.cpp:433] ip2 -> ip2
I1227 10:27:21.979796 7186 net.cpp:155] Setting up ip2
I1227 10:27:21.979825 7186 net.cpp:163] Top shape: 20 1 (20)
I1227 10:27:21.979854 7186 layer_factory.hpp:76] Creating layer loss
I1227 10:27:21.979881 7186 net.cpp:110] Creating Layer loss
I1227 10:27:21.979909 7186 net.cpp:477] loss <- ip2
I1227 10:27:21.979931 7186 net.cpp:477] loss <- label_train
I1227 10:27:21.979962 7186 net.cpp:433] loss -> loss
F1227 10:27:21.980006 7186 layer.hpp:374] Check failed: ExactNumBottomBlobs() == bottom.size() (1 vs. 2) TanH Layer takes 1 bottom blob(s) as input.
*** Check failure stack trace: ***
@ 0x7f44cbc68ea4 (unknown)
@ 0x7f44cbc68deb (unknown)
@ 0x7f44cbc687bf (unknown)
@ 0x7f44cbc6ba35 (unknown)
@ 0x7f44cbfd0ba8 caffe::Layer<>::CheckBlobCounts()
@ 0x7f44cbfed9da caffe::Net<>::Init()
@ 0x7f44cbfef108 caffe::Net<>::Net()
@ 0x7f44cc03f71a caffe::Solver<>::InitTrainNet()
@ 0x7f44cc040a51 caffe::Solver<>::Init()
@ 0x7f44cc040db9 caffe::Solver<>::Solver()
@ 0x41222d caffe::GetSolver<>()
@ 0x408ed9 train()
@ 0x406741 main
@ 0x7f44ca997a40 (unknown)
@ 0x406f69 _start
@ (nil) (unknown)
Aborted (core dumped)
Now, if I change the loss layer like this:
layer {
  name: "loss"
  type: "TanH"
  bottom: "ip2"
  top: "loss"
}
I have this problem:
F1227 10:53:17.884419 9102 insert_splits.cpp:35] Unknown bottom blob 'new_train_x' (layer 'ip1', bottom index 0)
*** Check failure stack trace: ***
@ 0x7f502ab5dea4 (unknown)
@ 0x7f502ab5ddeb (unknown)
@ 0x7f502ab5d7bf (unknown)
@ 0x7f502ab60a35 (unknown)
@ 0x7f502af1d75b caffe::InsertSplits()
@ 0x7f502aee19e9 caffe::Net<>::Init()
@ 0x7f502aee4108 caffe::Net<>::Net()
@ 0x7f502af35172 caffe::Solver<>::InitTestNets()
@ 0x7f502af35abd caffe::Solver<>::Init()
@ 0x7f502af35db9 caffe::Solver<>::Solver()
@ 0x41222d caffe::GetSolver<>()
@ 0x408ed9 train()
@ 0x406741 main
@ 0x7f502988ca40 (unknown)
@ 0x406f69 _start
@ (nil) (unknown)
Aborted (core dumped)
Many thanks! Any advice would be appreciated.