I've been implementing an autoencoder whose input vectors consist only of 0s and 1s, such as [1, 0, 1, 0, 1, 0, ...].
Likewise, I have a second autoencoder whose input vectors consist of values between 0 and 1, such as [0.123, 1, 0.9, 0.01, 0.9, ...]. In both cases each vector element is fed to one input node. The hidden layers use relu as the activation function and the output layer uses sigmoid.
I've seen examples of autoencoders where adam or adadelta is used as the optimizer and binary_crossentropy as the loss function. For that reason I used adadelta and binary_crossentropy in both of my autoencoders, but I'm not sure whether this is the correct configuration for both cases.
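For reference, here is a minimal sketch of the kind of setup I mean (in Keras; the input dimension, encoding size, number of hidden layers, and training data are just placeholders, not my actual values):

```python
from tensorflow import keras
from tensorflow.keras import layers
import numpy as np

# Placeholder dimensions -- my real vectors are longer.
input_dim = 100
encoding_dim = 32

# relu in the hidden (encoder) layer, sigmoid in the output layer;
# additional relu hidden layers are omitted here for brevity.
inputs = keras.Input(shape=(input_dim,))
encoded = layers.Dense(encoding_dim, activation='relu')(inputs)
decoded = layers.Dense(input_dim, activation='sigmoid')(encoded)
autoencoder = keras.Model(inputs, decoded)

# The configuration I'm unsure about: adadelta + binary_crossentropy,
# used both for the 0/1 inputs and for the [0, 1] continuous inputs.
autoencoder.compile(optimizer='adadelta', loss='binary_crossentropy')

# Example training call with random binary data (stand-in for my real data).
x_train = np.random.randint(0, 2, size=(1024, input_dim)).astype('float32')
autoencoder.fit(x_train, x_train, epochs=10, batch_size=64)
```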