
I am trying to generate synthetic data with a TensorFlow autoencoder, such that the synthetic data is very close to the given original data. But the autoencoder is not learning during the training phase: the cost generally does not decrease, and the synthetic data is unrelated to the original data. My code is given below:

import mpmath
import numpy as np
import tensorflow as tf

x = tf.placeholder("float", [None, COLUMN])
# Weights and biases to hidden layer
Wh = tf.Variable(tf.random_uniform((COLUMN, UNITS_OF_HIDDEN_LAYER), -1.0 / mpmath.sqrt(COLUMN), 1.0 / mpmath.sqrt(COLUMN)))
bh = tf.Variable(tf.zeros([UNITS_OF_HIDDEN_LAYER]))
h = tf.nn.sigmoid(tf.matmul(x, Wh) + bh)
# Weights and biases to output layer
Wo = tf.transpose(Wh) # tied weights
bo = tf.Variable(tf.zeros([COLUMN]))
y = tf.nn.sigmoid(tf.matmul(h, Wo) + bo)

# Objective functions
cross_entropy = tf.reduce_mean(tf.pow(x - y, 2))  # note: despite the name, this is mean squared error
optimizer = tf.train.RMSPropOptimizer(LEARNING_RATE).minimize(cross_entropy)
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)

train_number, _ = x_train.shape


for j in range(TRAINING_EPOCHS):
    sample = np.random.randint(train_number, size=BATCH_SIZE)
    batch_xs = x_train[sample][:]
    _, cost = sess.run([optimizer, cross_entropy], feed_dict={x: batch_xs})
    print("COST: ", cost)

encodedTensor = tf.nn.sigmoid(tf.add(tf.matmul(x_train.astype(np.float32), Wh), bh))  # cast so dtypes match the float32 variables
encodedData = sess.run(encodedTensor)

decodedTensor = tf.nn.sigmoid(tf.add(tf.matmul(encodedData, Wo), bo))
decodedData = sess.run(decodedTensor)
return decodedData
user3104352

2 Answers


Have you tried a different weight initialization, for example:

Wh = tf.Variable(tf.random_normal([n_inputs, n_hiddens1], 0, 0.1), name='W1')
bh = tf.Variable(tf.random_normal([n_hiddens1], 0, 0.1), name='b1')

This works fine in my code. Furthermore,

 Wo = tf.transpose(Wh) # tied weight

may not be such a good idea, because during gradient descent TensorFlow might treat those as the same (or dependent) variables and modify both at every iteration of the optimization.

Either way, if you use TensorBoard you will see whether either of these two possible bugs applies.
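To make the untied alternative concrete, here is a minimal sketch of the forward pass with an independent decoder matrix, shown in plain NumPy just to illustrate the shapes (the sizes `COLUMN` and `HIDDEN` below are hypothetical, not from the question):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
COLUMN, HIDDEN = 6, 3                              # hypothetical sizes
Wh = rng.normal(0.0, 0.1, size=(COLUMN, HIDDEN))   # encoder weights
bh = np.zeros(HIDDEN)
Wo = rng.normal(0.0, 0.1, size=(HIDDEN, COLUMN))   # independent decoder weights (untied)
bo = np.zeros(COLUMN)

x = rng.random((4, COLUMN))                        # dummy batch of 4 rows
h = sigmoid(x @ Wh + bh)                           # encode
y = sigmoid(h @ Wo + bo)                           # decode back to input shape
```

In TensorFlow this would mean declaring `Wo` as its own `tf.Variable` of shape `(UNITS_OF_HIDDEN_LAYER, COLUMN)` instead of `tf.transpose(Wh)`, so the optimizer updates encoder and decoder weights separately.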

J.Zagdoun

Your loss function is MSE. It would be better to switch to a cross-entropy loss for autoencoders. You can find detailed information about it here: link.
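For reference, the (binary) cross-entropy reconstruction loss can be sketched in plain NumPy as below; this is only an illustration of the formula, and assumes the inputs are scaled to [0, 1] so they are comparable with the sigmoid outputs:

```python
import numpy as np

def binary_cross_entropy(x, y, eps=1e-7):
    # mean over all elements of -[x*log(y) + (1-x)*log(1-y)];
    # eps clips y away from 0 and 1 to avoid log(0)
    y = np.clip(y, eps, 1.0 - eps)
    return float(np.mean(-(x * np.log(y) + (1.0 - x) * np.log(1.0 - y))))
```

The loss is near zero when the reconstruction matches the input and grows quickly as it diverges, which typically gives stronger gradients through the sigmoid output layer than MSE does.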

Deniz Beker