The loss function is defined as shown in the link. The training is based on two different datasets, i and k, which correspond to the two parts of the loss function. I would like to train using:
for epoch in range(training_epochs):
    avg_cost = 0.
    # Number of full batches in the first dataset
    total_batch = int(total_len1 / batch_size)
    # Loop over all batches
    for i in range(total_batch):
        batch_x = X_train1[i * batch_size:(i + 1) * batch_size]
        batch_y = Y_train1[i * batch_size:(i + 1) * batch_size]
        # Run optimization op (backprop) and cost op (to get loss value)
        _, c, p = sess.run([optimizer, cost, pred],
                           feed_dict={x: batch_x, y: batch_y})
        # Accumulate the average loss over the epoch
        avg_cost += c / total_batch
But I am struggling with how to implement the training loop for such a two-part loss function. A sketch of the direction I have in mind is below, though I am not sure it is the right approach. Thanks in advance.
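Here is a minimal sketch of what I imagine it could look like, assuming the graph defines two placeholder pairs (x_i, y_i) and (x_k, y_k) and that cost is built as the sum of the two loss terms; x_i, y_i, x_k, y_k, X_train2, Y_train2, and total_len2 are all hypothetical names, not variables I actually have:

# Hypothetical setup: one placeholder pair per dataset, and a
# combined cost that sums the loss term over i and the one over k.
# Use as many full batches as both datasets can supply.
total_batch = min(int(total_len1 / batch_size), int(total_len2 / batch_size))
for epoch in range(training_epochs):
    avg_cost = 0.
    for i in range(total_batch):
        # One batch from each dataset per optimization step
        batch_xi = X_train1[i * batch_size:(i + 1) * batch_size]
        batch_yi = Y_train1[i * batch_size:(i + 1) * batch_size]
        batch_xk = X_train2[i * batch_size:(i + 1) * batch_size]
        batch_yk = Y_train2[i * batch_size:(i + 1) * batch_size]
        # Feeding both batches into the same run means a single
        # backprop step sees the full two-part loss.
        _, c = sess.run([optimizer, cost],
                        feed_dict={x_i: batch_xi, y_i: batch_yi,
                                   x_k: batch_xk, y_k: batch_yk})
        avg_cost += c / total_batch

Is this roughly how one feeds two datasets into a single optimization step, or is a different structure needed?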