I have another TensorFlow query:

I'm training a regression model, saving the weights and biases and then restoring them to rerun the model on a different data set. At least, that's what I'm trying to do. Not all of my weights are being restored. Here's the code used for saving my variables:

# Add ops to save and restore all the variables.
saver = tf.train.Saver({**weights, **biases})

# Save the variables to disk.
save_path = saver.save(sess, "Saved_Vars.ckpt")

And here's my entire code for restoring and running the model:

# Network Parameters
n_hidden_1 = 9
n_hidden_2 = 56
n_hidden_3 = 8
n_input = 9
n_classes = 1

# TensorFlow Graph Input
x = tf.placeholder("float", [None, n_input])

# Create Multilayer Model
def multilayer_perceptron(x, weights, biases):
    # First hidden layer with RELU activation
    layer_1 = tf.add(tf.matmul(x, weights['h1']), biases['b1'])
    layer_1 = tf.nn.relu(layer_1)

    # Second hidden layer with RELU activation
    layer_2 = tf.add(tf.matmul(layer_1, weights['h2']), biases['b2'])
    layer_2 = tf.nn.relu(layer_2)

    # Third hidden layer with RELU activation
    layer_3 = tf.add(tf.matmul(layer_2, weights['h3']), biases['b3'])
    layer_3 = tf.nn.relu(layer_3)

    # Last output layer with linear activation
    out_layer = tf.matmul(layer_3, weights['out']) + biases['out']
    return out_layer

# weights and biases
weights = {
        'h1': tf.Variable(tf.zeros([n_input, n_hidden_1])),
        'h2': tf.Variable(tf.zeros([n_hidden_1, n_hidden_2])),
        'h3': tf.Variable(tf.zeros([n_hidden_2, n_hidden_3])),
        'out': tf.Variable(tf.zeros([n_hidden_3, n_classes]))
}

biases = {
        'b1' : tf.Variable(tf.zeros([n_hidden_1])),
        'b2': tf.Variable(tf.zeros([n_hidden_2])),
        'b3': tf.Variable(tf.zeros([n_hidden_3])),
        'out': tf.Variable(tf.zeros([n_classes]))
}

# Construct Model
pred = multilayer_perceptron(x, weights, biases)
pred = tf.transpose(pred)

# Initialize variables
init = tf.global_variables_initializer()

# RUNNING THE SESSION

# launch the session
sess = tf.InteractiveSession()


# Initialize all the variables
sess.run(init)

# Add ops to save and restore all the variables.
saver = tf.train.Saver({**weights, **biases})

# Restore variables from disk.
saver.restore(sess, "Saved_Vars.ckpt")

# Use the restored model to predict the target values
prediction = sess.run(pred, feed_dict={x:dataVar_scaled}) #pred.eval(feed_dict={x:X})

Now, here's what has me confused/frustrated/annoyed: from the weights I can restore 'h1', 'h2' and 'h3', but not 'out'. Why not 'out'? Am I doing something wrong? Please could you spend a few minutes to help me?

Many thanks

I am running Python 3.5 and TensorFlow 0.12 directly on Windows 10, using the Spyder IDE.

jlt199

1 Answer

It looks like you are overwriting one of the 'out' keys with this dictionary merge:

{**weights, **biases}

For example:

weights = {'h1':1, 'h2':2, 'out':3}
biases = {'b1':4, 'b2':5, 'out':6}
print({**weights, **biases})

{'h2': 2, 'out': 6, 'b2': 5, 'b1': 4, 'h1': 1}
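One way to avoid the collision (a minimal plain-Python sketch using the toy values above; the `bias_` prefix is an arbitrary choice, and the integers stand in for `tf.Variable` objects) is to give the bias entries distinct names before merging:

```python
# Hypothetical fix: prefix the bias keys so biases' 'out' no longer
# overwrites weights' 'out' when the two dicts are merged.
weights = {'h1': 1, 'h2': 2, 'out': 3}   # stand-ins for tf.Variable objects
biases = {'b1': 4, 'b2': 5, 'out': 6}

var_dict = {**weights, **{'bias_' + k: v for k, v in biases.items()}}
print(sorted(var_dict))
# With real variables this dict could then be passed to tf.train.Saver(var_dict),
# and every entry gets its own checkpoint name.
```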
Patrick Coady
  • Thank you so much, changing the variable names worked a treat! :) Please can you explain exactly what `{**weights, **biases}` is doing as I clearly don't have a good grasp? Thanks again – jlt199 Mar 02 '17 at 23:37
  • This [SO thread](http://stackoverflow.com/questions/36901/what-does-double-star-and-star-do-for-parameters) is pretty good. Also, Saver really wants a list, not a dictionary. I usually don't pass any values to Saver, then it defaults to saving anything that is savable. – Patrick Coady Mar 03 '17 at 00:31
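A quick pre-flight check can catch this kind of silent key collision before the Saver is constructed (a plain-Python sketch using the key names from the question; the integers stand in for the `tf.Variable` objects):

```python
# Any key shared between the two dicts is silently dropped by
# {**weights, **biases}, so detect collisions explicitly first.
weights = {'h1': 1, 'h2': 2, 'h3': 3, 'out': 4}
biases = {'b1': 5, 'b2': 6, 'b3': 7, 'out': 8}

collisions = weights.keys() & biases.keys()
print(sorted(collisions))        # → ['out']

merged = {**weights, **biases}
print(len(merged))               # 7 entries survive out of 8 variables
```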