
I am trying to implement the asynchronous version of actor-critic (A3C) in Keras and TensorFlow. I am using Keras only as a front-end for building my network layers (I update the parameters directly with TensorFlow). I have a global_model and one main TensorFlow session, and inside each thread I create a local_model that copies its parameters from the global_model. My code looks something like this:

import threading

import tensorflow as tf
from keras import backend as K

def main(args):
    config = tf.ConfigProto(log_device_placement=False, allow_soft_placement=True)
    sess = tf.Session(config=config)
    K.set_session(sess)  # K is the Keras backend
    global_model = ConvNetA3C(84, 84, 4, num_actions=3)

    threads = [threading.Thread(target=a3c_thread, args=(i, sess, global_model))
               for i in range(NUM_THREADS)]

    for t in threads:
        t.start()

def a3c_thread(i, sess, global_model):
    K.set_session(sess)  # registering a session for each thread (don't know if it matters)
    local_model = ConvNetA3C(84, 84, 4, num_actions=3)
    sync = local_model.get_from(global_model)  # I get the error here

    # in the get_from function I do tf.assign(dest.params[i], src.params[i])
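Roughly, get_from just builds one tf.assign per parameter to copy the global weights into the local model. A sketch of what I mean (the exact code differs; bundling the assigns with tf.group is only a convenience):

# inside ConvNetA3C
def get_from(self, src):
    # one assign op per parameter: copy the src (global) weights into this model
    ops = [tf.assign(self.params[i], src.params[i]) for i in range(len(self.params))]
    return tf.group(*ops)  # group the assigns into a single op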

I get a UserWarning from Keras:

UserWarning: The default TensorFlow graph is not the graph associated with the TensorFlow session currently registered with Keras, and as such Keras was not able to automatically initialize a variable. You should consider registering the proper session with Keras via K.set_session(sess)

followed by a TensorFlow error on the tf.assign operation saying the operations must be on the same graph:

ValueError: Tensor("conv1_W:0", shape=(8, 8, 4, 16), dtype=float32_ref, device=/device:CPU:0) must be from the same graph as Tensor("conv1_W:0", shape=(8, 8, 4, 16), dtype=float32_ref)

I am not exactly sure what is going wrong.

Thanks


1 Answer


The error comes from Keras because tf.get_default_graph() is sess.graph returns False. From the TF docs, tf.get_default_graph() returns the default graph for the current thread. The moment I start a new thread and build the model there, its variables are created on a separate graph specific to that thread rather than on sess.graph, which is why tf.assign complains that the two tensors come from different graphs. I can solve this by explicitly building the local model on the session's graph:

with sess.graph.as_default():
    local_model = ConvNetA3C(84,84,4,3)
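Put concretely, the whole thread body can be built under the session's graph. A sketch based on the question's code (calling sess.run(sync) to actually copy the weights is my assumption about how the sync op is used):

def a3c_thread(i, sess, global_model):
    # build the local model on the graph that owns the shared session,
    # not on this thread's fresh thread-local default graph
    with sess.graph.as_default():
        K.set_session(sess)
        local_model = ConvNetA3C(84, 84, 4, num_actions=3)
        sync = local_model.get_from(global_model)
        sess.run(sync)  # assumption: run the sync op to pull in the global weights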