
So I have a class:

import numpy as np
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Dense, Flatten
from keras.optimizers import Adam

class Trainer:
    def __init__(self, episodes):
        self.factorModel()

    def factorModel(self):
        self.model = Sequential()
        self.model.add(Conv2D(50, (3, 3), activation='relu', input_shape=(3, 200, 200), dim_ordering="th", strides=4))
        self.model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
        self.model.add(Conv2D(64, (5, 5), activation='relu'))
        self.model.add(MaxPooling2D(pool_size=(2, 2)))
        self.model.add(Dense(1000, activation='relu'))
        self.model.add(Flatten())
        self.model.add(Dense(4, activation='softmax'))
        self.model.compile(loss='categorical_crossentropy', optimizer=Adam(lr=0.01), metrics=['accuracy'])

    def do(self, state):
        return self.model.predict(np.array(state))[0]

When I try to call `do` I get an error like `ValueError: Tensor Tensor("dense_2/Softmax:0", shape=(?, 4), dtype=float32) is not an element of this graph.` The problem only occurs when I run the `do` function in a thread; with the same model and the same config, everything works fine as long as `do` is not run in a thread.
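
For context, the thread setup looks roughly like this (the exact call site isn't shown above, so the names here are illustrative):

import threading
import numpy as np

trainer = Trainer(episodes=100)  # hypothetical episode count
state = np.zeros((1, 3, 200, 200))  # dummy channels-first input matching input_shape

# calling trainer.do(state) directly works fine;
# wrapping the same call in a thread raises the ValueError below
worker = threading.Thread(target=trainer.do, args=(state,))
worker.start()
worker.join()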

Full error message:

  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "path", line 141, in do
     self.model.predict_classes(state)[0]
  File "path/.local/lib/python2.7/site-packages/keras/engine/sequential.py", line 268, in predict_classes
    proba = self.predict(x, batch_size=batch_size, verbose=verbose)
  File "path/.local/lib/python2.7/site-packages/keras/engine/training.py", line 1456, in predict
    self._make_predict_function()
  File "path/.local/lib/python2.7/site-packages/keras/engine/training.py", line 378, in _make_predict_function
    **kwargs)
  File "path/.local/lib/python2.7/site-packages/keras/backend/tensorflow_backend.py", line 3009, in function
    **kwargs)
  File "path/.local/lib/python2.7/site-packages/tensorflow/python/keras/backend.py", line 3479, in function
    return GraphExecutionFunction(inputs, outputs, updates=updates, **kwargs)
  File "path/.local/lib/python2.7/site-packages/tensorflow/python/keras/backend.py", line 3142, in __init__
    with ops.control_dependencies([self.outputs[0]]):
  File "path/.local/lib/python2.7/site-packages/tensorflow/python/framework/ops.py", line 5426, in control_dependencies
    return get_default_graph().control_dependencies(control_inputs)
  File "path/.local/lib/python2.7/site-packages/tensorflow/python/framework/ops.py", line 4867, in control_dependencies
    c = self.as_graph_element(c)
  File "path/.local/lib/python2.7/site-packages/tensorflow/python/framework/ops.py", line 3796, in as_graph_element
    return self._as_graph_element_locked(obj, allow_tensor, allow_operation)
  File "path/.local/lib/python2.7/site-packages/tensorflow/python/framework/ops.py", line 3875, in _as_graph_element_locked
    raise ValueError("Tensor %s is not an element of this graph." % obj)
ValueError: Tensor Tensor("dense_2/Softmax:0", shape=(?, 4), dtype=float32) is not an element of this graph.

I tried the solution from this question link, so I call `self.model._make_predict_function()` right after `self.factorModel()`, but as a result I get this error: `InvalidArgumentError: Tensor conv2d_1_input:0, specified in either feed_devices or fetch_devices was not found in the Graph`
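
In code, that attempt looked like this (only the constructor changes, the rest of the class stays the same):

class Trainer:
    def __init__(self, episodes):
        self.factorModel()
        # suggested fix from the linked question: build the predict function eagerly
        self.model._make_predict_function()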

OK, I found this question link, so probably there is no way to make a prediction in a thread.

So I made some changes to the code according to the suggestions, and now it looks like this:

import numpy as np
import tensorflow as tf
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Dense, Flatten
from keras.optimizers import Adam

class Trainer:
    def __init__(self, episodes):
        self.factorModel()
        self.graph = tf.get_default_graph()

    def factorModel(self):
        self.model = Sequential()
        self.model.add(Conv2D(50, (3, 3), activation='relu', input_shape=(3, 200, 200), dim_ordering="th", strides=4))
        self.model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
        self.model.add(Conv2D(64, (5, 5), activation='relu'))
        self.model.add(MaxPooling2D(pool_size=(2, 2)))
        self.model.add(Dense(1000, activation='relu'))
        self.model.add(Flatten())
        self.model.add(Dense(4, activation='softmax'))
        self.model.compile(loss='categorical_crossentropy', optimizer=Adam(lr=0.01), metrics=['accuracy'])

    def do(self, state):
        with self.graph.as_default():
            return self.model.predict(np.array(state))[0]

and as a result I get the following error:

Exception in thread Thread-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "path/Desktop/marioQProject/new_class_trainer.py", line 151, in do
    self.model.predict_classes(state)[0]
  File "path/.local/lib/python2.7/site-packages/keras/engine/sequential.py", line 268, in predict_classes
    proba = self.predict(x, batch_size=batch_size, verbose=verbose)
  File "path/.local/lib/python2.7/site-packages/keras/engine/training.py", line 1462, in predict
    callbacks=callbacks)
  File "path/.local/lib/python2.7/site-packages/keras/engine/training_arrays.py", line 324, in predict_loop
    batch_outs = f(ins_batch)
  File "patha/.local/lib/python2.7/site-packages/tensorflow/python/keras/backend.py", line 3292, in __call__
    run_metadata=self.run_metadata)
  File "path/.local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 1458, in __call__
    run_metadata_ptr)
FailedPreconditionError: Error while reading resource variable conv2d_1/bias from Container: localhost. This could mean that the variable was uninitialized. Not found: Resource localhost/conv2d_1/bias/N10tensorflow3VarE does not exist.
         [[{{node conv2d_1/Reshape/ReadVariableOp}}]]

1 Answer


TensorFlow is not really friendly with multithreading, but there's a workaround.

Do this:

import threading

import numpy as np
import tensorflow as tf
from keras.models import Sequential
from keras.layers import Conv2D

class Trainer:
    def __init__(self):
        self.factorModel()
        # [1] capture the graph the model was built in (the main thread's default graph)
        self.graph = tf.get_default_graph()

    def do(self, state):
        # [2] worker threads get a fresh default graph, so re-enter the captured one
        with self.graph.as_default():
            return self.model.predict(np.array(state))[0]

    def factorModel(self):
        self.model = Sequential()
        self.model.add(Conv2D(50, (3, 3), activation='relu', input_shape=(10, 10, 3), strides=4))
        self.model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

t = Trainer()

def fn():
    t.do(np.zeros((1, 10, 10, 3)))

if __name__ == '__main__':
    thread_one = threading.Thread(target=fn)
    thread_two = threading.Thread(target=fn)
    thread_one.start()
    thread_two.start()

BTW, if you don't specifically need channels-first ordering, then I recommend using TF's standard channels-last instead. Whether you get images directly from OpenCV or convert Pillow images to ndarrays with NumPy, you'll get channels-last by default.
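
For example, the first layer of the model above in channels-last form would look roughly like this (just a sketch, keeping the 200x200 RGB input from the question):

from keras.models import Sequential
from keras.layers import Conv2D

model = Sequential()
# channels-last: input is (height, width, channels) instead of (channels, height, width)
model.add(Conv2D(50, (3, 3), activation='relu', input_shape=(200, 200, 3), strides=4))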

Edit

Have you tried making sure the model works before sending it to threading? Like:

class Trainer:
    def __init__(self, episodes, model, graph):
        self.graph = graph
        self.model = model


model = Sequential()
model.add(Conv2D(...))
.
.
.
# make sure it runs here
model.predict(np.zeros((1, 3, 200, 200)))
# if you don't need to train, try skipping compile first
graph = tf.get_default_graph()
trainer = Trainer(episodes, model, graph)

Also, try the callable Model instead of Sequential, like:

from keras import models, layers
inp = layers.Input((200, 200, 3))
x = layers.Conv2D(50, (3, 3), activation='relu',strides=4)(inp)
x = layers.MaxPooling2D(pool_size=(2, 2), strides=(2, 2) )(x)
x = layers.Conv2D(64, (5, 5), activation='relu')(x)
.
.
.
x = layers.Dense(4, activation='softmax')(x)
model = models.Model(inp, x)
Natthaphon Hongcharoen
  • So I tried your solution and got the following error: `FailedPreconditionError: Error while reading resource variable conv2d_1/bias from Container: localhost. This could mean that the variable was uninitialized. Not found: Resource localhost/conv2d_1/bias/N10tensorflow3VarE does not exist.` I guess it could be a problem with my environment, but idk. – saint_burrito Dec 30 '19 at 13:02
  • Could you add the new traceback and your code to the main question, especially the former? – Natthaphon Hongcharoen Dec 30 '19 at 13:17
  • Everything is up to date. – saint_burrito Dec 30 '19 at 14:05
  • I assume that normal `model.predict` doesn't work either, since your traceback shows `self.model.predict_classes`. I added what I can think of. – Natthaphon Hongcharoen Dec 30 '19 at 14:41