
I am using one of Keras's pre-trained models, and the error comes up when trying to get predictions. I have the following code in my Flask server:

import os
from flask import Flask, request

from NeuralNetwork import *

app = Flask(__name__)
STATIC_PATH = os.getcwd() + "/static"

@app.route("/uploadMultipleImages", methods=["POST"])
def uploadMultipleImages():
    uploaded_files = request.files.getlist("file[]")
    getPredictionfunction = preTrainedModel["VGG16"]

    for file in uploaded_files:
        path = os.path.join(STATIC_PATH, file.filename)
        file.save(path)  # save the upload before predicting on it
        result = getPredictionfunction(path)
    return str(result)  # a view function must return a response

This is what I have in my NeuralNetwork.py file:

import numpy as np
from keras.applications.vgg16 import VGG16, preprocess_input, decode_predictions
from keras.preprocessing import image

preTrainedModel = {}

vgg16 = VGG16(weights='imagenet', include_top=True)

def getVGG16Prediction(img_path):
    model = vgg16
    img = image.load_img(img_path, target_size=(224, 224))  # VGG16 expects 224x224 RGB input
    x = image.img_to_array(img)
    x = np.expand_dims(x, axis=0)  # add a batch dimension
    x = preprocess_input(x)

    pred = model.predict(x)  # ERROR HERE
    return sorted(decode_predictions(pred, top=3)[0])  # sorted(), not the undefined sort()

preTrainedModel["VGG16"] = getVGG16Prediction

However, running the code below does not raise any error:

if __name__ == "__main__":
    STATIC_PATH = os.getcwd()+"/static"
    print(preTrainedModel["VGG16"](STATIC_PATH+"/18.jpg"))

Here is the full error: [screenshot of the traceback]

Any comment or suggestion is greatly appreciated. Thank you.

matchifang

2 Answers


Assuming the backend is set to TensorFlow, you should tie each Keras model to its own TensorFlow graph and session:

from tensorflow import Graph, Session
from keras import backend as K
from keras.models import load_model  # or whatever loader your model type needs

model_path_1 = 'model path'
graph1 = Graph()
with graph1.as_default():
    session1 = Session(graph=graph1)
    with session1.as_default():
        model_1 = load_model(model_path_1)  # depends on your model type

model_path_2 = 'model path2'
graph2 = Graph()
with graph2.as_default():
    session2 = Session(graph=graph2)
    with session2.as_default():
        model_2 = load_model(model_path_2)  # depends on your model type

and for predicting (shown for the first model; use `session2`/`graph2`/`model_2` for the second):

K.set_session(session1)
with graph1.as_default():
    prediction = model_1.predict(img_data)
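To make the dispatch pattern explicit, here is a minimal, library-free sketch of the same idea. The `StubContext`/`StubModel` classes and the `predict_in_context` helper are stand-ins of my own, not Keras API; with Keras, entering the context corresponds to `K.set_session()` plus the `graph.as_default()` block above:

```python
class StubContext:
    """Stand-in for a (graph, session) pair; __enter__/__exit__ mimic
    entering graph.as_default() after K.set_session()."""
    def __init__(self, name):
        self.name = name
    def __enter__(self):
        return self
    def __exit__(self, *exc):
        return False

class StubModel:
    """Stand-in for a loaded Keras model."""
    def __init__(self, name):
        self.name = name
    def predict(self, img_data):
        return f"{self.name} prediction for {img_data}"

# Registry: each model is stored together with the context it was built in,
# so they are always selected as a pair at request time.
MODELS = {
    "VGG16": (StubContext("graph1"), StubModel("VGG16")),
    "ResNet": (StubContext("graph2"), StubModel("ResNet")),
}

def predict_in_context(model_name, img_data):
    context, model = MODELS[model_name]
    with context:  # with Keras: K.set_session(...) + graph.as_default()
        return model.predict(img_data)

print(predict_in_context("VGG16", "18.jpg"))
```

The point of the registry is that a model must never be used under a different graph/session than the one it was created in, which is exactly what goes wrong when Flask serves requests from a fresh default graph.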
Sahil Shah
  • Worked for me, just had to change `from keras import backend` to `from keras import backend as K` – hru_d Feb 06 '20 at 19:07

EDIT: What I wrote below doesn't seem to work when deploying the application (I was only testing locally until now). It seems the model in app.config is loaded too often (at each request?).

Coincidentally, I had the same problem yesterday. There seem to be some problems in the interaction between TensorFlow and Flask. Unfortunately, I don't know enough about the internals of either one to really understand the problem, but I can offer a hack that helped me get it working. (Note: I was using Python 3, but I don't think that makes a difference here.)

The problem seems to occur when a model is initialized in the global namespace of your Flask application. Therefore, I loaded the model directly into app.config:

app.config.update({"MODEL": VGG16(weights='imagenet', include_top=True)})
# ...
app.config["MODEL"].predict(x)

Maybe you can load the model in your server.py instead of your NeuralNetwork.py and pass it to getVGG16Prediction together with img_path?
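That last suggestion could look roughly like the following sketch. The `FakeVGG16` class is a stand-in of my own so the wiring runs without Keras installed, and `getVGG16Prediction` is rewritten here to take the model as a parameter, which is not the questioner's original signature:

```python
# server.py (sketch): build the model once at startup and pass it into the
# prediction helper, instead of creating it at import time in NeuralNetwork.py.

class FakeVGG16:
    """Stand-in for keras.applications.VGG16, used only for illustration."""
    def predict(self, x):
        # shape mimics decode_predictions input/output: one batch entry
        return [[("n02123045", "tabby", 0.87)]]

def getVGG16Prediction(model, img_path):
    # Image loading/preprocessing elided; see NeuralNetwork.py above.
    x = img_path  # placeholder for the preprocessed image array
    return model.predict(x)[0]

model = FakeVGG16()  # done once, when server.py starts
result = getVGG16Prediction(model, "/static/18.jpg")
print(result)
```

This keeps the model's creation and its use in the same module, which sidesteps the global-namespace issue described above.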

leyhline
    In my case, the problem was that Flask was loading TensorFlow twice. When I changed to app.run(debug=False) it worked, but I do not know why. Here is the link to that question: http://stackoverflow.com/questions/42015797/is-tensorflow-loading-twice-in-pycharm. I'm not sure if it's the same in your case, but I hope it helps. Thank you for your suggestion :) – matchifang Feb 08 '17 at 14:59