
I run a Python script that loads Keras, TensorFlow, and the Keras model. Only then can I start making predictions, and loading everything takes a few seconds.

I can loop inside the Python script and get good performance when predicting in batches, but I also want good performance for independent prediction requests coming from PHP.

Has anyone had success with this approach? How can I make the Python script run as a service?

Dídac Royo
  • What do you mean 'permanently loaded'? What are you using this for? In a Python script, you can simply use `model = load_model('name_of_model.h5')` and then use `model.predict(x, batch_size=32, verbose=0)` to make predictions. The documentation for model.predict() is [here](https://keras.io/models/model/#predict) – Taivanbat Badamdorj May 08 '17 at 10:34
  • You only have to load the model once and you can predict however many times you want in the python script – Taivanbat Badamdorj May 08 '17 at 10:35
  • can you provide some sample script and a bit more explanation? Do you load your model every iteration? Do you have python permanently running? – Wilmar van Ommeren May 08 '17 at 15:04
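To illustrate the comments above, here is a minimal sketch of loading a model once and then predicting as many times as needed. The tiny in-memory model is a stand-in so the snippet is self-contained; in practice you would use `model = load_model('name_of_model.h5')` instead.

```python
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Stand-in model built in memory; in a real script this would be:
# model = load_model('name_of_model.h5')  -- done ONCE, at startup
model = Sequential([Input(shape=(4,)), Dense(1)])

# After loading, predict as often as you like without reloading anything.
x = np.random.rand(8, 4).astype('float32')
preds = model.predict(x, batch_size=32, verbose=0)
print(preds.shape)  # one prediction row per input row: (8, 1)
```

The expensive part (importing TensorFlow and deserializing the weights) happens only once; each subsequent `model.predict` call is cheap.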

2 Answers


I found a very good solution: converting my prediction Python script into a Flask app and running it with Gunicorn, a Python WSGI HTTP server. It works like a charm!
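A minimal sketch of what such a Flask app could look like. The route name, the input shape, and the in-memory stand-in model are all assumptions for illustration; in a real deployment you would load your saved model once at startup with `load_model`.

```python
# app.py -- minimal Flask prediction service (sketch, names are assumptions)
from flask import Flask, jsonify, request
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

app = Flask(__name__)

# Stand-in model so this sketch is self-contained; in practice:
# model = load_model('name_of_model.h5')  -- loaded once, not per request
model = Sequential([Input(shape=(4,)), Dense(1)])

@app.route('/predict', methods=['POST'])
def predict():
    # Expects JSON like {"inputs": [[0.1, 0.2, 0.3, 0.4], ...]}
    x = np.array(request.get_json()['inputs'], dtype='float32')
    preds = model.predict(x, batch_size=32, verbose=0)
    return jsonify(predictions=preds.tolist())

if __name__ == '__main__':
    app.run()
```

Served with Gunicorn (flags here are just an example), e.g. `gunicorn -w 1 -b 127.0.0.1:5000 app:app`, the model stays loaded in the worker process, so each PHP request (via cURL or similar) only pays the cost of one `predict` call, not the startup cost.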

Dídac Royo
  • I am trying to replicate something similar. Do you mind sharing your code? –  Feb 10 '19 at 14:36

I've found this resource with many options for solving this problem and making a trained model available in a production environment: https://www.reddit.com/r/MachineLearning/comments/545qei/question_about_using_pythonkeras_in_production/

Dídac Royo