
I want to host a PyTorch model in a FastAPI backend. When I run the code with python, it works fine: the unpickled model can use the defined class. When the same file is started with uvicorn, it cannot find the class definition.

The source code looks like this:

import uvicorn
import json
from typing import List
from fastapi import Body, FastAPI
from fastapi.encoders import jsonable_encoder
import requests
from pydantic import BaseModel

#from model_ii import Model_II_b

import dill as pickle
import torch as T
import sys

app = FastAPI()
current_model = 'model_v2b_c2_small_ep15.pkl'
verbose_model = False  # for model v2

class Model_II_b(T.nn.Module):
[...]
@app.post('/function')
def API_call(req_json: dict = Body(...)):
    try:
        # load model...
        model = pickle.load(open('models/' + current_model, 'rb'))
        result = model.dosomething_with(req_json)

        return result

    except Exception as e:
        raise e
        return {"error": str(e)}  # unreachable: the raise above exits first

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)

When I run this with `python main.py` it works fine and I am getting results. When I run it with `uvicorn main:app` and send a request, I get the following error:

AttributeError: Can't get attribute 'Model_II_b' on <module '__mp_main__' from '/opt/webapp/env/bin/uvicorn'>

Both should be using the same Python env, as I use the uvicorn from within the env.

I hope someone has an idea what is wrong with my setup or code.
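For context on errors like this: pickle (and dill) serializes class instances by reference, storing only the defining module and qualified class name in the stream, so the loading process must be able to import the class from that exact module path. A minimal sketch with a hypothetical stand-in class:

```python
import pickle

class Widget:  # hypothetical stand-in class
    pass

# The stream stores a reference (module + class name), not the class body,
# so unpickling in another process requires that Widget be importable from
# the module path recorded here.
data = pickle.dumps(Widget())
assert b'Widget' in data

# In the same process the lookup succeeds and we get a Widget back.
obj = pickle.loads(data)
assert isinstance(obj, Widget)
```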

Update Stacktrace:

(model_2) root@machinelearning-01:/opt/apps# uvicorn main:app --env-file /opt/apps/env/pyvenv.cfg --reload
INFO:     Loading environment from '/opt/apps/env/pyvenv.cfg'
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [164777] using statreload
INFO:     Started server process [164779]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     127.0.0.1:33872 - "POST /ml/v2/predict HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/opt/apps/env/lib/python3.6/site-packages/uvicorn/protocols/http/httptools_impl.py", line 385, in run_asgi
    result = await app(self.scope, self.receive, self.send)
  File "/opt/apps/env/lib/python3.6/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in __call__
    return await self.app(scope, receive, send)
  File "/opt/apps/env/lib/python3.6/site-packages/fastapi/applications.py", line 183, in __call__
    await super().__call__(scope, receive, send)  # pragma: no cover
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/applications.py", line 102, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/middleware/errors.py", line 181, in __call__
    raise exc from None
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/middleware/errors.py", line 159, in __call__
    await self.app(scope, receive, _send)
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/exceptions.py", line 82, in __call__
    raise exc from None
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/exceptions.py", line 71, in __call__
    await self.app(scope, receive, sender)
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/routing.py", line 550, in __call__
    await route.handle(scope, receive, send)
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/routing.py", line 227, in handle
    await self.app(scope, receive, send)
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/routing.py", line 41, in app
    response = await func(request)
  File "/opt/apps/env/lib/python3.6/site-packages/fastapi/routing.py", line 197, in app
    dependant=dependant, values=values, is_coroutine=is_coroutine
  File "/opt/apps/env/lib/python3.6/site-packages/fastapi/routing.py", line 149, in run_endpoint_function
    return await run_in_threadpool(dependant.call, **values)
  File "/opt/apps/env/lib/python3.6/site-packages/starlette/concurrency.py", line 34, in run_in_threadpool
    return await loop.run_in_executor(None, func, *args)
  File "/usr/lib/python3.6/concurrent/futures/thread.py", line 56, in run
    result = self.fn(*self.args, **self.kwargs)
  File "./main.py", line 155, in API_call
    raise e
  File "./main.py", line 129, in API_call
    model = pickle.load(open('models/' + current_model, 'rb'))
  File "/opt/apps/env/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/opt/apps/env/lib/python3.6/site-packages/dill/_dill.py", line 473, in load
    obj = StockUnpickler.load(self)
  File "/opt/apps/env/lib/python3.6/site-packages/dill/_dill.py", line 463, in find_class
    return StockUnpickler.find_class(self, module, name)
AttributeError: Can't get attribute 'Model_II_b' on <module '__mp_main__' from '/opt/apps/env/bin/uvicorn'>
  • Are you using docker? Also, does the code come from two separate files? Do you mind sharing the folder and file structure? – lsabi Jul 17 '20 at 20:24
  • @lsabi No docker involved, and it is in the same file; the above code is exactly the code from the file. It runs with python directly and delivers a forecast from the model when called via the FastAPI webserver. That is why I am really clueless at the moment. Regarding the structure, the model files are in a subfolder 'models/'. – Sebastian Steinfort Jul 20 '20 at 05:36
  • I see. Then, do you mind posting a little bit more of the stack trace? It's difficult to say from just the module __mp_main__ – lsabi Jul 20 '20 at 08:40
  • @lsabi I added the stacktrace to my post. Hope you find a hint! – Sebastian Steinfort Jul 20 '20 at 11:26
  • Could this be helpful? https://stackoverflow.com/questions/27732354/unable-to-load-files-using-pickle-and-multiple-modules – lsabi Jul 20 '20 at 13:35
  • @lsabi thank you for the hint. The custom unpickler solved my problem! – Sebastian Steinfort Jul 21 '20 at 05:58
  • I have the same issue but the custom unpickler route didn't solve my problem. I use `torch.load` to load my model and the model definition is right there, above that line. Is this an issue caused by uvicorn? – Hamman Samuel Jun 30 '21 at 06:40

1 Answer


With help from @lsabi I found the solution here: https://stackoverflow.com/a/51397373/13947506

The root cause appears to be that under uvicorn the file is imported as a module (and the reloader runs it in a subprocess under `__mp_main__`), so the class is no longer available in the module path that was recorded when the model was pickled. A custom unpickler that resolves the class itself solved my problem:

class CustomUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        if name == 'Model_II_b':
            from model_ii_b import Model_II_b
            return Model_II_b
        return super().find_class(module, name)

current_model = 'model_v2b_c2_small_ep24.pkl'

model = CustomUnpickler(open('models/' + current_model, 'rb')).load()
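For readers who want to try the technique in isolation, here is a self-contained sketch of the same idea, using a stand-in class and the stdlib pickle instead of dill:

```python
import io
import pickle

class Model_II_b:  # stand-in for the real torch model class
    pass

# Serialize an instance, then load it back through a custom unpickler
# that resolves the target class itself instead of trusting the module
# path recorded in the stream.
buf = io.BytesIO()
pickle.dump(Model_II_b(), buf)
buf.seek(0)

class CustomUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        if name == 'Model_II_b':
            return Model_II_b
        return super().find_class(module, name)

model = CustomUnpickler(buf).load()
assert isinstance(model, Model_II_b)
```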