
I have a working model using the transformers pipeline, and I want to serve it with FastAPI so I can send POST requests and get the predictions back.

The model works this way:

from transformers import pipeline

# Loading the model
classifier = pipeline(model=path_to_model)

# Making predictions
classifier(text)

The output is a list of dicts.
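
For example, for a text-classification pipeline the result typically looks something like this (the exact labels and scores depend on the model, so this is only illustrative):

# Illustrative output only — actual labels/scores depend on the model
[{'label': 'POSITIVE', 'score': 0.98}]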

My code is:

from fastapi import FastAPI

app = FastAPI()

@app.post("/predictions")
def extract_predictions(text):
    text = text.lower()
    out = classifier(text)
    return {
            "text_message": text,
            "predictions": out
           }

I can get predictions if I use localhost:8000/docs, but when I use Postman or Insomnia with a JSON body like {"text": "any string"}, I get a "field required" error.

The model takes a string as input, while my Postman request sends a JSON body. How can I update my code so that the input is read from the JSON body?

  • You specify a `response_model=Out`, which you don't seem to have defined in your API; hence, the error `name 'Out' is not defined`. Also, please have a look at the answers [here](https://stackoverflow.com/a/71849860/17865804), [here](https://stackoverflow.com/a/71846619/17865804) and [here](https://stackoverflow.com/a/71612734/17865804) on how to perform and return ML predictions. – Chris Jul 22 '22 at 03:09
  • Solved with `Body(..., embed=True)`. Thanks, Chris. – math_guy_shy Jul 23 '22 at 12:54

1 Answer


The `text` parameter, as currently defined, is expected to be a query parameter, not a body (JSON) parameter. Hence, you can either send it as a query parameter when POSTing your request, or define it using the `Body` field (e.g., `text: str = Body(..., embed=True)`), so that it is expected in the body of the request as JSON. Please have a look at the answers here and here for more details.
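
A minimal sketch of the full app with that change applied (assuming the classifier from the question is loaded from a local path; `path_to_model` is a placeholder to adjust to your setup):

from fastapi import FastAPI, Body
from transformers import pipeline

path_to_model = "path/to/your/model"  # placeholder — use your actual model path
classifier = pipeline(model=path_to_model)

app = FastAPI()

@app.post("/predictions")
def extract_predictions(text: str = Body(..., embed=True)):
    # embed=True makes FastAPI expect the value under the "text" key
    # in the JSON body, i.e., {"text": "any string"}
    text = text.lower()
    out = classifier(text)
    return {
            "text_message": text,
            "predictions": out
           }

A Postman/Insomnia POST to http://localhost:8000/predictions with the JSON body {"text": "any string"} should then return the predictions instead of the "field required" error.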
