I get the error `RangeError: attempting to construct out-of-bounds TypedArray on ArrayBuffer` when I try to load a TensorFlow.js model in React.

I'm using Express.js to send the json+bin files to React so that I can run inference in the browser itself.

Here's the relevant Express.js code. The json and bin files are all in the same folder.

app.use(
  "/api/pokeml/classify",
  express.static(path.join(__dirname, "classifier_models/original/model.json"))
)

Here's how I'm loading it in React:

import { useEffect, useState } from "react"
import * as tf from "@tensorflow/tfjs"

// inside the component:
const [model, setModel] = useState(null)

useEffect(() => {
  async function fetchModel() {
    // const mobileNetModel = await mobilenet.load()
    // const classifierModel = await axios.get("api/pokeml/classify")
    const classifierModel = await tf.loadLayersModel(
      "http://localhost:3001/api/pokeml/classify"
    )
    setModel(classifierModel)
  }
  fetchModel()
}, [])

The model.json file loads correctly but the shards do not:

[screenshot: browser network tab showing 404 errors for the shard files]

theairbend3r
  • have you generated the tf-js model yourself? – iamaatoh Jun 23 '20 at 07:14
  • Yes, I used tensorflowjs_converter to convert the python tf model to js. – theairbend3r Jun 23 '20 at 07:23
  • can you try loading a standard model from https://github.com/tensorflow/tfjs-models, and not a converted one? check if that loads ok? if that works, it might be because of a malformed model.json because of the converter – iamaatoh Jun 23 '20 at 07:25
  • I can't seem to find the json+bin files for the models in the repo you posted. Can you please send the link to it? Or do you want me to load a model directly, as `import * as mobilenet from "@tensorflow-models/mobilenet"`? If the latter, then yes, importing a model from the package works. Am I sending the files from Express correctly? It could either be malformed JSON/binary files, or I'm not properly sending the files from Express.js. – theairbend3r Jun 23 '20 at 07:33
  • IMO you seem to be sending the files ok - the `model.json` itself is not 404, just the shards - which are being loaded through `model.json`. yes, i meant direct import. if that works - it might be something i faced sometime back, documented it here: https://github.com/akshatamohanty/client-ai-template/blob/master/README.md#how-to-convert-an-ai-model-to-deploy-in-a-web-app – iamaatoh Jun 23 '20 at 07:53
  • i'd also try accessing the shard paths that show 404 directly, just to confirm – iamaatoh Jun 23 '20 at 07:57
  • I saw the repo you've attached. It said the issue could either be a bug during conversion or that a particular model cannot be converted. I used a mobilenet fine-tuned on my dataset, if that helps. Can you maybe send a (working) converted model (json + shards) so that I can test it? Or maybe you know where I can find one online? I want to make sure that there's no error with Express before I debug the tfjs model. – theairbend3r Jun 23 '20 at 13:24
  • I am able to access single shard files by serving a single binary file as - `app.use("/api/pokeml/classify", express.static(path.join(__dirname, "classifier_models/original/group1-shard1of3.bin")))` and accessed it in reactjs like `const classifierModel = await axios.get("api/pokeml/classify")`. Is this what you meant? – theairbend3r Jun 23 '20 at 13:34

1 Answer

I re-created your problem in a sandbox. Server | Client

The shards are requested from the same endpoint that served the model: once `model.json` has loaded, tf.js goes back to fetch `api/pokeml/group1-shard2of2.bin`, which is currently not accessible.

So,

app.use(
  "/api/pokeml/classify",
  express.static(path.join(__dirname, "classifier_models/original/model.json"))
);

// add this, 
// to allow access to the `original` folder from `api/pokeml` for 
// the shards to be accessible
app.use(
  "/api/pokeml",
  express.static(path.join(__dirname, "classifier_models/original"))
);

If this still doesn't solve your problem, you could try replacing the model/shards in the sandbox. That would allow you to detect if your model is malformed.

iamaatoh
  • Thank you so much! :D The model loads perfectly after fixing the Express.js paths. When I try to make a prediction on a sample image, I get the error `Error: Provided weight data has no target variable: Conv1_5/kernel`, which I think is file corruption. I'll retrain and convert a new model to test it out. – theairbend3r Jun 23 '20 at 15:15