I am developing an app with the Ionic 3 framework for recognizing drawn characters, and I am having trouble importing the model. I have imported the model from Keras (converted with tensorflowjs_converter) into my Ionic 3 app in two different ways:
- The model.json and weight files (shards) are placed in the folder /assets/models.
- The model.json and weight files (shards) are hosted in Firebase Storage.
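For reference, with the first (local) method I load the model straight from the assets path, roughly like this inside an async method (the variable name is just for illustration):
import * as tf from '@tensorflow/tfjs';

// Load model.json from the app's assets; tfjs then requests the shard
// files listed in the weights manifest relative to this path.
const localModel = await tf.loadModel('assets/models/model.json');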
When launching the app in the browser with the first method, the model and weights are loaded correctly and I am able to predict the classes. But when launching the app with the same method on my Android device with ionic cordova run android --device, the model does not seem to retrieve the data from the weight files and I get the following error: Based on the provided shape, [3, 3, 32, 64], the tensor should have 18432 values but has 917.
I then tried hosting the files in Firebase Storage to work around this issue. I retrieve the model.json from Storage and still get the same error as above, both in the browser and on the device.
From the experience of storing the shards and the model locally in the app, I conclude that the shards are not being found on the device with either approach.
Also, when using the Firebase Storage method on the device, trying to fetch the model from the URL throws the following error: Failed to fetch.
Here is the code that retrieves the model and the shards:
const modelURL: string = await this.db.getModel();
const shards: string[] = await this.db.getShards();
modelURL and shards contain the download URLs from Firebase Storage. The model and the shards are stored together at the same level:
/* Firebase Storage hierarchy */
https://firebasestorage.googleapis.com/v0/b/project-foo.com/o/model%2Fmodel.json?alt=media&token=******
https://firebasestorage.googleapis.com/v0/b/project-foo.com/o/model%2Fgroup1-shard1of4?alt=media&token=******
https://firebasestorage.googleapis.com/v0/b/project-foo.com/o/model%2Fgroup1-shard2of4?alt=media&token=******
https://firebasestorage.googleapis.com/v0/b/project-foo.com/o/model%2Fgroup1-shard3of4?alt=media&token=******
https://firebasestorage.googleapis.com/v0/b/project-foo.com/o/model%2Fgroup1-shard4of4?alt=media&token=******
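For completeness, getModel() and getShards() are thin wrappers around Firebase Storage download URLs; a simplified sketch is below (the exact import style depends on the Firebase SDK version, and error handling is omitted):
import * as firebase from 'firebase';

// Simplified sketch of the storage service: resolve the download URLs
// shown above via getDownloadURL().
async function getModel(): Promise<string> {
  return firebase.storage().ref('model/model.json').getDownloadURL();
}

async function getShards(): Promise<string[]> {
  const names = ['group1-shard1of4', 'group1-shard2of4',
                 'group1-shard3of4', 'group1-shard4of4'];
  return Promise.all(
    names.map(name => firebase.storage().ref(`model/${name}`).getDownloadURL())
  );
}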
With that, I pass the download URL of the model to tf.loadModel:
import * as tf from '@tensorflow/tfjs';

// modelURL is the Firebase Storage download URL of model.json
const model = await tf.loadModel(modelURL);
const output: any = model.predict(img);
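For context, img is the input tensor I build from the drawing canvas, roughly as below (same tf import as above; the canvas id and the 28x28 input size are placeholders for whatever the model actually expects):
// Build a grayscale input tensor from the drawing canvas.
const canvas = document.getElementById('drawCanvas') as HTMLCanvasElement;
const img = tf.fromPixels(canvas, 1)        // 1 channel: grayscale
  .resizeNearestNeighbor([28, 28])          // match the model's input size
  .toFloat()
  .div(tf.scalar(255))                      // normalize to [0, 1]
  .expandDims(0);                           // add the batch dimension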
So, is there a way to pass the shards fetched from Firebase Storage into tf.loadModel(), so that on both the device and the browser all of the data required to predict from the model is retrieved?
Thank you for your help.