
I have converted a Keras model to the TensorFlow.js JSON format and saved it locally on my computer. I am trying to load that JSON model in JavaScript code using the command below

model = await tf.loadModel('web_model')

But the model is not getting loaded. Is there a way to load a TensorFlow.js JSON model from the local file system?

user2693313
  • I guess you are not serving from a dev server? Using your browser to just open the HTML file will result in problems in the XHR requests used to fetch the file. Maybe try out https://www.npmjs.com/package/http-server – Max Dec 05 '18 at 20:05
  • I have just started exploring TensorFlow.js and am using my browser to test things – user2693313 Dec 05 '18 at 20:06

11 Answers


I know you're trying to load your model in a browser but if anybody lands here that's trying to do it in Node, here's how:

const tf = require("@tensorflow/tfjs");
const tfn = require("@tensorflow/tfjs-node");
const handler = tfn.io.fileSystem("./path/to/your/model.json");
const model = await tf.loadLayersModel(handler);
jafaircl
  • This is fantastic. Thank you. – user11145590 Feb 09 '22 at 21:27
  • `const tf = require("@tensorflow/tfjs-node"); const handler = tf.io.fileSystem("./path/to/your/model.json"); const model = await tf.loadLayersModel(handler);` Import of tfjs is not required if you import tfjs-node – tadvas May 10 '23 at 20:42
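Building on the comment above, a minimal self-contained Node sketch using only @tensorflow/tfjs-node (the model path is illustrative):

const tf = require("@tensorflow/tfjs-node");

async function main() {
  // point this at your converted model.json; weight shards are resolved relative to it
  const handler = tf.io.fileSystem("./path/to/your/model.json");
  const model = await tf.loadLayersModel(handler);
  model.summary(); // quick sanity check that the model loaded
}

main();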

loadModel uses fetch under the hood, and fetch cannot access local files directly; it is meant to get files served by a server. More on this here. To load a local file with the browser, there are two approaches: asking the user to upload the file with

<input type="file"/>

or serving the file from a server.

In these two scenarios, tf.js provides a way to load the model.

  1. Load the model by asking the user to upload the file

html

<input type="file" id="upload-json"/>
<input type="file" id="upload-weights"/>

js

const uploadJSONInput = document.getElementById('upload-json');
const uploadWeightsInput = document.getElementById('upload-weights');
const model = await tf.loadModel(tf.io.browserFiles(
  [uploadJSONInput.files[0], uploadWeightsInput.files[0]]));
  2. Serving the local files using a server

To do so, one can use the npm module http-server to serve the directory containing both the weights and the model. It can be installed with the following command:

 npm install http-server -g

Inside the directory, one can run the following command to launch the server:

http-server -c1 --cors .

Now the model can be loaded:

 // load model in js script
 (async () => {
   ...
   const model = await tf.loadFrozenModel('http://localhost:8080/model.pb', 'http://localhost:8080/weights.json')
 })()
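Note that loadFrozenModel is for converted TensorFlow graph models; for a Keras model converted to a layers model, as in the question, the served model.json can be loaded directly. A sketch, assuming the http-server above is running on port 8080:

 // Keras-converted layers model: load from the URL of model.json
 (async () => {
   const model = await tf.loadModel('http://localhost:8080/model.json');
   model.summary(); // verify the model loaded
 })()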
edkeveked
  • 17,989
  • 10
  • 55
  • 93
  • Thanks a lot for the help. But I am getting the below error while trying to load the files in model Uncaught (in promise) TypeError: Failed to execute 'readAsText' on 'FileReader': parameter 1 is not of type 'Blob'. – user2693313 Dec 11 '18 at 07:48
  • Maybe, you have to read the file after it has been loaded using an event listener – edkeveked Dec 14 '18 at 20:44
  • using the http server worked for me – Maheen Saleh Oct 31 '21 at 09:11
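The FileReader error in the first comment above usually means browserFiles was called before any files were selected. A minimal sketch that defers loading until a button is clicked (element ids are illustrative):

html

<input type="file" id="upload-json"/>
<input type="file" id="upload-weights"/>
<button id="load-button">Load model</button>

js

document.getElementById('load-button').addEventListener('click', async () => {
  const jsonFile = document.getElementById('upload-json').files[0];
  const weightsFile = document.getElementById('upload-weights').files[0];
  // both files must be selected before browserFiles reads them
  const model = await tf.loadModel(tf.io.browserFiles([jsonFile, weightsFile]));
  console.log('model loaded');
});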
const tf = require('@tensorflow/tfjs');
const tfnode = require('@tensorflow/tfjs-node');

async function loadModel(){
    // IO handler that reads model.json (and its weight shards) from disk
    const handler = tfnode.io.fileSystem('tfjs_model/model.json');
    const model = await tf.loadLayersModel(handler);
    console.log("Model loaded");
    return model;
}

loadModel();

This worked for me in Node. Thanks to jafaircl.

Manash Mandal

If you're using React with create-react-app, you can keep your saved model files in your public folder.

For example, say you want to use the blazeface model. You would

  1. Download the .tar.gz model from that web page.

  2. Unpack the model into your app's public directory. So now you have the files from the .tar.gz file in a public subdir:

    %YOUR_APP%/public/blazeface_1_default_1/model.json
    %YOUR_APP%/public/blazeface_1_default_1/group1-shard1of1.bin
    
  3. Load the model in your React app using

    const model = await tf.loadGraphModel(process.env.PUBLIC_URL + '/blazeface_1_default_1/model.json');
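A minimal component sketch under those assumptions (component and state names are illustrative):

import React, { useEffect, useState } from 'react';
import * as tf from '@tensorflow/tfjs';

function ModelLoader() {
  const [model, setModel] = useState(null);

  useEffect(() => {
    // load the graph model from the public folder once, on mount
    tf.loadGraphModel(process.env.PUBLIC_URL + '/blazeface_1_default_1/model.json')
      .then(setModel);
  }, []);

  return <div>{model ? 'Model ready' : 'Loading model...'}</div>;
}

export default ModelLoader;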
    

You could try:

const model = await tf.models.modelFromJSON(myModelJSON)

Here it is in the tensorflow.org docs
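A sketch of one way to use it, fetching the model JSON yourself first (the URL is illustrative; this rebuilds the topology from the parsed JSON object rather than from a URL):

// fetch the model JSON, then reconstruct the model from the parsed object
const response = await fetch('/model/model.json'); // illustrative URL
const myModelJSON = await response.json();
const model = await tf.models.modelFromJSON(myModelJSON);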


I found a solution that works. You can replace the URL with a localhost URL on XAMPP, for example (directory = model) http://localhost/model/model.json, and after that you have to disable your browser's CORS policy. In my case I found a Chrome extension that removed CORS for my specific tab, and it worked.

Thank me later!!
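With the model served at that localhost URL, loading it is just (a sketch; run inside an async function):

// model directory served by XAMPP at http://localhost/model/
const model = await tf.loadLayersModel('http://localhost/model/model.json');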

EvilTwin

Check out our documentation for loading models: https://js.tensorflow.org/api/latest/#Models-Loading

You can use tf.loadModel, which takes a string URL to your model definition; the model files need to be served over HTTP. This means you need to start an HTTP server to serve those files (the browser will not let you request your file system directly because of CORS).

This package can do that for you: npmjs.com/package/http-server
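For example, a sketch assuming http-server is run against the model directory on its default port 8080:

// $ http-server ./web_model --cors
const model = await tf.loadModel('http://localhost:8080/model.json');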

Nikhil

You could use an insecure Chrome instance:

C:\Program Files (x86)\Google\Chrome\Application>chrome.exe --disable-web-security --disable-gpu --user-data-dir=C:/Temp

Then you could add this script to redefine the fetch function:

// XMLHttpRequest-based replacement for fetch; with --disable-web-security,
// XHR can read local files that fetch refuses to touch
async function fetch(url) {
  return new Promise(function(resolve, reject) {
    var xhr = new XMLHttpRequest();
    xhr.onload = function() {
      resolve(new Response(xhr.responseText, {status: 200}));
    };
    xhr.onerror = function() {
      reject(new TypeError('Local request failed'));
    };
    xhr.open('GET', url);
    xhr.send(null);
  });
}

After that, be sure that you use the right model loader (see my comment about the loader issue).

BUT your weights will be incorrect: as I understand it, there are some encoding problems.

Mahalov Ivan

If you are trying to load it server side, use @tensorflow/tfjs-node instead of @tensorflow/tfjs and update to version 0.2.1 or higher to resolve this issue.
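A minimal sketch of that setup (the model path is illustrative; API names follow current tfjs-node releases):

// $ npm install @tensorflow/tfjs-node
const tf = require('@tensorflow/tfjs-node');

(async () => {
  // point this at your converted model.json
  const model = await tf.loadLayersModel(tf.io.fileSystem('./web_model/model.json'));
  model.summary();
})();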

Nima

I am using React.js to load a model (for image classification and other machine learning tasks).

In the browser, TensorFlow.js does not provide an API to read a previously trained model straight from the file system, but you can wrap the model JSON and weights as browser files:

    // wrap the model JSON and binary weights as File objects so browserFiles can read them
    // (modelJSON is the parsed model.json content, modelWeights the weights binary)
    const file = new File([JSON.stringify(modelJSON)], 'model.json', {type: 'application/json'});
    const files = new File([modelWeights], 'model.weights.bin', {type: 'application/octet-stream'});
    console.log(files);
    const model = await tf.loadLayersModel(tf.io.browserFiles([file, files]));


You can create an API in Express.js to serve your model (model.json and weights.bin) if you use a web app (for a TensorFlow Lite model you could use OpenCV's cv.dnn.readNetFromTensorflow(model.pb, weights.pbtxt)).

References: How to load tensorflow-js weights from express using tf.loadLayersModel()?

    // load the classifier from the Express endpoint that serves the model
    const classifierModel = await tf.loadLayersModel(
        "https://rp5u7.sse.codesandbox.io/api/pokeml/classify"
    );
    const im = new Image();
    im.src = imagenSample; // e.g. '../../../../../Models/ShapesClassification/Samples/images (2).png'
    const abc = this.preprocessImage(im);
    const preds = await classifierModel.predict(abc); // .argMax(-1);
    console.log('<Response>', preds, 'Principal', preds.shape[0], 'DATA', preds.dataSync());
    const responde = [...preds.dataSync()];
    console.log('Maximum value', Math.max(...responde));
    let indiceMax = this.indexOfMax(responde);
    console.log(indiceMax);
    console.log('<<<LABEL>>>', this.labelsReturn(indiceMax));
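For the Express.js serving approach referenced above, a minimal sketch (port and directory layout are assumptions):

// serve a tfjs model directory over HTTP with Express
const express = require('express');
const app = express();

// assumed layout: ./model/model.json plus its weight shard files
app.use('/model', express.static('model'));

app.listen(3000, () => console.log('Model at http://localhost:3000/model/model.json'));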

If you are using Django, you should:

  1. create a static directory in your app and put your model there.

  2. load that static directory in the template where you want to use your model (remember to put {% load static %} at the top of the template):

    var modelPath = "{% static 'sampleModel.json' %}";
    

Don't forget to also load tensorflow.js library:

<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
  3. Now you can load your model:

    <script>
      (async () => {
        model = await tf.loadGraphModel(modelPath);
      })();
    </script>
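Putting the pieces together, a minimal template sketch (the file name is illustrative):

{% load static %}
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
<script>
  // resolved by Django's static files machinery at render time
  const modelPath = "{% static 'sampleModel.json' %}";
  (async () => {
    const model = await tf.loadGraphModel(modelPath);
    console.log('model loaded');
  })();
</script>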
    
Sayyor Y