I've read the tutorials: http://fabricjs.com/articles and the docs about Fabric Objects.

I was able to load JPG and PNG images, but in my project I need to load TIFF images onto the canvas and apply filters to them. I can render a TIFF image by drawing directly on the canvas context, but whenever `renderAll()` is called the context is cleared and my TIFF image disappears. I also cannot perform other operations like zoom, pan, brightness and contrast, since the image is not a Fabric object.

Can someone please help me understand how I can convert a TIFF image into a Fabric object, so that I can perform all the standard `fabric.Object` operations on it?

Here are the steps I followed:

  1. To load a mock TIFF image, I read it as an ArrayBuffer.

    public loadMockTiffImage() {
      // Create a new XMLHttpRequest to read the mock TIFF image as an ArrayBuffer
      const xhr = new XMLHttpRequest();

      // Configure it: GET request for the URL
      xhr.open('GET', 'assets/tif/sample.tif', true);
      xhr.responseType = 'arraybuffer';
      xhr.timeout = 10000; // timeout in ms, 10 seconds

      // After the response is received, load it
      xhr.onload = () => {
        // Analyze the HTTP status of the response
        if (xhr.status !== 200) {
          // Log an error in case the status is not 200
          console.log(`Error ${xhr.status}: ${xhr.statusText}`);
        } else {
          // Show the result
          console.log(`Done, got ${xhr.response.byteLength} bytes`);
          console.log(xhr.response);
          // Add the XHR response (an ArrayBuffer) to the canvas
          this.addTiffImageOnCanvas(xhr.response);
        }
      };

      // Show progress while the bytes are loading
      xhr.onprogress = event => {
        if (event.lengthComputable) {
          console.log(`Received ${event.loaded} of ${event.total} bytes`);
        } else {
          console.log(`Received ${event.loaded} bytes`); // no Content-Length
        }
      };

      // Log any network request errors
      xhr.onerror = () => {
        console.log('Request failed!');
      };

      // Send the request over the network
      xhr.send();
    }
    
  2. Next I use UTIF.js to decode the ArrayBuffer and convert it to an ImageBitmap, so that I can use canvas.drawImage() to render it on the canvas. How do I convert this ImageBitmap/ArrayBuffer to a Fabric.js object?

    private addTiffImageOnCanvas(buffer: ArrayBuffer) {
      // Use UTIF.js to decode the ArrayBuffer and convert it to an ImageBitmap
      const ifds = UTIF.decode(buffer);
      UTIF.decodeImage(buffer, ifds[0]);
      const timage = ifds[0];
      const array = new Uint8ClampedArray(UTIF.toRGBA8(timage));
      // Forming image data
      const imageData = new ImageData(array, timage.width, timage.height);
      let ibm: ImageBitmap = null;
      const bmPromise: Promise<ImageBitmap> = createImageBitmap(imageData);

      bmPromise.then(bitmap => {
        ibm = bitmap;
        fabric.Image.fromObject(ibm, image => {
          // TODO: How or what should I do now?
        });
      });
    }

Thank you for your help.

Zeeshan S.
  • Use [fabric.Image.fromURL](http://fabricjs.com/docs/fabric.Image.html#.fromURL). https://stackoverflow.com/a/36975511/3551786 – Durga Oct 03 '19 at 12:33
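
For completeness, here is a rough sketch of what that comment suggests; this is my own reading, not code from the post. Browsers cannot decode TIFF natively, so the decoded pixels still have to go through a temporary canvas, whose `toDataURL()` output can then be handed to `fabric.Image.fromURL`. The `addTiffViaDataUrl` helper name is hypothetical.

    // Sketch only (hypothetical helper): decode the TIFF with UTIF.js, paint the
    // pixels onto a temporary canvas, then hand a data URL to fabric.Image.fromURL,
    // as the comment above suggests.
    function addTiffViaDataUrl(buffer, scene) {
      const ifds = UTIF.decode(buffer);
      UTIF.decodeImage(buffer, ifds[0]);
      const timage = ifds[0];
      const rgba = new Uint8ClampedArray(UTIF.toRGBA8(timage));
      const imageData = new ImageData(rgba, timage.width, timage.height);

      // Temporary canvas used only to produce a data URL
      const tmp = document.createElement('canvas');
      tmp.width = timage.width;
      tmp.height = timage.height;
      tmp.getContext('2d').putImageData(imageData, 0, 0);

      // fromURL accepts any URL the browser can load, including a data URL
      fabric.Image.fromURL(tmp.toDataURL(), img => {
        img.set({ left: 50, top: 50 });
        scene.add(img);
      });
    }

Note that this goes through an extra PNG encode/decode round trip; passing the canvas element straight to fabric.Image, as the answer below does, avoids that.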

1 Answer

I didn't find any mention of something able to handle an ImageBitmap in Fabric's GitHub repo.

However, you can very well create a fabric.Image from an HTMLCanvasElement. So you would have to draw this ImageBitmap on a canvas, and since we're going to use a canvas anyway, it's better to do it one step earlier, when you get the ImageData:

var scene = new fabric.Canvas('fabric');
scene.setHeight(500);
scene.setWidth(500);

fetch( "https://upload.wikimedia.org/wikipedia/commons/d/d8/Example.tiff" )
  .then( (resp) => resp.arrayBuffer() )
  .then( makeFabricImageFromTiff )
  .then( (fabricImage) => scene.add( fabricImage ) );

function makeFabricImageFromTiff( buffer ) {
  // Using UTIF.js to decode the array buffer and convert it to ImageData
  const ifds = UTIF.decode( buffer );
  UTIF.decodeImage( buffer, ifds[ 0 ] );
  const timage = ifds[ 0 ];
  const array = new Uint8ClampedArray( UTIF.toRGBA8( timage ) );
  // Forming image Data
  const imageData = new ImageData( array, timage.width, timage.height );
  // a temporary canvas element
  const canvas = document.createElement( 'canvas' );
  canvas.width = timage.width;
  canvas.height = timage.height;
  // on which we draw the ImageData
  canvas.getContext( '2d' )
    .putImageData( imageData, 0, 0 );
  // before converting it to a Fabric.Image instance
  return new fabric.Image( canvas, { left: 50, top: 50 } );
}
canvas { border: 1px solid; }
<script src="https://cdn.jsdelivr.net/gh/photopea/UTIF.js/UTIF.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/fabric.js/2.4.4/fabric.min.js"></script>
<canvas id="fabric"></canvas>
Kaiido
  • Thanks a lot! So it works great on the 2d context where we have the putImageData, but when I use it with webgl, should I be using drawPixels? What's the alternative when using webgl? – Zeeshan S. Oct 11 '19 at 17:04
  • Why would you use webgl here? The only operation you'll do on this canvas is to put the ImageData on it. 2d context will be as fast as webgl or even faster (webgl contexts tend to be slower to initialise). – Kaiido Oct 11 '19 at 23:38
  • I have 8MB TIFF images to render on the canvas. In some situations I will have multiple (up to 9) TIFF images to render on the same canvas with different opacity, overlaid on top of each other. My request for webgl was to figure out if there are any performance issues between the 2d and webgl implementations. Thanks. – Zeeshan S. Oct 14 '19 at 16:10
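
To illustrate the multi-layer case discussed in these comments, here is a minimal sketch of my own (not from the answer) that reuses the makeFabricImageFromTiff helper above to stack several TIFFs with different opacities; the URLs are hypothetical placeholders.

    // Sketch only: fetch several TIFFs, decode each into a fabric.Image via
    // makeFabricImageFromTiff (defined in the answer above), and overlay them
    // with different opacities on the same Fabric canvas.
    const tiffUrls = [
      'assets/tif/layer1.tif', // hypothetical URLs, for illustration only
      'assets/tif/layer2.tif',
      'assets/tif/layer3.tif'
    ];

    Promise.all(
      tiffUrls.map(url =>
        fetch(url)
          .then(resp => resp.arrayBuffer())
          .then(makeFabricImageFromTiff)
      )
    ).then(images => {
      images.forEach((img, i) => {
        // Stack the layers at the same position with decreasing opacity
        img.set({ left: 0, top: 0, opacity: 1 / (i + 1) });
        scene.add(img);
      });
      scene.requestRenderAll();
    });

Each TIFF is decoded and drawn onto a temporary 2d canvas exactly once; after that Fabric only re-composites the resulting images on zoom, pan or opacity changes, which is consistent with the point above that WebGL isn't needed for the putImageData step.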