
I'm currently working on a project where I have to deal with really big images (>> 100 MB). The images come as raw byte arrays (for now only grayscale images; later, a color image would have a byte array per channel).

I want to show the image in a JavaFX ImageView, so I have to convert the given "raw" image data to a JavaFX Image in as little time as possible.

I've tried a lot of solutions that I found here on Stack Overflow and from other sources.

SwingFXUtils

The most popular (and easiest) solution is to construct a BufferedImage from the raw data and convert it to a JavaFX Image using SwingFXUtils.toFXImage(...) (sketched below). On an image of around 100 MB (8184*12000 pixels) I measured the following times:

  1. The BufferedImage is created in about 20-40 ms.
  2. The conversion to a JavaFX Image via SwingFXUtils.toFXImage(...) takes more than 700 ms, which is too much for my needs.
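Roughly what I measured looks like this (a minimal sketch; the class and method names are just for illustration, and the raster setup in my real project may differ):

import java.awt.image.BufferedImage;

import javafx.embed.swing.SwingFXUtils;
import javafx.scene.image.Image;

public class SwingFXUtilsPath {

    static Image toFxImage(byte[] grayData, int width, int height) {
        // Fast: the grayscale bytes are copied straight into the backing raster
        BufferedImage bimg = new BufferedImage(width, height, BufferedImage.TYPE_BYTE_GRAY);
        bimg.getRaster().setDataElements(0, 0, width, height, grayData);
        // Slow: this conversion is what takes more than 700 ms for the 8184*12000 image
        return SwingFXUtils.toFXImage(bimg, null);
    }
}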

Encode image and read it as a ByteArrayInputStream

One approach I found here (https://stackoverflow.com/a/33605064/3237961) is to use OpenCV's functionality to encode the image to a format like *.bmp and construct a JavaFX Image directly from the byte array (sketched below).

This solution is considerably more complex (OpenCV or some other encoding library/algorithm is needed), and encoding adds extra computation steps.
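For reference, this is roughly what that answer does (a sketch assuming the OpenCV Java bindings are loaded; the helper name is mine):

import java.io.ByteArrayInputStream;

import javafx.scene.image.Image;

import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.MatOfByte;
import org.opencv.imgcodecs.Imgcodecs;

public class OpenCvEncodePath {

    static Image toFxImage(byte[] grayData, int width, int height) {
        // Wrap the raw 8-bit grayscale bytes in an OpenCV Mat
        Mat mat = new Mat(height, width, CvType.CV_8UC1);
        mat.put(0, 0, grayData);
        // Encode to an in-memory BMP, then let JavaFX decode it again
        MatOfByte encoded = new MatOfByte();
        Imgcodecs.imencode(".bmp", mat, encoded);
        return new Image(new ByteArrayInputStream(encoded.toArray()));
    }
}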

Question

So I'm looking for a more efficient way of doing this. Almost all solutions use SwingFXUtils in the end, or solve the problem by iterating over all pixels to convert them (which is the slowest possible approach). Is there a way to either implement a more efficient function than SwingFXUtils.toFXImage(...) or construct a JavaFX Image directly from the byte array? Maybe there is also a way to draw a BufferedImage directly in JavaFX, because IMHO the JavaFX Image doesn't bring any advantages and only makes things complicated.

Thanks for your replies.

Jakob
    Have a look at the [`PixelBuffer`](https://openjfx.io/javadoc/14/javafx.graphics/javafx/scene/image/PixelBuffer.html), which you can pass to a `WritableImage`. – James_D Sep 29 '20 at 12:49
  • Also see https://stackoverflow.com/questions/60668758/how-to-show-images-in-a-large-frequency-in-javafx/60672547#60672547 – James_D Sep 29 '20 at 12:59
  • PixelBuffer and PixelWriter seem interesting. But how can I convert a single-channel grayscale image (i.e. a byte array of length = width*height) when I can only choose the PixelFormat "ByteBgraPreInstance", which has three channels (as I understand it)? – Jakob Sep 29 '20 at 13:08
  • @Jakob: In grayscale images, all three channels have the same value, so single channel color x becomes the color (x,x,x). – Konrad Höffner Sep 29 '20 at 13:12
  • @KonradHöffner Yeah, but is this done by the pixel writer function automatically? I don't want to modify each pixel of my image and add the extra channels. Additionally, I would have to take care of the alpha channel, which my image does not have. When I construct the PixelBuffer from my ByteBuffer (which is a single-channel 8-bit image), I get an error: java.lang.IllegalArgumentException: Insufficient memory allocated for ByteBuffer – Jakob Sep 29 '20 at 13:21
  • @James_D Unfortunately I don't know how to do this. Konrad suggested using a byte-indexed pixel format (I think he meant PixelFormat.createByteIndexedInstance(...)). But I'm not very familiar with pixel formats; I've only worked with one channel per color and not with packed pixel formats. – Jakob Sep 29 '20 at 14:02
  • Actually, I checked and you can't subclass `PixelFormat` (because the only constructor is private). Is it really prohibitive just to create a new array with all channels? But otherwise just call `PixelFormat.createByteIndexedInstance(...)` and pass in an array of ints representing the ARGB values for your grayscale colors. You only need to do that once and can reuse the resulting pixel format. – James_D Sep 29 '20 at 14:16

2 Answers


Following up on the suggestion by @James_D in the comments and your requirement of a grayscale image:

Use javafx.scene.image.PixelBuffer but specify the javafx.scene.image.PixelFormat as BYTE_INDEXED.

Set up the palette for color i to be RGB (i,i,i) or RGBA (i,i,i,1.0) using PixelFormat.createByteIndexedInstance() or PixelFormat.createByteIndexedPremultipliedInstance().

Then you should be able to just feed in your one-channel array of byte grayscale values.
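A minimal sketch of the palette setup (index i maps to the opaque gray (i, i, i); James_D's answer below contains a complete runnable example):

import java.nio.ByteBuffer;

import javafx.scene.image.PixelFormat;

class GrayPalette {

    static PixelFormat<ByteBuffer> create() {
        int[] palette = new int[256];
        for (int i = 0; i < palette.length; i++) {
            // Pack ARGB: fully opaque alpha, then the same value for red, green, and blue
            palette[i] = (0xff << 24) | (i << 16) | (i << 8) | i;
        }
        return PixelFormat.createByteIndexedInstance(palette);
    }
}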

Konrad Höffner
  • Thank you for your answer. Can you point out (or suggest any useful link on) how to create the palette for my color array? The color format packed into a single integer is completely new to me. As I understand it, I need to loop over all grayscale values (0 to 255) and fill the three components of the value at the current color array position with the corresponding grayscale value. How can I access the desired component (i.e. byte) in my integer? – Jakob Sep 29 '20 at 13:55
  • For PixelBuffer, the BYTE_INDEXED format is unsupported – Jakob Sep 29 '20 at 14:13
  • 1
    @Jakob If `BYTE_INDEXED` is unsupported by `PixelBuffer`, you can create a `WritableImage` and write to it using `getPixelWriter().setPixels(...)` instead. – James_D Sep 29 '20 at 14:25

Distilling the information from the comments, there appear to be two viable options for this, both using a WritableImage.

In one, you can use the PixelWriter to set the pixels in the image, using the original byte data and a BYTE_INDEXED PixelFormat. The following demos this approach. Generating the actual byte array data here takes ~2.5 seconds on my system; creating the (big) WritableImage takes about 0.15 seconds, drawing the data into the image about 0.12 seconds.

import java.nio.ByteBuffer;

import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.image.ImageView;
import javafx.scene.image.PixelFormat;
import javafx.scene.image.WritableImage;
import javafx.scene.layout.BorderPane;
import javafx.stage.Stage;


public class App extends Application {

    @Override
    public void start(Stage stage) {
       int width = 12000 ;
       int height = 8184 ;
       byte[] data = new byte[width*height];

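       // Build a 256-entry opaque grayscale palette: index i maps to ARGB (255, i, i, i)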
       int[] colors = new int[256];
       for (int i = 0 ; i < colors.length ; i++) {
           colors[i] = (255<<24) | (i << 16) | (i << 8) | i ;
       }
       PixelFormat<ByteBuffer> format = PixelFormat.createByteIndexedInstance(colors);

       
       long start = System.nanoTime();
       
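       // Generate a synthetic grayscale test pattern (concentric rings) as stand-in data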
       for (int y = 0 ; y < height ; y++) {
           for (int x = 0 ; x < width; x++) {
               long dist2 = (1L * x - width/2) * (x- width/2) + (y - height/2) * (y-height/2);
               double dist = Math.sqrt(dist2);
               double val = (1 + Math.cos(Math.PI * dist / 1000)) / 2;
               data[x + y * width] = (byte)(val * 255);
           }
       }
       
       long imageDataCreated = System.nanoTime();
       
       
       WritableImage img = new WritableImage(width, height);
       
       long imageCreated = System.nanoTime();
       
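       // Bulk-copy the indexed bytes into the image; the scanline stride equals the width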
       img.getPixelWriter().setPixels(0, 0, width, height, format, data, 0, width);
       
       long imageDrawn = System.nanoTime() ;
       
       ImageView imageView = new ImageView();
       imageView.setPreserveRatio(true);
       imageView.setImage(img);
       
       long imageViewCreated = System.nanoTime();
       
       BorderPane root = new BorderPane(imageView);
       imageView.fitWidthProperty().bind(root.widthProperty());
       imageView.fitHeightProperty().bind(root.heightProperty());
       Scene scene = new Scene(root, 800, 800);
       stage.setScene(scene);
       stage.show();
       
       long stageShowCalled = System.nanoTime();
       
       double nanosPerMilli = 1_000_000.0 ;
       
       System.out.printf(
               "Data creation time: %.3f%n"
            + "Image Creation Time: %.3f%n"
            + "Image Drawing Time: %.3f%n"
            + "ImageView Creation Time: %.3f%n"
            + "Stage Show Time: %.3f%n", 
            (imageDataCreated-start)/nanosPerMilli,
            (imageCreated-imageDataCreated)/nanosPerMilli,
            (imageDrawn-imageCreated)/nanosPerMilli,
            (imageViewCreated-imageDrawn)/nanosPerMilli,
            (stageShowCalled-imageViewCreated)/nanosPerMilli);
    }

    public static void main(String[] args) {
        launch();
    }

}

The (crude) profiling on my system gives

Data creation time: 2414.017
Image Creation Time: 157.013
Image Drawing Time: 122.539
ImageView Creation Time: 15.626
Stage Show Time: 132.433

The other approach is to use a PixelBuffer. It appears PixelBuffer does not support indexed colors, so here there is no option but to convert the byte array data to array data representing ARGB values. Here I use a ByteBuffer where the RGB values are repeated as bytes, and the alpha is always set to 0xff:

import java.nio.ByteBuffer;

import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.image.ImageView;
import javafx.scene.image.PixelBuffer;
import javafx.scene.image.PixelFormat;
import javafx.scene.image.WritableImage;
import javafx.scene.layout.BorderPane;
import javafx.stage.Stage;


public class App extends Application {

    @Override
    public void start(Stage stage) {
       int width = 12000 ;
       int height = 8184 ;
       byte[] data = new byte[width*height];

       
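       // Four bytes per pixel in this format: blue, green, red, alpha (premultiplied)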
       PixelFormat<ByteBuffer> format = PixelFormat.getByteBgraPreInstance();

       
       long start = System.nanoTime();
       
       for (int y = 0 ; y < height ; y++) {
           for (int x = 0 ; x < width; x++) {
               long dist2 = (1L * x - width/2) * (x- width/2) + (y - height/2) * (y-height/2);
               double dist = Math.sqrt(dist2);
               double val = (1 + Math.cos(Math.PI * dist / 1000)) / 2;
               data[x + y * width] = (byte)(val * 255);
           }
       }
       
       long imageDataCreated = System.nanoTime();
       
       byte alpha = (byte) 0xff ;
       
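       // Expand each grayscale byte into B, G, R (all the same value) plus an opaque alpha byte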
       byte[] convertedData = new byte[4*data.length];
       for (int i = 0 ; i < data.length ; i++) {
           convertedData[4*i] = convertedData[4*i+1] = convertedData[4*i+2] = data[i] ;
           convertedData[4*i+3] = alpha ;
       }
       long imageDataConverted = System.nanoTime() ;
       ByteBuffer buffer = ByteBuffer.wrap(convertedData);
       
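       // The PixelBuffer-backed WritableImage shares this buffer instead of copying it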
       WritableImage img = new WritableImage(new PixelBuffer<ByteBuffer>(width, height, buffer, format));
       
       long imageCreated = System.nanoTime();
       
       
       ImageView imageView = new ImageView();
       imageView.setPreserveRatio(true);
       imageView.setImage(img);
       
       long imageViewCreated = System.nanoTime();
       
       BorderPane root = new BorderPane(imageView);
       imageView.fitWidthProperty().bind(root.widthProperty());
       imageView.fitHeightProperty().bind(root.heightProperty());
       Scene scene = new Scene(root, 800, 800);
       stage.setScene(scene);
       stage.show();
       
       long stageShowCalled = System.nanoTime();
       
       double nanosPerMilli = 1_000_000.0 ;
       
       System.out.printf(
               "Data creation time: %.3f%n"
            + "Data Conversion Time: %.3f%n"
            + "Image Creation Time: %.3f%n"
            + "ImageView Creation Time: %.3f%n"
            + "Stage Show Time: %.3f%n", 
            (imageDataCreated-start)/nanosPerMilli,
            (imageDataConverted-imageDataCreated)/nanosPerMilli,
            (imageCreated-imageDataConverted)/nanosPerMilli,
            (imageViewCreated-imageCreated)/nanosPerMilli,
            (stageShowCalled-imageViewCreated)/nanosPerMilli);
    }

    public static void main(String[] args) {
        launch();
    }

}

The timings for this are pretty similar:

Data creation time: 2870.022
Data Conversion Time: 273.861
Image Creation Time: 4.381
ImageView Creation Time: 15.043
Stage Show Time: 130.475

Obviously this approach, as written, consumes more memory. There may be a way to create a custom implementation of ByteBuffer that simply looks into the underlying byte array and generates the correct values without the redundant data storage. Depending on your exact use case, this may be more efficient (if you can reuse the converted data, for example).


James_D
  • Thank you! Yesterday evening I figured out how to implement your first approach, but you answered faster than me (and your answer is also a lot more detailed than mine would have been, so thank you!). I'll accept your answer and will maybe come back when struggling with color images :D But it is a little bit frustrating that using a JavaFX Image still consumes more time than using the BufferedImage directly. – Jakob Sep 30 '20 at 07:21
  • 1
    @Jakob I am wondering though if such a simplistic brute force approach is the right way to go. People who deal with really large images normally use other techniques to deal with the enormous amount of data. Normally you would build an image pyramid and use image tiling for this. Something that should make you think is whether you have a screen that could display 8184*12000 pixels at the same time. – mipa Sep 30 '20 at 09:06