
I have images stored in a database as ImageIcons that I would like to serve to our web page; however, for large images I am getting out-of-memory exceptions.

Here is how I currently do it:

[Edit] I expanded my ImageUtilities to provide a non-transparent BufferedImage, which simplifies the code:

BufferedImage rgbbi = ImageUtilities.toBufferedImage(icon.getImage());

ServletOutputStream out = null;
try {
    // Get the servlet's output stream.
    out = responseSupplier.get().getOutputStream();

    // Write the image to the response stream.
    ImageIO.write(rgbbi, "jpg", out);

} catch (IOException e1) {
    logger.severe("Exception writing image: " + e1.getMessage());
} finally {
    // Guard against an NPE if getOutputStream() itself threw.
    if (out != null) {
        try {
            out.close();
        } catch (IOException e) {
            logger.info("Error closing output stream: " + e.getMessage());
        }
    }
}

The exceptions being thrown are the following:

Exception in thread "Image Fetcher 0" java.lang.OutOfMemoryError: Java heap space
    at java.awt.image.DataBufferInt.<init>(DataBufferInt.java:41)
    at java.awt.image.Raster.createPackedRaster(Raster.java:458)
    at java.awt.image.DirectColorModel.createCompatibleWritableRaster(DirectColorModel.java:1015)
    at sun.awt.image.ImageRepresentation.createBufferedImage(ImageRepresentation.java:230)
    at sun.awt.image.ImageRepresentation.setPixels(ImageRepresentation.java:484)
    at sun.awt.image.ImageDecoder.setPixels(ImageDecoder.java:120)
    at sun.awt.image.JPEGImageDecoder.sendPixels(JPEGImageDecoder.java:97)
    at sun.awt.image.JPEGImageDecoder.readImage(Native Method)
    at sun.awt.image.JPEGImageDecoder.produceImage(JPEGImageDecoder.java:119)
    at sun.awt.image.InputStreamImageSource.doFetch(InputStreamImageSource.java:246)
    at sun.awt.image.ImageFetcher.fetchloop(ImageFetcher.java:172)
    at sun.awt.image.ImageFetcher.run(ImageFetcher.java:136)
Exception in thread "Image Fetcher 0" java.lang.OutOfMemoryError: Java heap space
Exception in thread "Image Fetcher 0" java.lang.OutOfMemoryError: Java heap space
Exception in thread "Image Fetcher 0" java.lang.OutOfMemoryError: Java heap space
...

Is there a way I can rewrite this to stream the output of ImageIO.write and limit its buffer size somehow?

[Edit] I can't just increase the heap size either; the images I need to serve are in the range of 10000x7000 pixels, which as an int-packed raster (10000 px x 7000 px x 4 bytes per pixel) works out to 280 MB. I think that is an unreasonable heap size to allocate for image conversion in a servlet.
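[Edit] For reference, the footprint of the intermediate image can be sanity-checked with a quick calculation; the stack trace shows a `DataBufferInt`, which packs one pixel per 32-bit int:

```java
public class HeapEstimate {
    public static void main(String[] args) {
        long width = 10_000, height = 7_000;
        long bytesPerPixel = 4; // DataBufferInt: one pixel per 32-bit int
        long bytes = width * height * bytesPerPixel;
        System.out.println(bytes + " bytes = " + bytes / (1024 * 1024) + " MiB");
        // 280,000,000 bytes, i.e. roughly 267 MiB (280 MB decimal)
    }
}
```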

An example image (large).

Andrew
  • Sorry for asking the obvious, but have you tried to increase the heap size or to chop the images into smaller chunks? – kostja Sep 22 '11 at 15:23
  • @kostja An example of my images is 10368x6912 pixels large; compressed it's 5 MB, uncompressed it ranks in at, I think, just under 2 GB. Also, how can I chop an ImageIcon? – Andrew Sep 22 '11 at 15:31
  • So you will be sending 2 GB of data from the server? If you have that kind of bandwidth capacity, you can afford more RAM and Java heap. – Andrew Lazarus Sep 22 '11 at 15:57
  • @Andrew I did bad math :P; 280 MB is the correct size. That size refers to the BufferedImage, which is needed to convert to the JPEG that is served after the conversion, which is a manageable size. – Andrew Sep 22 '11 at 16:01
  • Do you really need to process those images in a servlet? Because it's a terrible idea. – Piotr Praszmo Sep 22 '11 at 16:08
  • @Banthar Care to elaborate on "terrible idea"? – Andrew Sep 22 '11 at 16:12
  • @Andrew: Even if you manage to fit it in memory, it will be really slow. A single user would be able to clog up your server. – Piotr Praszmo Sep 22 '11 at 16:24

4 Answers


I am assuming you do not have enough pixels on your screen to display a complete image. Since you seem to need an uncompressed version of it in RAM for display, you will need exactly as much heap as the image size implies. That said, there are better ways.

I wrote my bachelor thesis on efficiently displaying multiple large images of up to 40000x40000 px simultaneously. We ended up implementing an LOD (level of detail) scheme with a multilevel cache: the image was resized, and each size was chopped into square chunks, producing an image pyramid. We had to experiment a bit to find an optimal chunk size. It varies from system to system but may safely be assumed to lie somewhere between 64x64 and 256x256 px.

The next step was to implement a scheduling algorithm for uploading the right chunks in order to keep a 1:1 texel-to-pixel ratio. To achieve better quality, we used trilinear interpolation between the slices of the pyramid.

The "multilevel" part means that image chunks were uploaded to the VRAM of the graphics card, with RAM as the L1 cache and the hard disk as the L2 cache (provided the image is on the network), but this optimisation might be excessive in your case.

All in all, this is a lot to consider when you were only asking about memory control. If this is a major project, though, implementing an LOD is the right tool for the job.
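A rough sketch of how the pyramid levels and chunk grid might be laid out (illustrative only; a 256 px chunk size and simple halving between levels are assumed):

```java
import java.util.ArrayList;
import java.util.List;

public class PyramidLayout {
    record Level(int width, int height, int chunksX, int chunksY) {}

    // Halve the image until one level fits in a single chunk,
    // recording the chunk grid at every level on the way down.
    static List<Level> buildLevels(int width, int height, int chunkSize) {
        List<Level> levels = new ArrayList<>();
        int w = width, h = height;
        while (true) {
            int cx = (w + chunkSize - 1) / chunkSize; // ceiling division
            int cy = (h + chunkSize - 1) / chunkSize;
            levels.add(new Level(w, h, cx, cy));
            if (cx == 1 && cy == 1) break;
            w = Math.max(1, w / 2);
            h = Math.max(1, h / 2);
        }
        return levels;
    }

    public static void main(String[] args) {
        // For the question's 10000x7000 image this yields 7 levels,
        // the finest being a 40x28 grid of 256 px chunks.
        for (Level l : buildLevels(10_000, 7_000, 256)) {
            System.out.println(l);
        }
    }
}
```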

kostja
  • The large image is being converted to a JPEG in the `ImageIO.write` call, which produces the final output at a reasonable size. It's the raw image created in between that is the problem. – Andrew Sep 22 '11 at 16:04

As pointed out in the comments, storing 10000x7000 images in a database as ImageIcons and serving them through a servlet smells like bad design. Nevertheless, I'll point out the PNGJ library (disclaimer: I coded it), which lets you read/write PNG images sequentially, line by line. Of course, this is only useful if you store your big images in that format.
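If you must stay with the standard API, a partial workaround is `ImageReadParam.setSourceRegion`, which keeps the decoded `BufferedImage` down to the requested strip (the bundled readers may still scan the full file sequentially, but they don't buffer the whole frame as pixels). A minimal sketch, using a small in-memory PNG to stand in for the big image:

```java
import java.awt.Rectangle;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;

public class StripRead {
    public static void main(String[] args) throws Exception {
        // Build a small in-memory PNG to decode from.
        BufferedImage src = new BufferedImage(100, 80, BufferedImage.TYPE_INT_RGB);
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ImageIO.write(src, "png", bos);

        try (ImageInputStream iis =
                ImageIO.createImageInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
            ImageReader reader = ImageIO.getImageReaders(iis).next();
            reader.setInput(iis);

            // Ask the reader for a single 100x16 strip instead of the whole frame.
            ImageReadParam param = reader.getDefaultReadParam();
            param.setSourceRegion(new Rectangle(0, 16, 100, 16));
            BufferedImage strip = reader.read(0, param);

            System.out.println(strip.getWidth() + "x" + strip.getHeight()); // 100x16
        }
    }
}
```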

leonbloy
  • Thanks for the relevant response. Although not the solution I went with, it answers the question as well as I think it can be answered. There is no built-in way of doing this, so a 3rd-party library, or doing it yourself, seems to be the best way. – Andrew Sep 26 '11 at 14:04
  • @leonbloy: Do you know a similar library for writing JPEG files? Because of the typical JPEG block size of 16x16, it should be possible to do the same writing 16 lines at once. Anyway, PNGJ seems to be a nice library. I coded something similar for writing PNG files for my project back in 2009. – Robert Apr 04 '14 at 07:38

You're not going to be able to do this with the built-in classes you're using, since they're designed to work on bitmaps wholesale. You might be better off shelling out of Java to something like ImageMagick (or whatever it is these days).

Do you just need to do this once?

You might be stuck writing all this yourself: loading the file, processing the "pixels", and writing them out. That would be the BEST way to do it, rather than loading the entire thing, converting (i.e. copying) it, and writing it out. I don't know whether tools like ImageMagick work on streams or on in-memory images.

Addendum for AlexR:

To do this PROPERLY, he needs to decode the file into some streamable format. For example, JPEG divides images into 8x8 blocks, compresses them individually, then streams those blocks out. While streaming, the blocks themselves are further compressed (so ten identical black blocks can come out as one black block with a repeat count).

A raw bitmap is little more than blocks of bytes; for high-color spaces with alpha, it's 4 bytes per pixel (one each for red, green, blue, and alpha). Most colorspace conversions happen at the pixel level. Other, more sophisticated filters work on a pixel and its neighbours (Gaussian blur is a simple example).

For simplicity, especially with lots of different formats, it's easier to load the whole image into memory, work on its raw bitmap, copy that bitmap while converting it, then write the raw image back out in whatever format (say, converting a color JPEG to a grayscale PNG).

For large images, like the ones this person is dealing with, that happens to be VERY expensive in memory.

So, OPTIMALLY, he'd write specific code to read the file in portions (stream it in), convert each little bit, and stream it back out again. That would take very little memory, but he'd likely have to do most of the work himself.

So, yes, he can "just read the image byte-by-byte", but the processing and algorithms will likely be rather involved.

Will Hartung
  • This is not what he needs, I think. Pay attention to `ImageUtilities.toBufferedTransparentImage`: this is the code that causes the image to be transparent, so I think he cannot just read the image byte-by-byte from the file and write it to the output stream. – AlexR Sep 22 '11 at 16:08

More memory seems to be the only answer for conversion, short of writing my own converter.

My solution was then to not convert the images at all, and instead use the method described in this answer to retrieve the image MIME type so that the header can be set.
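A sketch of that MIME sniffing, using the standard `URLConnection.guessContentTypeFromStream` (the helper name is illustrative; in the servlet, the bytes would then be copied straight from the database blob to the response after `response.setContentType(...)`, with no decoding at all):

```java
import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.net.URLConnection;

public class MimeSniff {
    // Sniff the content type from the stored bytes so the Content-Type
    // header can be set without ever decoding the image.
    static String sniff(byte[] imageBytes) throws Exception {
        // guessContentTypeFromStream requires a mark-supported stream.
        InputStream in = new BufferedInputStream(new ByteArrayInputStream(imageBytes));
        return URLConnection.guessContentTypeFromStream(in);
    }

    public static void main(String[] args) throws Exception {
        byte[] pngMagic = {(byte) 0x89, 'P', 'N', 'G', '\r', '\n', 0x1A, '\n'};
        System.out.println(sniff(pngMagic)); // image/png
    }
}
```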

Andrew