7

Is there an image compression algorithm that is faster than JPEG yet still well supported? I know about JPEG 2000, but from what I've heard it isn't really much faster.

Edit: for compressing.

Edit 2: It should run on 32-bit Linux, and ideally it should be in C or C++.

Richard Knop
  • for decompressing or compressing? – tenfour Dec 29 '10 at 16:57
  • Just curious, why do the images need to be compressed? And by how much? – Mark Ransom Dec 29 '10 at 21:24
  • @Mark Ransom: Well, I need them compressed to send them from a small humanoid robot with a 500 MHz CPU and 256 MB RAM over UDP to a PC for processing. I need to get at least 20 images per second, and the Wi-Fi stick is not fast enough to send that much raw data per second, so I am using JPEG to reduce the bandwidth. – Richard Knop Dec 29 '10 at 21:32
  • A video codec would be more appropriate than managing individual full frames. – Steve-o May 09 '13 at 20:43

6 Answers

4

Jpeg encoding and decoding should be extremely fast. You'll have a hard time finding a faster algorithm. If it's slow, your problem is probably not the format but a bad implementation of the encoder. Try the encoder from libavcodec in the ffmpeg project.
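
For reference, a minimal sketch of what that looks like with the current FFmpeg API (`avcodec_send_frame`/`avcodec_receive_packet`; the API available at the time of this thread was different), with error handling omitted and the frame assumed to already be YUV 4:2:0:

```c
/* Sketch: encode one 640x480 YUV420P frame to JPEG with libavcodec's MJPEG encoder.
 * Build with something like: gcc mjpeg.c -lavcodec -lavutil
 * Assumes a reasonably recent FFmpeg; error checks are omitted for brevity. */
#include <libavcodec/avcodec.h>
#include <libavutil/imgutils.h>

int main(void)
{
    const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_MJPEG);
    AVCodecContext *ctx = avcodec_alloc_context3(codec);

    ctx->width     = 640;
    ctx->height    = 480;
    ctx->pix_fmt   = AV_PIX_FMT_YUVJ420P;   /* full-range YUV 4:2:0, as JPEG uses */
    ctx->time_base = (AVRational){1, 20};   /* 20 fps, matching the target rate */
    avcodec_open2(ctx, codec, NULL);

    AVFrame *frame = av_frame_alloc();
    frame->format = ctx->pix_fmt;
    frame->width  = ctx->width;
    frame->height = ctx->height;
    av_frame_get_buffer(frame, 0);          /* fill frame->data[] with camera pixels here */

    AVPacket *pkt = av_packet_alloc();
    avcodec_send_frame(ctx, frame);
    if (avcodec_receive_packet(ctx, pkt) == 0) {  /* intra-only codec, packet comes out immediately */
        /* pkt->data / pkt->size now hold a complete JPEG image, ready for the UDP send */
        av_packet_unref(pkt);
    }

    av_packet_free(&pkt);
    av_frame_free(&frame);
    avcodec_free_context(&ctx);
    return 0;
}
```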

R.. GitHub STOP HELPING ICE
  • JPEG is designed for fast decoding. That does not always mean it has fast encoding as well (in fact, encoding is often much slower). – Zac Howland Dec 29 '10 at 19:14
  • Both are extremely fast if you're not striving for the optimal encoding. A low-end x86 from within the last few years should be able to encode jpeg at a rate of 30 megapixels per second or better (rough estimate off the top of my head). – R.. GitHub STOP HELPING ICE Dec 29 '10 at 19:28
  • An encoder meant for video encoding is bound to be optimized for speed. I know that MJPEG has been plenty fast for years, although I always thought it took specialized hardware to achieve that. – Mark Ransom Dec 29 '10 at 19:37
  • Well, I am using OpenCV to encode raw images to JPEG on a robot with a 500 MHz CPU and 256 MB RAM. It is currently taking 0.25 s to encode one 640×480 RGB image, which is not acceptable. I need 20+ images per second. – Richard Knop Dec 29 '10 at 21:19
  • I think OpenCV is your problem. Even my old K6 at 450 MHz could encode 640x480 JPEG at 25-30 fps. Of course, I did have YUV source rather than RGB. If there's any way you could arrange for the source images to be YUV, that would help a lot. If not, make sure you're using a fast conversion routine. `libswscale` from `ffmpeg` is the fastest I know of (see the conversion sketch after these comments). – R.. GitHub STOP HELPING ICE Dec 29 '10 at 21:26
  • The problem is I don't really have the possibility to mess with the embedded Linux on the robot. I don't think I will be allowed to install ffmpeg there. Do you think that if I get the camera to return YUV422 images (it is supposed to, according to the manual), OpenCV will encode them to JPEG faster? – Richard Knop Dec 29 '10 at 21:30
  • In principle it should be considerably faster. The only problem would be if OpenCV is really stupid and converts them to RGB and back to YUV for no reason. – R.. GitHub STOP HELPING ICE Dec 29 '10 at 22:14
  • Well, I'm hoping to get to 0.05 s for encoding a 640×480 image (YUV422), which would mean 20 fps. I hope that's realistic on a 500 MHz CPU. – Richard Knop Dec 29 '10 at 22:17
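
Following up on the `libswscale` suggestion in the comments above, a rough sketch of a packed-RGB24 to planar YUV420P conversion with `sws_scale` (function names are from current FFmpeg headers, not from the original thread; buffer allocation is left to the caller):

```c
/* Sketch: convert one 640x480 RGB24 frame to YUV420P with libswscale.
 * Link with -lswscale -lavutil. Error handling omitted. */
#include <libswscale/swscale.h>
#include <libavutil/imgutils.h>

void rgb_to_yuv420(const uint8_t *rgb, uint8_t *yuv_data[4], int yuv_linesize[4])
{
    const int w = 640, h = 480;

    struct SwsContext *sws = sws_getContext(w, h, AV_PIX_FMT_RGB24,
                                            w, h, AV_PIX_FMT_YUV420P,
                                            SWS_FAST_BILINEAR, NULL, NULL, NULL);

    const uint8_t *src[1] = { rgb };
    int src_stride[1]     = { 3 * w };      /* packed RGB: 3 bytes per pixel */

    /* yuv_data / yuv_linesize would come from av_image_alloc(yuv_data,
     * yuv_linesize, w, h, AV_PIX_FMT_YUV420P, 16) in the caller. */
    sws_scale(sws, src, src_stride, 0, h, yuv_data, yuv_linesize);

    sws_freeContext(sws);
}
```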
3

Do you have MMX/SSE2 instructions available on your target architecture? If so, you might try libjpeg-turbo. Alternatively, can you compress the images with something like zlib and then offload the actual reduction to another machine? Is it imperative that actual lossy compression of the images take place on the embedded device itself?
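
For illustration, a minimal sketch using libjpeg-turbo's TurboJPEG convenience API (`tjCompress2` with the fast-DCT flag); buffer handling is simplified and error checks are omitted:

```c
/* Sketch: JPEG-compress one 640x480 RGB frame with the TurboJPEG API
 * from libjpeg-turbo. Link with -lturbojpeg. Error handling omitted. */
#include <turbojpeg.h>
#include <stdlib.h>

int main(void)
{
    const int width = 640, height = 480, quality = 75;
    unsigned char *rgb = malloc(width * height * 3);   /* camera frame would go here */

    tjhandle tj = tjInitCompress();
    unsigned char *jpeg = NULL;     /* TurboJPEG allocates the output buffer */
    unsigned long jpeg_size = 0;

    tjCompress2(tj, rgb, width, 0 /* pitch = width*3 */, height, TJPF_RGB,
                &jpeg, &jpeg_size, TJSAMP_420, quality, TJFLAG_FASTDCT);

    /* jpeg / jpeg_size now hold the compressed image, e.g. for a UDP send */

    tjFree(jpeg);
    tjDestroy(tj);
    free(rgb);
    return 0;
}
```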

Wyatt Anderson
2

In what context? On a PC or a portable device?

In my experience you've got JPEG, JPEG 2000, PNG, and... uh, that's about it for "well-supported" image formats in a broad context (lossy or not!).

(Hooray that GIF is on its way out.)

evilspoons
  • I'd go so far as to say JPEG2000 isn't universal, so the list is really down to just JPEG and PNG. – Jonathan Grynspan Dec 29 '10 at 16:56
  • The patents on LZW have expired, at least in parts of Europe, so there's no real reason to avoid GIF except for its limited color space. And that can be circumvented (though rather ugly). – onemasse Dec 29 '10 at 17:00
  • It's for an embedded linux robot. – Richard Knop Dec 29 '10 at 17:01
  • TIFF is still around somehow; I seem to keep running into it with scanners. Also non-lossy. – old_timer Dec 29 '10 at 17:48
  • DCT-compressed images can be put in a TIFF container, so technically TIFF can be either lossy or non-. Doesn't change the baseline observation that DCT is just about the only game in town for lossy image compression, though. – zwol Dec 29 '10 at 17:51
2

JPEG 2000 isn't faster at all. Is it encoding or decoding that's not fast enough with JPEG? You could probably be a lot faster by doing only a 4x4 FDCT and IDCT on JPEG.

It's hard to find any documentation on IJG libjpeg, but if you use it, try lowering the quality setting; it might make it faster. There also seems to be a fast FDCT option.
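
A rough sketch of those two knobs with the plain libjpeg API (`jpeg_mem_dest` assumes libjpeg 8 or libjpeg-turbo; older versions need a custom destination manager):

```c
/* Sketch: libjpeg compression with a lowered quality setting and the
 * fast (less accurate) DCT method. Link with -ljpeg. Error handling omitted. */
#include <stdio.h>
#include <stdlib.h>
#include <jpeglib.h>

int main(void)
{
    const int width = 640, height = 480;
    unsigned char *rgb = malloc(width * height * 3);   /* camera frame would go here */

    struct jpeg_compress_struct cinfo;
    struct jpeg_error_mgr jerr;
    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_compress(&cinfo);

    unsigned char *out = NULL;
    unsigned long out_size = 0;
    jpeg_mem_dest(&cinfo, &out, &out_size);   /* libjpeg 8 / libjpeg-turbo only */

    cinfo.image_width      = width;
    cinfo.image_height     = height;
    cinfo.input_components = 3;
    cinfo.in_color_space   = JCS_RGB;
    jpeg_set_defaults(&cinfo);
    jpeg_set_quality(&cinfo, 50, TRUE);       /* lower quality -> fewer bits to code */
    cinfo.dct_method = JDCT_FASTEST;          /* the "fast FDCT" option */

    jpeg_start_compress(&cinfo, TRUE);
    while (cinfo.next_scanline < cinfo.image_height) {
        JSAMPROW row = rgb + cinfo.next_scanline * width * 3;
        jpeg_write_scanlines(&cinfo, &row, 1);
    }
    jpeg_finish_compress(&cinfo);

    /* out / out_size now hold the JPEG data */
    jpeg_destroy_compress(&cinfo);
    free(out);
    free(rgb);
    return 0;
}
```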

Someone mentioned libjpeg-turbo, which uses SIMD instructions and is compatible with the regular libjpeg. If that's an option for you, I think you should try it.

onemasse
1

I think wavelet-based compression algorithms are in general slower than the ones using DCT. Maybe you should take a look at the JPEG XR and WebP formats.
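
If WebP is worth evaluating, a minimal sketch with libwebp's one-shot encoder looks roughly like this (the quality factor of 75 is arbitrary, and whether it actually encodes faster than JPEG on that hardware would need measuring):

```c
/* Sketch: encode one 640x480 RGB frame to WebP with libwebp's simple API.
 * Link with -lwebp. Error handling omitted. */
#include <stdlib.h>
#include <webp/encode.h>

int main(void)
{
    const int width = 640, height = 480;
    uint8_t *rgb = malloc(width * height * 3);   /* camera frame would go here */

    uint8_t *out = NULL;
    size_t out_size = WebPEncodeRGB(rgb, width, height,
                                    width * 3 /* stride */, 75.0f /* quality */, &out);

    /* out / out_size hold the WebP bitstream (out_size is 0 on failure) */

    WebPFree(out);   /* older libwebp versions used plain free(out) */
    free(rgb);
    return 0;
}
```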

Tamás
1

You could simply resize the image to a smaller one if you don't require the full image fidelity. Averaging every 2x2 block into a single pixel will reduce the size to 1/4 very quickly.
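
A rough sketch of that 2x2 averaging for a packed RGB24 buffer (assuming even width and height); whether plain C like this beats a tuned JPEG encoder is exactly what the comments below debate:

```c
/* Sketch: halve an RGB24 image in each dimension by averaging 2x2 blocks.
 * src is w*h*3 bytes, dst must hold (w/2)*(h/2)*3 bytes; w and h assumed even. */
#include <stddef.h>
#include <stdint.h>

void downscale_2x2_rgb24(const uint8_t *src, uint8_t *dst, int w, int h)
{
    for (int y = 0; y < h; y += 2) {
        const uint8_t *row0 = src + (size_t)y * w * 3;
        const uint8_t *row1 = row0 + (size_t)w * 3;
        for (int x = 0; x < w; x += 2) {
            for (int c = 0; c < 3; ++c) {            /* R, G, B channels */
                int sum = row0[3 * x + c] + row0[3 * (x + 1) + c]
                        + row1[3 * x + c] + row1[3 * (x + 1) + c];
                *dst++ = (uint8_t)((sum + 2) / 4);   /* rounded average of the block */
            }
        }
    }
}
```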

Mark Ransom
  • Unless you write some extremely optimized code to do the downscaling, performing the jpeg compression with `libavcodec` will probably take less time than your downscaling code. – R.. GitHub STOP HELPING ICE Dec 29 '10 at 17:45
  • @R, isn't the algorithm I suggested capable of being extremely optimized very easily? – Mark Ransom Dec 29 '10 at 17:59
  • Yes if you write it in asm, but I doubt a pure C implementation of that downscaling algorithm can beat libavcodec's jpeg encoder, at least not with current compiler technology. – R.. GitHub STOP HELPING ICE Dec 29 '10 at 19:31
  • @R, interesting. I might have to try it for myself. – Mark Ransom Dec 29 '10 at 19:35
  • You need (low pass) filtering prior to downsampling, and that will be the expensive part. (Averaging is a very poor low pass filter.) – Paul R Dec 30 '10 at 08:16
  • @Paul R, Averaging might not be the best low pass filter, but I would hardly call it "very poor". Other methods would certainly be much slower. – Mark Ransom Dec 30 '10 at 15:04
  • @Mark Ransom: using averaging will typically leave visible artefacts due to significant energy above Nyquist. If you don't care about image quality and visible artefacts then it might be good enough, but for a computer vision application it might cause problems with things like edge detection. – Paul R Dec 30 '10 at 20:30