
I am grabbing PIL images from the screen with ImageGrab, saving them in a queue and writing them out as a JPEG image sequence.

I use a producer thread to capture and a worker thread to write the images to disk.

However, I noticed that this queue gets really large really fast, even though the written output is not actually that large once compressed as JPEG. That causes the grabs to spill into swap space on disk, which makes the write process even slower. Since my data comes in bursts, I can use some of the quiet time to write to disk, but once memory starts getting swapped out it is just too slow.

Is there a way to compress the images before adding them to the queue?
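
(A minimal sketch of what that could look like: the producer JPEG-encodes each grab into an in-memory buffer with io.BytesIO before putting it on the queue, so the queue holds small byte strings rather than raw PIL images. The function and file names below are illustrative placeholders, not taken from the question.)

    import io
    import queue
    import threading

    from PIL import ImageGrab

    frame_queue = queue.Queue()

    def producer(num_frames=100):
        """Grab the screen and queue JPEG-encoded bytes instead of raw PIL images."""
        for _ in range(num_frames):
            img = ImageGrab.grab().convert("RGB")     # ensure a JPEG-compatible mode
            buf = io.BytesIO()
            img.save(buf, format="JPEG", quality=85)  # compress in memory
            frame_queue.put(buf.getvalue())           # queue holds small byte strings
        frame_queue.put(None)                         # sentinel: capture finished

    def writer():
        """Drain the queue and dump each JPEG byte string straight to disk."""
        index = 0
        while True:
            data = frame_queue.get()
            if data is None:
                break
            with open("frame_%05d.jpg" % index, "wb") as f:
                f.write(data)
            index += 1

    writer_thread = threading.Thread(target=writer)
    writer_thread.start()
    producer()
    writer_thread.join()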

cheers,

tarrasch

1 Answer


Here's an idea: merge the images as they come in.

After a set period of time or a set number of frames merged, compress the combined image. Later, divide it back into the separate frames.

/profit
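
Roughly what that could look like with PIL, assuming all grabs share the same dimensions (merge, compress and split are hypothetical helpers, not part of the original answer):

    import io
    from PIL import Image

    def merge(images):
        """Stack same-sized frames vertically into one tall sheet.

        Note: JPEG caps each dimension at 65,535 px, so keep batches small.
        """
        w, h = images[0].size
        sheet = Image.new("RGB", (w, h * len(images)))
        for i, img in enumerate(images):
            sheet.paste(img, (0, i * h))
        return sheet

    def compress(img, quality=85):
        """JPEG-encode a PIL image into an in-memory byte string."""
        buf = io.BytesIO()
        img.save(buf, format="JPEG", quality=quality)
        return buf.getvalue()

    def split(jpeg_bytes, frame_height):
        """Decode a merged JPEG and crop it back into separate frames."""
        sheet = Image.open(io.BytesIO(jpeg_bytes))
        w, total_h = sheet.size
        return [sheet.crop((0, y, w, y + frame_height))
                for y in range(0, total_h, frame_height)]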

A T
  • Unfortunately that does not reduce the amount of memory they use. – tarrasch Jan 23 '12 at 07:32
  • You will only have one image in memory. When its size reaches a certain threshold, store it in a file. When the number of filenames stored in your queue reaches a certain count, start processing them. Alternatively, when idle time reaches a certain length, start processing them. – A T Jan 23 '12 at 12:10
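
A sketch of that flushing scheme, again assuming same-sized frames: grabs accumulate in memory, every BATCH_SIZE of them is pasted into one sheet and written out as a single JPEG, and only the filename goes on the queue for later processing (BATCH_SIZE and the names are illustrative, not from the comment).

    import queue
    from PIL import Image

    BATCH_SIZE = 50            # flush threshold; the "certain number" is arbitrary here
    pending = []               # frames currently held in memory
    filenames = queue.Queue()  # the queue now holds only short filename strings
    sheet_index = 0

    def add_frame(img):
        """Accumulate frames and flush one compressed sheet per BATCH_SIZE frames."""
        global sheet_index
        pending.append(img)
        if len(pending) >= BATCH_SIZE:
            w, h = pending[0].size
            sheet = Image.new("RGB", (w, h * len(pending)))
            for i, frame in enumerate(pending):          # inline merge step
                sheet.paste(frame, (0, i * h))
            name = "sheet_%04d.jpg" % sheet_index
            sheet.save(name, quality=85)                 # one JPEG write per batch
            filenames.put(name)                          # a worker later splits/processes these
            del pending[:]
            sheet_index += 1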