I need to create a movie from a series of generated images. (I'm creating the images based on the output of a physics modeling program.)
I found Apple's sample in QtKitCreateMovie and used that as a starting point. Instead of loading JPEGs from the application bundle, I'm drawing to an NSImage and then adding that NSImage to the movie object. Here's the basic code I used for testing; mMovie is an instance of QTMovie:
NSImage *anImage = [[NSImage alloc] initWithSize:NSMakeSize(frameSize, frameSize)];
[anImage lockFocus];
float blendValue;
for (blendValue = 0.0; blendValue <= 1.0; blendValue += 0.05) {
    [[[NSColor blueColor] blendedColorWithFraction:blendValue ofColor:[NSColor redColor]] setFill];
    [NSBezierPath fillRect:NSMakeRect(0, 0, frameSize, frameSize)];
    [mMovie addImage:anImage forDuration:duration withAttributes:myDict];
}
[anImage unlockFocus];
[anImage release];
This works under OS X 10.5, but under OS X 10.6 the call to addImage:forDuration:withAttributes: throws an array-index-beyond-bounds exception (http://openradar.appspot.com/radar?id=1146401).
What's the proper way to create a movie under 10.6?
Also, although this works under 10.5, I run out of memory when I try to create a movie with thousands of frames. That also makes me think I'm not using the correct approach.
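For what it's worth, here is the variant I've been sketching. It's untested and based only on my reading of the QTKit headers: it assumes initToWritableFile:error: gives a file-backed movie that addImage: can append to, and that updateMovieFile flushes appended frames to disk so they don't accumulate in memory. Each frame gets its own autorelease pool, and drawing is finished (unlockFocus) before the image is handed to QTKit:

// Untested sketch: write frames to a file-backed QTMovie instead of an
// in-memory one, flushing periodically to bound memory use.
#import <QTKit/QTKit.h>

QTMovie *movie = [[QTMovie alloc] initToWritableFile:@"/tmp/output.mov" error:nil];
NSDictionary *attrs = [NSDictionary dictionaryWithObject:@"jpeg"
                                                  forKey:QTAddImageCodecType];
QTTime frameDuration = QTMakeTime(1, 10); // 1/10 s per frame (placeholder)

float blendValue;
for (blendValue = 0.0; blendValue <= 1.0; blendValue += 0.05) {
    // Scope each frame's temporaries so they're released every iteration.
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    NSImage *frame = [[NSImage alloc] initWithSize:NSMakeSize(frameSize, frameSize)];
    [frame lockFocus];
    [[[NSColor blueColor] blendedColorWithFraction:blendValue
                                           ofColor:[NSColor redColor]] setFill];
    [NSBezierPath fillRect:NSMakeRect(0, 0, frameSize, frameSize)];
    [frame unlockFocus]; // finish drawing before handing the image to QTKit

    [movie addImage:frame forDuration:frameDuration withAttributes:attrs];
    [frame release];

    [movie updateMovieFile]; // flush what's been appended so far
    [pool release];
}
[movie release];

Is something along these lines the intended pattern, or is there a better API for frame-by-frame movie creation?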