I have a Mac (not iOS) application that lets the user select one or more images with NSOpenPanel. What I'm having trouble with is getting the correct dimensions for multi-layered images. If an image contains a single layer or is compressed, the following gets me the correct image dimensions from the file path.
NSImage *image0 = [[NSImage alloc] initWithContentsOfFile:path];
CGFloat w = image0.size.width;
CGFloat h = image0.size.height;
But if I select an image that has multiple layers, I get strange numbers. For example, I have a single-layer image whose dimensions are 1,440 x 900 px according to Fireworks. If I add a small circle layer, save the image as a PNG, and read it, I get 1,458 x 911 px. This topic and this topic suggest reading the largest layer. Okay. So I've written the following function.
- (CGSize)imageSizeForFile:(NSString *)filepath {
    // Walk every image rep in the file and keep the largest pixel dimensions.
    NSArray *imageReps = [NSBitmapImageRep imageRepsWithContentsOfFile:filepath];
    NSInteger width = 0;
    NSInteger height = 0;
    for (NSImageRep *imageRep in imageReps) {
        if ([imageRep pixelsWide] > width)  width  = [imageRep pixelsWide];
        if ([imageRep pixelsHigh] > height) height = [imageRep pixelsHigh];
    }
    return CGSizeMake((CGFloat)width, (CGFloat)height);
}
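In case it matters, I also tried going through ImageIO instead of AppKit. This is only a sketch, and it assumes (which I can't confirm for Fireworks PNGs) that the image at index 0 is the main composite:

```objc
#import <Foundation/Foundation.h>
#import <ImageIO/ImageIO.h>

// Sketch: read the pixel dimensions of the first image in the file via
// ImageIO, bypassing NSImage's point-based -size. Assumption: index 0
// is the main composite image, which may not hold for every format.
static CGSize PixelSizeOfFirstImage(NSString *filepath) {
    NSURL *url = [NSURL fileURLWithPath:filepath];
    CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);
    if (source == NULL) return CGSizeZero;

    CGSize size = CGSizeZero;
    CFDictionaryRef props = CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    if (props != NULL) {
        NSDictionary *properties = (__bridge NSDictionary *)props;
        size.width  = [properties[(__bridge NSString *)kCGImagePropertyPixelWidth] doubleValue];
        size.height = [properties[(__bridge NSString *)kCGImagePropertyPixelHeight] doubleValue];
        CFRelease(props);
    }
    CFRelease(source);
    return size;
}
```

I'm not sure whether ImageIO treats the extra layer data differently from NSBitmapImageRep, so I mention it only for completeness.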
Using the function above, I still get the wrong dimensions (1,458 x 911 px) instead of 1,440 x 900 px. Actually, I had the same problem when I was developing Mac applications with REAL Stupid until a few years ago. So how can I get the correct dimensions when an image contains multiple layers?
Thank you for your advice.