I've got a method for masking a B&W image by cutting out (i.e., making transparent) any pixels that are above or below a certain brightness. The result would be the same B&W image, but with everything above 70% or below 25% brightness (or whatever thresholds you choose) changed to transparent.
It was working perfectly on iOS 11, but it broke on iOS 12. It now returns the original, solid image with no modifications every time.
- (UIImage *)imageWithLumaMaskFromDark:(CGFloat)lumaFloor toLight:(CGFloat)lumaCeil {
    // Inputs range from 0 - 255.
    CGImageRef rawImageRef = self.CGImage;

    // Min/max masking range for each of the R, G, and B components.
    const CGFloat colorMasking[6] = {lumaFloor, lumaCeil, lumaFloor, lumaCeil, lumaFloor, lumaCeil};

    UIGraphicsBeginImageContext(self.size);
    CGImageRef maskedImageRef = CGImageCreateWithMaskingColors(rawImageRef, colorMasking);

    // Flip the context vertically so the CGImage isn't drawn upside down.
    CGContextTranslateCTM(UIGraphicsGetCurrentContext(), 0.0, self.size.height);
    CGContextScaleCTM(UIGraphicsGetCurrentContext(), 1.0, -1.0);

    CGContextDrawImage(UIGraphicsGetCurrentContext(),
                       CGRectMake(0, 0, self.size.width, self.size.height),
                       maskedImageRef);

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    CGImageRelease(maskedImageRef);
    UIGraphicsEndImageContext();
    return result;
}
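For reference, a typical call site looks something like this (the 25% / 70% thresholds from above expressed on the 0-255 scale; "some-bw-image" and sourceImage are just placeholders for whatever B&W image I'm actually masking):

    // Hypothetical usage of the category method above.
    UIImage *sourceImage = [UIImage imageNamed:@"some-bw-image"];
    UIImage *masked = [sourceImage imageWithLumaMaskFromDark:0.25 * 255.0
                                                     toLight:0.70 * 255.0];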
I'm an experienced iOS dev but a complete CGImage/CGContext noob. Can anyone help me figure out what could have broken with this method in iOS 12, and what I can do to fix it?