

I am trying to `dispatch_async` the drawing code I posted at https://stackoverflow.com/questions/34430468/while-profiling-with-instruments-i-see-a-lot-of-cpu-consuming-task-happening-w, but I get the error "No matching function for call to 'dispatch_async'". Since this is a memory-expensive operation, I am trying to create a queue so the rendering happens in the background and then, when the image is ready, hand it back to the main queue, because UI updates must happen on the main thread. Please guide me on this. I am posting the whole code:

#pragma mark Blurring the image
- (UIImage *)blurWithCoreImage:(UIImage *)sourceImage
{
    // Set up output context.
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
    dispatch_async(queue, ^{
        CIImage *inputImage = [CIImage imageWithCGImage:sourceImage.CGImage];

        // Apply Affine-Clamp filter to stretch the image so that it does not
        // look shrunken when gaussian blur is applied
        CGAffineTransform transform = CGAffineTransformIdentity;
        CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
        [clampFilter setValue:inputImage forKey:@"inputImage"];
        [clampFilter setValue:[NSValue valueWithBytes:&transform objCType:@encode(CGAffineTransform)] forKey:@"inputTransform"];

        // Apply gaussian blur filter with radius of 30
        CIFilter *gaussianBlurFilter = [CIFilter filterWithName: @"CIGaussianBlur"];
        [gaussianBlurFilter setValue:clampFilter.outputImage forKey: @"inputImage"];
        [gaussianBlurFilter setValue:@10 forKey:@"inputRadius"]; //30

        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgImage = [context createCGImage:gaussianBlurFilter.outputImage fromRect:[inputImage extent]];
        UIGraphicsBeginImageContext(self.view.frame.size);
        CGContextRef outputContext = UIGraphicsGetCurrentContext();

        // Invert image coordinates
        CGContextScaleCTM(outputContext, 1.0, -1.0);
        CGContextTranslateCTM(outputContext, 0, -self.view.frame.size.height);

        // Draw base image.
        CGContextDrawImage(outputContext, self.view.frame, cgImage);

        // Apply white tint
        CGContextSaveGState(outputContext);
        CGContextSetFillColorWithColor(outputContext, [UIColor colorWithWhite:1 alpha:0.2].CGColor);
        CGContextFillRect(outputContext, self.view.frame);
        CGContextRestoreGState(outputContext);

        dispatch_async(dispatch_get_main_queue(), ^{
            UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();

            return outputImage;
        })

    });



// Output image is ready.


}

It throws the error on `dispatch_async(dispatch_get_main_queue(), ...)`, i.e. when I try to bring the result back to the main thread, since UI work happens on the main thread. What am I missing?

Kumar Utsav
  • You should not call this kind of procedure from the main thread in the first place. A better solution is to encapsulate the code that consumes the result of this procedure as a continuation, i.e. a completion block in Objective-C terms. Asynchronously dispatched code never returns to the context it was fired from, unless extra synchronization is used to block that thread, which is not desirable in this case. – ZhangChn Dec 23 '15 at 11:02
  • You are missing a semicolon at the end of your `dispatch_async` call, e.g. `dispatch_async(dispatch_get_main_queue(), ^{ ... });`. – Rob Dec 23 '15 at 11:31

2 Answers


See this answer to a similar question:

Is this Core Graphics code thread safe?

You start drawing on one thread, then finish it on another thread. That's a ticking time bomb.

In addition, the `return outputImage` performed on the main thread isn't going to do you any good, because there is nobody to receive that return value. You should do all your drawing on the same thread, extract the image, and then call something on the main thread that processes the completed image.
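A minimal sketch of that structure (the method and block names here are hypothetical, and it assumes the drawing code from the question is moved wholesale inside the background block):

```objectivec
// Hypothetical completion-block variant: all drawing happens on one
// background queue; only the finished UIImage crosses to the main queue.
- (void)blurImage:(UIImage *)sourceImage
       completion:(void (^)(UIImage *result))completion
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        // ... all of the CIFilter / Core Graphics drawing from the question,
        // ending with:
        UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        dispatch_async(dispatch_get_main_queue(), ^{
            if (completion) completion(outputImage); // consume on the main thread
        }); // note the closing semicolon the original was missing
    });
}
```

The caller then does its UI work inside the block, e.g. `[self blurImage:img completion:^(UIImage *result) { imageView.image = result; }];`, instead of expecting a return value from an asynchronous call.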

gnasher729
  • You were right. I ran the code asynchronously, only to find the blur effect was gone. Now that you have said it must all be on the same thread: is it possible to pin it to a single thread other than the main thread? Just asking before I give up hope of optimizing these lines of code. – Kumar Utsav Dec 23 '15 at 13:43
  • "That's a ticking time bomb." No, Core Graphics is generally thread safe as long as no data is shared concurrently between threads. However, it is wrong to design the API the way the OP did: the result should be passed to a continuation/completion block rather than returned directly, because the calling context no longer exists by then. – ZhangChn Dec 24 '15 at 13:27

I think your code is fine, but the way you are using it may be wrong. Please try it like below.

Create a method like below:

- (UIImage *)blurWithCoreImage:(UIImage *)sourceImage
{
    // Note: the dispatch calls have been removed from inside this method;
    // the whole method is dispatched at the call site instead (see below).
    CIImage *inputImage = [CIImage imageWithCGImage:sourceImage.CGImage];

    // Apply an Affine-Clamp filter to stretch the image so that it does not
    // look shrunken when the Gaussian blur is applied.
    CGAffineTransform transform = CGAffineTransformIdentity;
    CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
    [clampFilter setValue:inputImage forKey:@"inputImage"];
    [clampFilter setValue:[NSValue valueWithBytes:&transform objCType:@encode(CGAffineTransform)] forKey:@"inputTransform"];

    // Apply a Gaussian blur filter with a radius of 10.
    CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [gaussianBlurFilter setValue:clampFilter.outputImage forKey:@"inputImage"];
    [gaussianBlurFilter setValue:@10 forKey:@"inputRadius"];

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:gaussianBlurFilter.outputImage fromRect:[inputImage extent]];
    UIGraphicsBeginImageContext(self.view.frame.size);
    CGContextRef outputContext = UIGraphicsGetCurrentContext();

    // Invert image coordinates.
    CGContextScaleCTM(outputContext, 1.0, -1.0);
    CGContextTranslateCTM(outputContext, 0, -self.view.frame.size.height);

    // Draw the base image.
    CGContextDrawImage(outputContext, self.view.frame, cgImage);

    // Apply a white tint.
    CGContextSaveGState(outputContext);
    CGContextSetFillColorWithColor(outputContext, [UIColor colorWithWhite:1 alpha:0.2].CGColor);
    CGContextFillRect(outputContext, self.view.frame);
    CGContextRestoreGState(outputContext);

    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    CGImageRelease(cgImage); // createCGImage: returns a +1 reference; release it
    return outputImage;
}

and call this method like below:

dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
dispatch_async(queue, ^{
    UIImage *img = [self blurWithCoreImage:[UIImage imageNamed:@"imagename.png"]];
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.view addSubview:[[UIImageView alloc] initWithImage:img]];
    });
});

I just tried this for testing and it gave me the proper result, so give it a try.

Result of above code

Let me know if you face any issues. All the best!

Govindarao Kondala
  • The code compiles once you mark the image you want to mutate inside the block with `__block`. But the end result is hopeless, as the blur effect doesn't happen. – Kumar Utsav Dec 23 '15 at 13:43
  • The syntax works perfectly fine. Alas, the functionality is useless. Allow me to explain: the blurriness appears only after 2 to 3 seconds, which neither the user nor I want. It has to be done on the main thread; lesson learned. – Kumar Utsav Dec 23 '15 at 13:58
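One possible contributor to that 2 to 3 second delay, offered as an assumption rather than something verified in this thread: `[CIContext contextWithOptions:nil]` is re-created on every call, and a `CIContext` is expensive to set up, so Apple's Core Image documentation recommends creating one and reusing it. A sketch of caching it:

```objectivec
// Sketch: reuse a single CIContext instead of creating one per blur call.
// dispatch_once guarantees the context is created exactly once, thread-safely.
+ (CIContext *)sharedBlurContext
{
    static CIContext *context = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        context = [CIContext contextWithOptions:nil];
    });
    return context;
}
```

The blur method would then call `[[self class] sharedBlurContext]` in place of `[CIContext contextWithOptions:nil]`. This does not change where the work runs, but it can shave a noticeable amount off each render.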