
I believe the NDA is down, so I can ask this question. I have a UIView subclass:

BlurView *blurredView = ((BlurView *)[self.view snapshotViewAfterScreenUpdates:NO]);
blurredView.frame = self.view.frame;
[self.view addSubview:blurredView];

It does its job so far in capturing the screen, but now I want to blur that view. How exactly do I go about this? From what I've read I need to capture the current contents of the view (context?!) and convert it to CIImage (no?) and then apply a CIGaussianBlur to it and draw it back on the view.

How exactly do I do that?

P.S. The view is not animated, so it should be OK performance wise.
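For what it's worth, here is my rough understanding of that Core Image pipeline in code (untested, and `snapshotImage` is just a placeholder for whatever UIImage I manage to capture):

```objc
// Untested sketch: run the captured snapshot through CIGaussianBlur
CIContext *ciContext = [CIContext contextWithOptions:nil];
CIImage *inputImage = [CIImage imageWithCGImage:snapshotImage.CGImage];

CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setValue:inputImage forKey:kCIInputImageKey];
[blurFilter setValue:@10.0 forKey:kCIInputRadiusKey];

// The blur grows the image's extent, so crop back to the original rect
CIImage *outputImage = [blurFilter valueForKey:kCIOutputImageKey];
CGImageRef cgImage = [ciContext createCGImage:outputImage fromRect:inputImage.extent];
UIImage *blurredImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
```

Is that roughly the right idea?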

EDIT: Here is what I have so far. The problem is that I can't capture the snapshot to a UIImage, I get a black screen. But if I add the view as a subview directly, I can see the snapshot is there.

// Snapshot
UIView *view = [self.view snapshotViewAfterScreenUpdates:NO];

// Convert to UIImage
UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, 0.0);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Apply the UIImage to a UIImageView
BlurView *blurredView = [[BlurView alloc] initWithFrame:CGRectMake(0, 0, 500, 500)];
[self.view addSubview:blurredView];
blurredView.imageView.image = img;

// Black screen -.-

BlurView.m:

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];

    if (self) {
        // Initialization code
        self.imageView = [[UIImageView alloc] init];
        self.imageView.frame = CGRectMake(20, 20, 200, 200);
        [self addSubview:self.imageView];
    }
    return self;
}
Nikolay Dyankov

4 Answers


Half of this question didn't get answered, so I thought it worth adding.

The problem with UIScreen's

- (UIView *)snapshotViewAfterScreenUpdates:(BOOL)afterUpdates

and UIView's

- (UIView *)resizableSnapshotViewFromRect:(CGRect)rect 
                      afterScreenUpdates:(BOOL)afterUpdates 
                           withCapInsets:(UIEdgeInsets)capInsets

is that you can't derive a UIImage from them - the 'black screen' problem.

In iOS 7, Apple provides a third piece of API for extracting UIImages, a method on UIView:

- (BOOL)drawViewHierarchyInRect:(CGRect)rect 
             afterScreenUpdates:(BOOL)afterUpdates  

It is not as fast as snapshotView, but not bad compared to renderInContext (in the example provided by Apple, it is five times faster than renderInContext and three times slower than snapshotView).

Example use:

 UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0);
 [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
 UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
 UIGraphicsEndImageContext();

Then to get a blurred version

 UIImage* lightImage = [newImage applyLightEffect];  

where applyLightEffect is one of the blur methods from Apple's UIImage+ImageEffects category mentioned in the accepted answer (the enticing link to this code sample in the accepted answer doesn't work, but this one will get you to the right page: the file you want is iOS_UIImageEffects).

The main reference is to WWDC2013 session 226, Implementing Engaging UI on iOS

By the way, there is an intriguing note in Apple's reference docs for renderInContext that hints at the black-screen problem:

Important: The OS X v10.5 implementation of this method does not support the entire Core Animation composition model. QCCompositionLayer, CAOpenGLLayer, and QTMovieLayer layers are not rendered. Additionally, layers that use 3D transforms are not rendered, nor are layers that specify backgroundFilters, filters, compositingFilter, or a mask values. Future versions of OS X may add support for rendering these layers and properties.

The note hasn't been updated since 10.5, so I guess 'future versions' may still be a while off, and we can add our new CASnapshotLayer (or whatever) to the list.

foundry
  • I wasn't really getting anything blurry with this method. You get a much nicer effect by using UIVisualEffect: https://stackoverflow.com/a/40016528/35690 – Senseful Jul 12 '18 at 17:33

Sample code from WWDC: iOS_UIImageEffects

There is a UIImage category named UIImage+ImageEffects

Here is its API:

- (UIImage *)applyLightEffect;
- (UIImage *)applyExtraLightEffect;
- (UIImage *)applyDarkEffect;
- (UIImage *)applyTintEffectWithColor:(UIColor *)tintColor;

- (UIImage *)applyBlurWithRadius:(CGFloat)blurRadius 
                       tintColor:(UIColor *)tintColor 
           saturationDeltaFactor:(CGFloat)saturationDeltaFactor 
                       maskImage:(UIImage *)maskImage;

For legal reasons I can't show the implementation here, but there is a demo project included. It should be pretty easy to get started with.
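To show what calling it might look like, here is a hypothetical snippet (it assumes the two category files from the sample are added to your project; `snapshot` and the parameter values are just examples):

```objc
#import "UIImage+ImageEffects.h"

// snapshot is some UIImage you captured earlier
UIColor *tint = [UIColor colorWithWhite:1.0 alpha:0.3];
UIImage *blurred = [snapshot applyBlurWithRadius:8.0
                                       tintColor:tint
                           saturationDeltaFactor:1.8
                                       maskImage:nil];
```

As far as I can tell, the convenience methods like applyLightEffect just call applyBlurWithRadius:tintColor:saturationDeltaFactor:maskImage: with preset values.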

Alex Cio
Kyle Fang
  • That answers only half of my question. I still need to get a UIImage to apply those effects to, and this method gives me a black image: http://stackoverflow.com/questions/4334233/how-to-capture-uiview-to-uiimage-without-loss-of-quality-on-retina-display – Nikolay Dyankov Sep 22 '13 at 09:38
  • If you only want the image, there is no need for the snapshot API. just render the image directly on `self.view`? – Kyle Fang Sep 22 '13 at 09:49
  • So where do I get the image if I don't snapshot the screen? – Nikolay Dyankov Sep 22 '13 at 09:51
  • `UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0);` then `[self.view.layer renderInContext:UIGraphicsGetCurrentContext()]` – Kyle Fang Sep 22 '13 at 09:52
  • @NikolayDyankov Have you find a solution for this black image? – m8labs Oct 14 '13 at 16:29
  • @NikolayDyankov I don't think this will ever work. I think the reason this is not working (black screen) is that the view returned by snapshot is actually NOT a full fledged view. If you step through the debugger, you will see that its type is '_UIReplicantView'. This makes me think that it is just an optimized copy of the other view and doesn't have the same 'layer' characteristics of a 'full' UIView... – MobileVet Nov 26 '13 at 16:28

To summarize how to do this with foundry's sample code, use the following:

I wanted to blur the entire screen just slightly, so for my purposes I'll use the main screen bounds.

CGRect screenCaptureRect = [UIScreen mainScreen].bounds;
UIView *viewWhereYouWantToScreenCapture = [[UIApplication sharedApplication] keyWindow];

//screen capture code
UIGraphicsBeginImageContextWithOptions(screenCaptureRect.size, NO, [UIScreen mainScreen].scale);
[viewWhereYouWantToScreenCapture drawViewHierarchyInRect:screenCaptureRect afterScreenUpdates:NO];
UIImage *capturedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

//blur code
UIColor *tintColor = [UIColor colorWithWhite:1.0 alpha:0];
UIImage *blurredImage = [capturedImage applyBlurWithRadius:1.5 tintColor:tintColor saturationDeltaFactor:1.2 maskImage:nil];
//or use [capturedImage applyLightEffect] but I thought that was too much for me

//use blurredImage in whatever way you so desire!

Notes on the screen capture part

UIGraphicsBeginImageContextWithOptions()'s 2nd argument is the opaque flag. It should be NO unless nothing in your capture has an alpha other than 1. If you pass YES, the screen capture ignores transparency values, so it will be faster but will probably be wrong.

UIGraphicsBeginImageContextWithOptions()'s 3rd argument is the scale. You probably want to pass the device's scale like I did, to differentiate between Retina and non-Retina. I haven't really tested this, but I think 0.0f also works (it defaults to the main screen's scale).

drawViewHierarchyInRect:afterScreenUpdates: - watch what you pass for the screen-updates BOOL. I tried to do this right before backgrounding, and if I didn't pass NO the app would go crazy with glitches when I returned to the foreground. You might be able to get away with YES, though, if you're not leaving the app.

Notes on blurring

I have a very light blur here. Changing the blurRadius will make it blurrier, and you can change the tint color and alpha to make all sorts of other effects.

Also you need to add a category for the blur methods to work...

How to add the UIImage+ImageEffects category

You need to download the category UIImage+ImageEffects for the blur to work. Download it here after logging in: https://developer.apple.com/downloads/index.action?name=WWDC%202013

Search for "UIImageEffects" and you'll find it. Just pull out the two necessary files, UIImage+ImageEffects.h and UIImage+ImageEffects.m, and add them to your project.

Also, I had to enable modules in my build settings because my project wasn't created with Xcode 5. To do this, go to your target's build settings, search for "modules", and make sure that "Enable Modules" and "Link Frameworks Automatically" are both set to Yes, or you'll get compiler errors with the new category.

Good luck blurring!

teradyl

Check the WWDC 2013 sample application "Running with a Snap".

The blurring is implemented there as a category on UIImage.

Sulthan