
The UIVisualEffectView blur styles do a LOT of blurring. I want to control the amount of blur. How do I do that?

(Note that this is a "ringer" question, to provide a new answer to this question, with updated information. Most of the other questions and answers are quite old, and the answers have real limitations.)

Duncan C
  • Does this answer your question? [How to set the BlurRadius of UIBlurEffectStyle.Light](https://stackoverflow.com/questions/25529500/how-to-set-the-blurradius-of-uiblureffectstyle-light) – Eugene Dudnyk Aug 15 '20 at 21:30
  • Which answer? Altering the alpha of the visual effect view is not recommended by apple and does not create the desired effect. Using UIViewPropertyAnimator to freeze an animation of the view probably works, but that is a hack that does not offer fine control. My answer below lets you create a blur with whatever radius you want. – Duncan C Aug 15 '20 at 22:01
  • That text of my comment was inserted by SO automatically when I made close vote. You could add your answer in that question, to avoid duplication. – Eugene Dudnyk Aug 15 '20 at 22:04
  • @EugeneDudnyk "That text of my comment was inserted by SO automatically" Yes but you can change it, or even delete it. – matt Aug 16 '20 at 17:43
  • I didn't even see this text when I was voting for close from StackExchange iOS app... And I can't edit it now :) – Eugene Dudnyk Aug 16 '20 at 20:06

1 Answer


The best you can do with a `UIVisualEffectView` set to blur is to use the new material styles: `systemUltraThinMaterial`, `systemThinMaterial`, `systemMaterial`, `systemThickMaterial`, and `systemChromeMaterial` (and their light and dark variants).

The blur amount is smallest for the ultra-thin material styles and greatest for the thick material styles, but that isn't much control. Sometimes I want a very subtle blur, and even `systemUltraThinMaterial` applies a LOT of blur.
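For reference, a material style is chosen when the effect is created, and there is no public radius parameter to adjust afterward. A minimal setup looks like this (`someContentView` is a placeholder for whatever view you want the effect over):

```swift
import UIKit

// The material style is fixed at creation time; UIVisualEffectView
// exposes no public "blur radius" property to fine-tune it.
let blurEffect = UIBlurEffect(style: .systemUltraThinMaterial)
let effectView = UIVisualEffectView(effect: blurEffect)
effectView.frame = someContentView.bounds
effectView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
someContentView.addSubview(effectView)
```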

Based on another answer on this subject here on SO, I created my own blur view. Rather than creating a view that is placed on top of another view, I created a custom subclass of UIView that blurs its contents. You put whatever subviews you need inside the view, and it adds a blurred layer on top of the view's contents.

NOTE: As Matt pointed out in the comments, my BlurView class does not provide live blurring. It takes a snapshot of the view's contents and subviews, blurs that, and installs the result as a layer on top of the view's content layer. If you need live blurring of changing content, this solution is not for you.

Here is a github repo with a working example using my BlurView class:

https://github.com/DuncanMC/BlurView.git

To use it, simply add the BlurView.swift file to your project.

Then add a UIView to your storyboard and put whatever you want to blur inside it as subviews. (Alternatively, you could subclass BlurView.)

At runtime the BlurView adds a layer on top of the view's content layer that contains a blurred version of the view's contents. The blurLevel property lets you control the radius of the blur effect applied to the image. A value of less than 1 provides a very subtle amount of blur, and the default value of 10 provides a fairly strong amount of blur.

The `blur` Boolean property turns the blur effect on and off.
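The approach described above (snapshot the view, blur it with Core Image, install the result as an overlay layer) can be sketched roughly like this. The `blurLevel` and `blur` property names match the description above, but this is a simplified illustration, not the actual code from the repo:

```swift
import UIKit
import CoreImage

// Illustrative sketch only; the real BlurView class in the repo differs.
class BlurView: UIView {
    /// Radius passed to CIGaussianBlur. Values below 1 give a subtle blur.
    var blurLevel: CGFloat = 10 { didSet { updateBlur() } }
    /// Turns the blur overlay on and off.
    var blur: Bool = true { didSet { updateBlur() } }

    private let blurLayer = CALayer()

    private func updateBlur() {
        // Remove the old overlay first so it isn't captured in the snapshot.
        blurLayer.removeFromSuperlayer()
        guard blur, bounds.width > 0, bounds.height > 0 else { return }

        // 1. Snapshot the view's contents and subviews (not live).
        let snapshot = UIGraphicsImageRenderer(bounds: bounds).image { _ in
            drawHierarchy(in: bounds, afterScreenUpdates: true)
        }

        // 2. Blur the snapshot with a CIGaussianBlur filter.
        guard let input = CIImage(image: snapshot) else { return }
        let blurred = input
            .clampedToExtent() // avoid transparent fringes from the blur kernel
            .applyingFilter("CIGaussianBlur",
                            parameters: [kCIInputRadiusKey: blurLevel])
            .cropped(to: input.extent)

        // 3. Install the result as a layer on top of the content.
        guard let cgImage = CIContext().createCGImage(blurred,
                                                      from: blurred.extent)
        else { return }
        blurLayer.contents = cgImage
        blurLayer.frame = bounds
        layer.addSublayer(blurLayer)
    }
}
```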

The demo app shows a picture, some text, and a view with a background layer and border color.

Here is what the sample view looks like with blurring turned off:

(screenshot)

And here is what the BlurView looks like with a blur radius of 5 (which I consider a fairly strong amount of blurring):

(screenshot)

For comparison, the lightest blur amount among the new-in-iOS-13 material styles on a `UIVisualEffectView`, `systemUltraThinMaterial`, looks like this:

(screenshot)

(The ultra thin material blur option seems to use a blur radius of around 30.)

As Leo points out in his comment, the adaptive forms of `UIVisualEffectView` apply a lightening or darkening to the blur that follows the user's light mode or dark mode setting. If that's important to you, you might want to use the `UIVisualEffectView`.

I suspect it wouldn't be that hard to add a lighten or darken step to my blur view and make it honor the user's light mode/dark mode setting. Core Image has lots of different filters, and it's pretty easy to combine them.

Edit:

Based on LeoDabus's comment below, I decided to add an exposure compensation filter to the BlurView. Below is a screenshot of the sample app with that option. The top image is my BlurView with a blur radius of 25 and an EV value of -1, and the bottom is a UIVisualEffectView in systemUltraThinMaterialDark style. I think they look quite close - but my BlurView lets you get an infinite variety of different looks based on your needs.
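The blur-plus-exposure chain described above can be sketched as a stand-alone function like this. It's a simplified illustration, not the repo's actual code; `CIExposureAdjust` with `kCIInputEVKey` is the standard Core Image way to apply an EV offset:

```swift
import UIKit
import CoreImage

// Sketch: blur an image, then darken (or lighten) it with an EV adjustment.
// An EV of -1 roughly mimics the dark material styles, per the text above.
func blurAndAdjust(_ image: UIImage,
                   radius: CGFloat,
                   ev: CGFloat) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    let adjusted = input
        .clampedToExtent() // avoid transparent fringes at the edges
        .applyingFilter("CIGaussianBlur",
                        parameters: [kCIInputRadiusKey: radius])
        .applyingFilter("CIExposureAdjust",
                        parameters: [kCIInputEVKey: ev])
        .cropped(to: input.extent)

    guard let cgImage = CIContext().createCGImage(adjusted,
                                                  from: adjusted.extent)
    else { return nil }
    return UIImage(cgImage: cgImage)
}
```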

(screenshot)


Edit #2:

After Matt mentioned UIImageEffects in his comments, I found a Swift version of that Apple sample code. It's quite easy to use, and a lot faster than the blur Core Image filter. (Note that the Swift version of UIImageEffects left the default "Copyright <authorname>. All rights reserved." copyright notice, so I would not recommend using it in your projects without getting permission from the author.)

I updated my sample project to include a "Use Core Image" UISwitch. When you turn that switch off, it uses UIImageEffects to do the blurring.

The UIImageEffects library has some presets that create looks approximating light and dark blur modes, but it also includes a number of other options. My sample app is currently set up to imitate the light style at the specified blur radius. (It ignores the brightness adjustment setting in UIImageEffects mode.)
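For a rough idea of the call shape: Apple's original Objective-C category exposes a general method taking a radius, tint color, and saturation factor, plus light/dark presets. A Swift port will typically mirror that, so usage looks roughly like the following; the exact names and signatures depend on which port you use, so treat this as a hypothetical sketch:

```swift
import UIKit

// Hypothetical usage, assuming a Swift port that mirrors Apple's
// Objective-C UIImageEffects sample-code category. `snapshot` is a
// placeholder UIImage; names may differ in the port you adopt.
let blurred = UIImageEffects.imageByApplyingBlur(
    to: snapshot,
    withRadius: 12,
    tintColor: UIColor(white: 1.0, alpha: 0.3), // light-style tint
    saturationDeltaFactor: 1.8,
    maskImage: nil)
```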

The updated version is in a branch called UIImageEffects in the same repo at https://github.com/DuncanMC/BlurView.git

Duncan C
  • I prefer the Adaptable Styles because they are dynamic. If your app supports light and dark modes you should always favor those. The same applies to the dynamic colors for labels and backgrounds – Leo Dabus Aug 15 '20 at 16:50
  • @LeoDabus If your only goal is to use the standard UIVisualEffectView, then sure, use those. I find the level of blur to be way too much however. I wanted the flexibility to be able to blur views by variable amounts, and that's what my class does. I added an exposure adjustment option to the BlurView, and it mimics dark mode pretty well I think. The next step would be to give it an adaptive mode where it honored the user's light mode/dark mode setting. – Duncan C Aug 15 '20 at 21:20
  • You need to be explicit about the fact that using a CIFilter is _slow_ (because of the rendering time). Moreover, you are not actually blurring the background; you are taking a _picture_ of something and blurring that. So your blur is not live; Apple's is, so that if what's behind the blur view changes, that change is "seen" through the blur. – matt Aug 16 '20 at 17:41
  • In effect, I would suggest that this question and answer is not a ringer but a cheat. You didn't answer the question. The answer to how you control the amount of blur in a visual effect view is that you don't. All _your_ answer does is tell us how to blur an image, something that has been well covered here. Apple long ago released the UIImageEffects extension that does a blur convolution directly, and is a lot faster than what you're doing. So there's nothing new in your answer. I'm happy you've made a nice extension for yourself but it would be better to link to it from an existing question. – matt Aug 16 '20 at 17:51
  • @matt "A cheat"? Why a cheat? I posted a question about the limitations of Apple's off-the-shelf component for doing blurring, and developed and published an alternative. The answer about how to control the amount of blur in a UIVisualEffectView is with the different material thickness values it now supports, but those offer a very limited ability to vary the amount of blur. I wasn't familiar with the UIImageEffects extension. That does sound a lot faster, and worth exploring. – Duncan C Aug 16 '20 at 22:54
  • Fair point about my solution not providing live blurring. I'll edit my answer to make that clear. – Duncan C Aug 16 '20 at 22:54
  • @matt, do you know how Apple created UIVisualEffectView so that it serves as a blurring effect that shows the contents underneath? It must somehow composite the views below it, capture them, and display the blurred output into the view. The capturing the composited content of the views/layers below is the bit I'm not clear on. – Duncan C Aug 18 '20 at 02:14
  • They are doing something we are not allowed to do. :( On Mac, you can attach CIFilters to a CALayer to filter what's behind the layer, live. An iOS device doesn't have that kind of power, so those CALayer features are missing. (Well, they are present but inoperative.) See e.g. https://developer.apple.com/documentation/quartzcore/calayer/1410827-backgroundfilters But they made an exception for the blurring views, the system chrome material, etc. I presume they can do that because they can tweak the code for just that exact type of blur so that it can operate live even on an iOS device. – matt Aug 18 '20 at 02:29
  • I know about attaching an array of filters to a layer. It's maddening to read the docs, see the mention of the filters property, get all excited "They finally added it to iOS!" and then see the footnote that it is not supported on iOS. – Duncan C Aug 18 '20 at 11:59
  • I wonder how you'd capture the contents of a rectangular area of your screen (or rather, of your app's view hierarchy) that's under a view. You don't think that's possible with public APIs? – Duncan C Aug 18 '20 at 11:59
  • @matt, you think it wouldn't be possible to duplicate the way UIVisualEffectViews collect the pixels that are visible under them in order to blur/adjust vibrancy? I couldn't think of a way to do it, which is why my solution instead subclasses a UIView that acts as a container the for the content it blurs. – Duncan C Aug 19 '20 at 16:59
  • Well, as I'm sure you're aware, you can render any layer's contents into an graphics context. – matt Aug 19 '20 at 17:09
  • @matt, So I guess I could render the view controller's content view's layer into an offscreen context. Or rather, render the portion of that layer that overlaps with the blur view. I will have to tinker with that. To make it update dynamically, how would you detect and respond to changes in the contents? The brute-force method would be to use a CADisplayLink timer to capture the contents on every screen refresh and look for changes, but that would be quite compute-intensive. – Duncan C Aug 19 '20 at 17:14
  • Well, that's what I said they weren't letting you do. If you could just apply your own compositing filter, your problems would be over; we would just be seeing through the view to whatever is behind it, live, filtered. That is presumably what Apple is doing. But we can't do that. And I think for you to try to work around that is fruitless. – matt Aug 19 '20 at 17:16
  • Although as you point out, the CIGaussianBlur CIFilter is slooooooooooow. It's hard to believe that they could use a CIGaussianBlur filter live. I looked at UIImageEffects, and it's using hardware-accelerated vImage routines, combined with multiple passes of a box filter that approximates a Gaussian blur but is much faster. I wonder if they have a private, hidden filter they're using. – Duncan C Aug 19 '20 at 17:21