
This is how I blur a UIImage called artworkImage using UIImage+Effects to get the iOS 7 blur effect:

-(void)viewDidAppear:(BOOL)animated{

    [super viewDidAppear:animated];

    MPMediaItem *currentItem = [self.musicPlayer nowPlayingItem];

    // Do the expensive blur off the main thread...
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0ul), ^(void) {
        @autoreleasepool {

            MPMediaItemArtwork *artwork = [currentItem valueForProperty: MPMediaItemPropertyArtwork];
            UIImage *artworkImage = [artwork imageWithSize: CGSizeMake(618, 618)];

            artworkImage = [artworkImage applyDarkEffect];

            // ...then hop back to the main queue to update the UI.
            dispatch_async(dispatch_get_main_queue(), ^{

                [backgroundImageView setImage:artworkImage];
            });
        }
    });
}

My app is really slow so I researched how to find out why, and I came across Instruments, which showed me this:

450MB?!

So I researched some more about how to solve this and came across dispatch_async, so I moved the actual blurring onto a background queue and the UI update onto the main queue. It's still terribly slow.

This UIImage called artworkImage updates every time the music player skips to a new song. I apply the iOS 7 blur effect from Apple's sample code (the UIImage+Effects category) to this UIImage.

Please advise me on what to do - I've searched countless threads, which all say to use autorelease, which of course I can't use with ARC.

Any help would be much appreciated, thanks.

WunDaii
    Running something in a separate thread with `dispatch_async` will not reduce how much RAM it uses. – sudo May 22 '14 at 14:37
  • Does the memory usage stay at 450MB after it's done changing the image? Knowing that might help indicate what is causing it. And how is the CPU usage? I'm also assuming you've tried this with none of the dispatch_async stuff, but if you haven't, I would try that. – sudo May 22 '14 at 14:45
  • This sentence makes no sense: "It's still terribly slow (450MB)". "Slow" is a speed. "450MB" is a size. – matt May 22 '14 at 14:45
  • Using "DISPATCH_QUEUE_PRIORITY_DEFAULT" will certainly slow things down, because you are saying "Do this in the background whenever you feel like it." – matt May 22 '14 at 14:46
  • @matt I might be wrong, but I don't think that should make a difference in this case unless there's some other thread hogging a lot of CPU cycles. It also would not explain the massive RAM usage. – sudo May 22 '14 at 14:47
  • But (1) He is not asking about the RAM usage (he is complaining about "slow"), and (2) the RAM usage has nothing to do with the code he is displaying, so we have no info whatever. – matt May 22 '14 at 14:48
  • @9000 I tried it without the dispatch and the RAM usage is still really high - http://i.imgur.com/1igBqEP.png – WunDaii May 22 '14 at 14:53
  • @matt I apologize, I thought the high RAM usage is what is making my app slow – WunDaii May 22 '14 at 14:53
  • When I skip a few tracks, my app just crashes. – WunDaii May 22 '14 at 14:55
  • Here are two images without dispatch of Activity Monitor and Time Profiler - http://imgur.com/nB8dxYV,p9gfMbL - and here is the same **with** dispatch - http://imgur.com/JXAsV8U,7xmrH28 – WunDaii May 22 '14 at 15:09
  • @user3127576 The high RAM usage causes a slowdown if you exhaust the system memory, but matt was just saying that RAM usage is not a measure of speed per se. – sudo May 22 '14 at 21:34

2 Answers


What's most likely happening here is that you're building up a series of UIImages from the above processing, causing your application to eventually exhaust memory and crash.

Apple's UIImage+Effects category does blurring CPU-side using a series of box blurs. It's not the fastest process in the world. In addition to that, creating a UIImage can be slow, as is setting a UIImage to a UIImageView for display. Almost all of that is done on the CPU, rather than the GPU.

Therefore, if you trigger the above method every time you update a UI element, you will be dispatching a series of blocks first to the default background queue, then asynchronously to the main queue. The latter is most likely where your memory buildup is occurring, since every block there will retain the objects within it. If your processing and UIImageView updating takes longer than the interval at which the above method is triggered, you will build up blocks in your dispatch queues and each of those blocks will have UIImages within them. This will quickly cause memory buildup and a crash.

There are two ways to approach this. First, you could use dispatch semaphores to make sure that only one block is ever running on your background queues. I describe a process that I use for this here. This would prevent the memory accumulation, but might not help with your blurring speed.
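
As a minimal sketch of that semaphore pattern applied to the question's code (the ivar name `_blurSemaphore` and the wrapping method are my own, not the exact code from the linked answer):

// Declared as an ivar and created once, e.g. in viewDidLoad:
//     _blurSemaphore = dispatch_semaphore_create(1);

- (void)updateBlurredArtwork
{
    // If the previous blur hasn't finished, drop this update instead of
    // queueing another block (and another UIImage) behind it.
    if (dispatch_semaphore_wait(_blurSemaphore, DISPATCH_TIME_NOW) != 0)
    {
        return;
    }

    MPMediaItem *currentItem = [self.musicPlayer nowPlayingItem];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        MPMediaItemArtwork *artwork = [currentItem valueForProperty:MPMediaItemPropertyArtwork];
        UIImage *blurred = [[artwork imageWithSize:CGSizeMake(618, 618)] applyDarkEffect];

        dispatch_async(dispatch_get_main_queue(), ^{
            [backgroundImageView setImage:blurred];
            // Signal only after the UI update, so at most one
            // blurred image is ever in flight.
            dispatch_semaphore_signal(_blurSemaphore);
        });
    });
}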

For that, you could look at an alternative to the blurring category Apple provides. My open source GPUImage framework has a GPU-accelerated blur that replicates Apple's default effect. If you need the darker variant of that, I talk a bit about how to modify it for that effect here. Going to and from UIImages is still slow, but by piping the blur through to a GPUImageView instead of a UIImageView, you'll at least replace the latter half of the slow processing you have here. Filtering directly to a GPUImageView for display keeps all of that on the GPU.
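
A rough sketch of that pipeline (class names are from the GPUImage framework as I recall them; `blurView` is assumed to be a `GPUImageView` added to your view hierarchy in place of the UIImageView, and the radius is just a starting value to tune):

#import "GPUImage.h"

// Keep a strong reference to the source (e.g. in an ivar) until processing finishes.
GPUImagePicture *artworkSource = [[GPUImagePicture alloc] initWithImage:artworkImage];

GPUImageiOSBlurFilter *blurFilter = [[GPUImageiOSBlurFilter alloc] init];
blurFilter.blurRadiusInPixels = 12.0; // adjust to taste

[artworkSource addTarget:blurFilter];
[blurFilter addTarget:blurView]; // render straight to the view, staying on the GPU
[artworkSource processImage];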

Brad Larson
  • Thanks for the help. I tried to implement your GPUImage - http://i.imgur.com/Oh77xQo.png - but the image isn't blurred. Also, how would I use a dispatch semaphores with my method above? I tried numerous ways but I don't think I quite understand. – WunDaii May 22 '14 at 18:39
  • @user3127576 - You used a sepia tone filter in the above code, so of course it wasn't going to blur the image. You need to use something like the above-mentioned GPUImageiOSBlurFilter or a standard Gaussian. I provide some example code in my semaphore answer, which you can apply here. It should be generally applicable. If you don't understand what's going on there, take a gander at Mike Ash's great GCD post here: https://www.mikeash.com/pyblog/friday-qa-2009-09-25-gcd-practicum.html – Brad Larson May 22 '14 at 18:43
  • D'oh! Thanks, the blur works perfectly now, thanks :) I'm just struggling on how to make it darker? I found GPUImageLuminanceRangeFilter in GPUImageiOSBlurFilter.m but I'm not sure what to change – WunDaii May 22 '14 at 19:29
  • @user3127576 - Look in the kGPUImageLuminanceRangeFragmentShaderString string constant, which is where the fragment shader for that operation is defined. The math is in a C-like language, and you'll need to tweak that until you get the effect you want. It shouldn't be too hard to iterate until you get it to line up with the exact coloring you need, then copy that code over to a custom darkened version of this filter that you can use without hacking at the framework code itself. Several people have done this, but no one's yet sent it back as a pull request. – Brad Larson May 22 '14 at 20:31
  • I don't understand why the UIImages would use 450MB of RAM *after* they have been altered. There's probably a memory leak somehow. – sudo May 22 '14 at 21:29
  • @9000 - A new UIImage instance is created every time the above method is run. That UIImage is then retained by the block until it has a chance to run. If blocks take longer to process than the interval at which the method is called, these UIImages will build up in memory. An uncompressed UIImage is at least length*width*4 bytes in size, so you can hit 450 MB of these hanging around pretty quickly if the queue is jammed up with blocks (see the rough arithmetic just after this comment thread). – Brad Larson May 22 '14 at 21:32
  • @BradLarson I understand. So the problem is that the blurring takes too long, not that there's a memory leak. Though the user should still not be allowed to scroll more quickly than the app can process the images. – sudo May 22 '14 at 21:35
  • @9000 - That's where a dispatch semaphore can come in handy, making sure that this only updates when the previous processing is done. Frame rate will then vary based on the device's ability to process the image. – Brad Larson May 22 '14 at 21:37
  • @BradLarson @9000 - Thanks for clearing up what's going on. I'll read about dispatch semaphores and check out the darkening of GPUImage. Thanks a lot guys, I really appreciate it :) – WunDaii May 22 '14 at 21:42
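
To put rough numbers on Brad Larson's estimate in the comments above, using the 618×618 size from the question and 4 bytes per pixel:

618 × 618 × 4 bytes ≈ 1.5 MB per decoded image
450 MB ÷ 1.5 MB ≈ 300 images retained by backed-up blocks

So a few hundred queued blocks, each holding a UIImage, is enough to reach the usage shown in Instruments.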

I'm using StackBlur:

https://github.com/tomsoft1/StackBluriOS

Very easy to use, and well optimized.
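
Usage is roughly like this (the `stackBlur:` category method is from memory of that repo's README, so treat the exact name as an assumption):

#import "UIImage+StackBlur.h"

// Radius is in pixels; a larger radius gives a stronger (and slower) blur.
UIImage *blurredImage = [artworkImage stackBlur:20];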

user3206558
  • It's still extremely laggy and sometimes it won't even load. Also, Apple's UIImage+Effects darkens the image at the same time, which is what I want :) – WunDaii May 22 '14 at 17:56