
Since I started using the following function to blur images, I have been getting frequent crash reports from Core Image:

// Code exactly as in app
extension UserImage {

    func blurImage(_ radius: CGFloat) -> UIImage? {

        guard let ciImage = CIImage(image: self) else {
            return nil
        }

        let clampedImage = ciImage.clampedToExtent()

        let blurFilter = CIFilter(name: "CIGaussianBlur", parameters: [
            kCIInputImageKey: clampedImage,
            kCIInputRadiusKey: radius])

        var filterImage = blurFilter?.outputImage

        filterImage = filterImage?.cropped(to: ciImage.extent)

        guard let finalImage = filterImage else {
            return nil
        }

        return UIImage(ciImage: finalImage)
    }
}

// Code stripped down, contains more in app
class MyImage {

    var blurredImage: UIImage?

    func setBlurredImage() {
        DispatchQueue.global(qos: .userInitiated).async {

            let blurredImage = self.getImage().blurImage(100)

            DispatchQueue.main.async {

                guard let blurredImage = blurredImage else { return }

                self.blurredImage = blurredImage
            }
        }
    }
}

According to Crashlytics:

  • the crash happens only for a small percentage of sessions
  • the crash happens on various iOS versions from 11.x to 12.x
  • 0% of the devices were in background state when the crash happened

I was not able to reproduce the crash. The process is:

  1. The MyImageView object (a subclass of UIImageView) receives a Notification
  2. Sometimes (depending on other logic) a blurred version of a UIImage is created on a background queue via DispatchQueue.global(qos: .userInitiated).async
  3. On the main thread the object sets the UIImage with self.image = ...

According to the crash log (UIImageView setImage), the app seems to crash after step 3. On the other hand, the Core Image frames in the crash log indicate that the problem originates in step 2, where CIFilter is used to create the blurred version of the image. Note: MyImageView is sometimes used in a UICollectionViewCell.

Crash log:

EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000000

Crashed: com.apple.main-thread
0  CoreImage                      0x1c18128c0 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 2388
1  CoreImage                      0x1c18128c0 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 2388
2  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
3  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
4  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
5  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
6  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
7  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
8  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
9  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
10 CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
11 CoreImage                      0x1c1812f04 CI::Context::render(CI::ProgramNode*, CGRect const&) + 116
12 CoreImage                      0x1c182ca3c invocation function for block in CI::image_render_to_surface(CI::Context*, CI::Image*, CGRect, CGColorSpace*, __IOSurface*, CGPoint, CI::PixelFormat, CI::RenderDestination const*) + 40
13 CoreImage                      0x1c18300bc CI::recursive_tile(CI::RenderTask*, CI::Context*, CI::RenderDestination const*, char const*, CI::Node*, CGRect const&, CI::PixelFormat, CI::swizzle_info const&, CI::TileTask* (CI::ProgramNode*, CGRect) block_pointer) + 608
14 CoreImage                      0x1c182b740 CI::tile_node_graph(CI::Context*, CI::RenderDestination const*, char const*, CI::Node*, CGRect const&, CI::PixelFormat, CI::swizzle_info const&, CI::TileTask* (CI::ProgramNode*, CGRect) block_pointer) + 396
15 CoreImage                      0x1c182c308 CI::image_render_to_surface(CI::Context*, CI::Image*, CGRect, CGColorSpace*, __IOSurface*, CGPoint, CI::PixelFormat, CI::RenderDestination const*) + 1340
16 CoreImage                      0x1c18781c0 -[CIContext(CIRenderDestination) _startTaskToRender:toDestination:forPrepareRender:error:] + 2488
17 CoreImage                      0x1c18777ec -[CIContext(CIRenderDestination) startTaskToRender:fromRect:toDestination:atPoint:error:] + 140
18 CoreImage                      0x1c17c9e4c -[CIContext render:toIOSurface:bounds:colorSpace:] + 268
19 UIKitCore                      0x1e8f41244 -[UIImageView _updateLayerContentsForCIImageBackedImage:] + 880
20 UIKitCore                      0x1e8f38968 -[UIImageView _setImageViewContents:] + 872
21 UIKitCore                      0x1e8f39fd8 -[UIImageView _updateState] + 664
22 UIKitCore                      0x1e8f79650 +[UIView(Animation) performWithoutAnimation:] + 104
23 UIKitCore                      0x1e8f3ff28 -[UIImageView _updateImageViewForOldImage:newImage:] + 504
24 UIKitCore                      0x1e8f3b0ac -[UIImageView setImage:] + 340
25 App                         0x100482434 MyImageView.updateImageView() (<compiler-generated>)
26 App                         0x10048343c closure #1 in MyImageView.handleNotification(_:) + 281 (MyImageView.swift:281)
27 App                         0x1004f1870 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
28 libdispatch.dylib              0x1bbbf4a38 _dispatch_call_block_and_release + 24
29 libdispatch.dylib              0x1bbbf57d4 _dispatch_client_callout + 16
30 libdispatch.dylib              0x1bbbd59e4 _dispatch_main_queue_callback_4CF$VARIANT$armv81 + 1008
31 CoreFoundation                 0x1bc146c1c __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12
32 CoreFoundation                 0x1bc141b54 __CFRunLoopRun + 1924
33 CoreFoundation                 0x1bc1410b0 CFRunLoopRunSpecific + 436
34 GraphicsServices               0x1be34179c GSEventRunModal + 104
35 UIKitCore                      0x1e8aef978 UIApplicationMain + 212
36 App                         0x1002a3544 main + 18 (AppDelegate.swift:18)
37 libdyld.dylib                  0x1bbc068e0 start + 4

What could be the reason for the crash?


Update

This may be related to CIImage memory leak. When profiling, I see a lot of leaked CIImage objects with the same stack trace as in the crash log:

[Instruments screenshot: leaked CIImage objects with the stack trace above]

This may also be related to Core Image and memory leak, swift 3.0. I just found that the images were stored in an in-memory array and that onReceiveMemoryWarning was not handled properly: it did not clear that array, so the app would crash under memory pressure in certain cases. Maybe that fixes the issue; I'll give an update here.
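
For reference, this is roughly the kind of handling that was missing (a minimal sketch; the cache class and its names are made up, only the notification is real API):

import UIKit

class ImageCache {

    // Hypothetical in-memory store of (blurred) images.
    private var images: [UIImage] = []

    private var observer: NSObjectProtocol?

    init() {
        // Drop the cached images on memory pressure so the app is not
        // terminated for exceeding its memory limit.
        observer = NotificationCenter.default.addObserver(
            forName: UIApplication.didReceiveMemoryWarningNotification,
            object: nil,
            queue: .main) { [weak self] _ in
                self?.images.removeAll()
        }
    }

    deinit {
        if let observer = observer {
            NotificationCenter.default.removeObserver(observer)
        }
    }
}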


Update 2

It seems I was able to reproduce the crash, testing on a physical device (iPhone XS Max) with a 5MB JPEG image.

  • When displaying the image unblurred full screen the memory usage of the app is 160MB total.
  • When displaying the image blurred in 1/4 of the screen size, the memory usage is 380MB.
  • When displaying the image blurred full screen the memory usage jumps to >1.6GB and the app then crashes most of the time with:

Message from debugger: Terminated due to memory issue

I am surprised that a 5MB image can cause a memory usage of >1.6GB for a "simple" blur. Do I have to manually deallocate anything here (CIContext, CIImage, etc.), or is that normal, and do I have to manually resize the image to ~kB size before blurring?

Update 3

Adding multiple image views that display the blurred image causes the memory usage to go up by some hundred MB each time an image view is added, until the view is removed, even though only one image is visible at a time. Maybe CIFilter is not intended to be used for displaying an image, because it occupies more memory than the rendered image itself would.

So I changed the blur function to render the image into an image context, and sure enough, the memory only increases briefly while rendering the image and falls back to pre-blurring levels afterwards.

Here is the updated method:

func blurImage(_ radius: CGFloat) -> UIImage? {

    guard let ciImage = CIImage(image: self) else {
        return nil
    }

    let clampedImage = ciImage.clampedToExtent()

    let blurFilter = CIFilter(name: "CIGaussianBlur", withInputParameters: [
        kCIInputImageKey: clampedImage,
        kCIInputRadiusKey: radius])

    var filteredImage = blurFilter?.outputImage

    filteredImage = filteredImage?.cropped(to: ciImage.extent)

    guard let blurredCiImage = filteredImage else {
        return nil
    }

    let rect = CGRect(origin: CGPoint.zero, size: size)

    UIGraphicsBeginImageContext(rect.size)
    UIImage(ciImage: blurredCiImage).draw(in: rect)
    let blurredImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return blurredImage
}
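
As a side note, the same rendering step could also be written with UIGraphicsImageRenderer (iOS 10+), which manages the context automatically; an untested sketch:

let renderer = UIGraphicsImageRenderer(size: rect.size)
let blurredImage = renderer.image { _ in
    UIImage(ciImage: blurredCiImage).draw(in: rect)
}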

In addition, thanks to @matt and @FrankSchlegel, who suggested in the comments that the high memory consumption can be mitigated by downsampling the image before blurring, which I will also do. It is surprising that even an image of 300x300px causes a spike in memory usage of ~500MB, considering that 2GB is the limit at which the app will be terminated. I will post an update once the app is live with these updates.

Update 4

I added this code to downsample the image to a max of 300x300px before blurring it:

func resizeImageWithAspectFit(_ boundSize: CGSize) -> UIImage {

    let ratio = self.size.width / self.size.height
    let maxRatio = boundSize.width / boundSize.height

    var scaleFactor: CGFloat

    if ratio > maxRatio {
        scaleFactor = boundSize.width / self.size.width

    } else {
        scaleFactor = boundSize.height / self.size.height
    }

    let newWidth = self.size.width * scaleFactor
    let newHeight = self.size.height * scaleFactor

    let rect = CGRect(x: 0.0, y: 0.0, width: newWidth, height: newHeight)

    UIGraphicsBeginImageContext(rect.size)
    self.draw(in: rect)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return newImage!
}
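
As an aside, downsampling can also be done with ImageIO so that the full-size bitmap never has to be decoded at all. This is only a sketch under the assumption that the original image is available as a file URL (the function name is made up; the options are real ImageIO API):

import ImageIO
import UIKit

func downsampledImage(at url: URL, maxPixelSize: CGFloat) -> UIImage? {

    // Do not decode the full-size image just to create the source.
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary

    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else {
        return nil
    }

    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary

    // Decodes directly at the target size instead of decoding full size and scaling down.
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else {
        return nil
    }

    return UIImage(cgImage: cgImage)
}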

The crashes look different now, but I am unsure whether the crash happens during downsampling or while drawing the blurred image as described in Update #3, since both use a UIGraphicsImageContext:

EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000010
Crashed: com.apple.root.user-initiated-qos
0  libobjc.A.dylib                0x1ce457530 objc_msgSend + 16
1  CoreImage                      0x1d48773dc -[CIContext initWithOptions:] + 96
2  CoreImage                      0x1d4877358 +[CIContext contextWithOptions:] + 52
3  UIKitCore                      0x1fb7ea794 -[UIImage drawInRect:blendMode:alpha:] + 984
4  MyApp                          0x1005bb478 UIImage.blurImage(_:) (<compiler-generated>)
5  MyApp                          0x100449f58 closure #1 in MyImage.getBlurredImage() + 153 (UIImage+Extension.swift:153)
6  MyApp                          0x1005cda48 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
7  libdispatch.dylib              0x1ceca4a38 _dispatch_call_block_and_release + 24
8  libdispatch.dylib              0x1ceca57d4 _dispatch_client_callout + 16
9  libdispatch.dylib              0x1cec88afc _dispatch_root_queue_drain + 636
10 libdispatch.dylib              0x1cec89248 _dispatch_worker_thread2 + 116
11 libsystem_pthread.dylib        0x1cee851b4 _pthread_wqthread + 464
12 libsystem_pthread.dylib        0x1cee87cd4 start_wqthread + 4

Here are the threads used to resize and blur the image (blurImage() is the method described in Update #3):

class MyImage {

    var originalImage: UIImage?
    var blurredImage: UIImage?

    // Called on the main thread
    func getBlurredImage() {

        DispatchQueue.global(qos: .userInitiated).async {

            guard let originalImage = self.originalImage else { return }

            // Create resized image
            let smallImage = originalImage.resizeImageWithAspectFit(CGSize(width: 1000, height: 1000))

            // Create blurred image
            let blurredImage = smallImage.blurImage(100)

            DispatchQueue.main.async {

                self.blurredImage = blurredImage

                // Notify observers to display `blurredImage` in a UIImageView on the main thread
                NotificationCenter.default.post(name: BlurredImageIsReady, object: nil, userInfo: nil)
            }
        }
    }
}
Manuel
  • Hmm… hard to say just from looking at the crash log. Could you maybe post some code showing how the `CIImage` is created and how the view is updated? – Frank Rupprecht Jul 31 '19 at 18:36
  • Among the details that can help... how are you using `CoreImage`? Why are you involving `UIImageView`? Probably most important: *how* are you using `UIImageView`? –  Aug 02 '19 at 01:42
  • @dfd `CIImage` is used to blur an image with `CIFilter`. I have added the code to the question. – Manuel Aug 06 '19 at 21:46
  • @FrankSchlegel I added the code to the question. – Manuel Aug 06 '19 at 21:47
  • This probably *will not* help your issue, but I find that, at the very least for performance, it helps to stick with CoreImage until you absolutely need to go to anything else. And for that, I tend to use CoreGraphics to convert a CIImage to UIImage. Like I just said, it probably won't help this issue, but hopefully it explains why I asked about UIImageView. (The alternatives are the deprecated GLKView, and MTKView.) –  Aug 06 '19 at 23:04
  • @dfd Are you saying that there is an alternative to using `UIImage(ciImage:)`? Please share. – Manuel Aug 06 '19 at 23:51
  • The `blurImage` method doesn't actually do _anything_ to the image. The actual rendering happens as soon as the `UIImage` that you return is _used_, like when you assign it to a `UIImageView` (like you see in your stack trace). So you don't really need to do `blurImage` in another queue. – Frank Rupprecht Aug 07 '19 at 05:45
  • `UIImageView`, which needs a `UIImage`, is part of `UIKit` and uses the CPU. `GLKView`, which is part of `GLKit` (both deprecated in iOS 12 but still work) use the GPU. `MTKView`, part of `MetalKit`, also uses the GPU. Both of these GPU-based views are *much* better performing - mainly if you want real-time rendering - and can use `CIImage` directly. My usage? Only when I am using a `UIActivityViewController` do I turn my `CIImage` into a `UIImage` using a `CIContext` and `createCGImage`, then using `UIImage(cgImage:)`.... –  Aug 07 '19 at 09:51
  • But it doesn't sound (to me) like *any* of that will solve your issue. Before you edited your question with code, I wondered why you were combining a `CIImage` with a `UIImageView`. –  Aug 07 '19 at 10:00
  • @FrankSchlegel Are you sure the method does not do anything until the image is rendered? After all the filter is applied and the new image is stored in the `blurredImage` variable, no? It's not a lazy variable. – Manuel Aug 07 '19 at 12:13
  • Asking a `CIFilter` for its `outputImage` is not applying the filter. Think of a `CIImage` as a _recipe_ for creating an image, one that is first evaluated when the image content is actually needed (in this case, when the image view needs actual pixels to display). So `CIImage`s are inherently lazy. – Frank Rupprecht Aug 07 '19 at 12:17
  • Important insight. That's why I was confused about which step the crash was happening in. So the crash happens at rendering, which I guess can be a memory-intensive task. That makes me think even more that the crash is really an out-of-memory issue. – Manuel Aug 07 '19 at 12:42
  • @FrankSchlegel Please see my update #2 in the question. Do you have an idea regarding the mem consumption? – Manuel Aug 09 '19 at 17:32
  • You have not shown enough code. Provide code to _reproduce the issue_. What's `getImage`, for example? – matt Aug 09 '19 at 17:34
  • @matt happy to see you join in. `getImage` actually just returns the image, or downloads it asynchronously if necessary. I corrected the code; `MyImage` is actually a plain class, not a subclass of `UIImage`. – Manuel Aug 09 '19 at 17:38
  • Well, the question remains: can you condense the entire issue, especially the large memory usage, into a simple _reproducible_ example? I mean, I can blur a 5MB JPEG image without using 1.6 GB of memory. So until you show me, I can't imagine how you do that. And as has been pointed out, merely saying `UIImage(ciImage: finalImage)` doesn't do _anything_; it effectively uses _zero_ memory, because all you've done is create the instructions for making an image, you haven't actually made the image yet. So the question is the creation and disposal of the image. – matt Aug 09 '19 at 17:45
  • I will do that and post an update. – Manuel Aug 09 '19 at 17:48
  • Sorry to rattle on, but it would also be important to know the _dimensions_ of the image. Saying "5 MB" doesn't tell us that, because a JPEG is compressed. A big image (big in terms of dimensions) is going to occupy a lot of memory when you display it even if you _don't_ pass it thru a filter! – matt Aug 09 '19 at 17:49
  • @matt Please rattle on :) You can find the sample image I tested with here as "5MB jpeg" https://sample-videos.com/download-sample-jpg-image.php. But as I said in my update #2, displaying the image without blurring does not cause the memory increase. – Manuel Aug 10 '19 at 00:42
  • I use a blur radius of `100`, which may be worth noting; added to the code. – Manuel Aug 10 '19 at 01:05
  • Blurring a 34 megapixel image with a radius of 100 is _super duper expensive_, regardless of the clever optimizations CI is doing under the hood. Even with linear separation, ~400 pixels need to be read in order to process _a single pixel_ in the output; that's 13,648,133,216 pixel reads (!) total (see the arithmetic spelled out right after these comments). Aside from that, the uncompressed image _alone_ needs ~131 MB of memory, let alone all the temporary resources also required. I think you're hitting hard hardware limitations right there. You need to downscale the image first and blur it with a smaller radius to stay within hardware limits. – Frank Rupprecht Aug 10 '19 at 08:10
  • That makes sense. In a test app that only blurs and displays the same image, the memory usage "only" jumps to ~300MB, versus ~70MB to display the unblurred image. Using a 30MB image, the memory usage jumps to ~1GB and then down to ~600MB, in comparison. And with every image view added, the mem usage increases by the same amount. So maybe there is a mem leak in the released app; in any case, downsampling the image before blurring seems reasonable. – Manuel Aug 10 '19 at 10:37
  • @FrankSchlegel Adding multiple image views on top of each other displaying the unblurred image does not notably increase the mem usage. Adding multiple image views displaying the blurred image causes the mem usage to go up some hundred MB each time an image view is added, until the view is removed, even though only one is visible. It would be more efficient for me to render the blurred image once in a `UIGraphicsImageContext` and simply store it. Maybe `CIFilter` is not intended to be used while displaying an image because it occupies more memory than the rendered image itself. – Manuel Aug 10 '19 at 13:19
  • And sure enough, that's the way to go; added to my question as Update #3. – Manuel Aug 10 '19 at 13:30
  • So basically this ends up as a combination of two very well-known and often-repeated rules: (1) never use an image larger than needed for display (forcing the UIImageView to display a huge image at reduced size uses all the memory of the huge image, unnecessarily), and (2) render your CIImage cleanly by drawing the derived UIImage into an image context, rather than making the image view perform the rendering (which in my experience never works reliably anyway). – matt Aug 10 '19 at 20:44
  • While the first rule is obvious, the second rule was new to me. – Manuel Aug 10 '19 at 20:46
  • @matt I added downsampling and get different crashes now, could you please take a look at update #4 in the question? – Manuel Aug 16 '19 at 18:14
  • Looks like a threading problem from here. – matt Aug 16 '19 at 18:17
  • I wonder, does `UIGraphicsImageContext` have to be accessed only from the main thread, as it is a `UI` class? – Manuel Aug 16 '19 at 19:59
  • I answer this myself, the docs say about `UIGraphicsGetImageFromCurrentImageContext`: "This function may be called from any thread of your app." – Manuel Aug 17 '19 at 01:13
  • That’s irrelevant. What’s crashing is the rendering. Core Image filters are not thread safe. See for example https://stackoverflow.com/questions/14109671/ios-core-image-and-multi-threaded-apps – matt Aug 17 '19 at 07:07
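
For reference, the arithmetic from the comment above, spelled out (the pixel count is inferred from the quoted totals; 4 bytes per RGBA pixel assumed):

// Cost of a Gaussian blur with radius 100 on a ~34 MP image, assuming a
// separable (two-pass) implementation reading ~2 * radius pixels per pass.
let pixels = 13_648_133_216.0 / 400.0            // ≈ 34.1 million pixels
let readsPerPixel = 2.0 * (2.0 * 100.0)          // two 1D passes ≈ 400 reads/pixel
let totalReads = pixels * readsPerPixel          // ≈ 13.6 billion pixel reads
let uncompressedMB = pixels * 4.0 / 1_048_576.0  // ≈ 130 MB for the decoded bitmap alone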

2 Answers


I did some benchmarking and found it feasible to blur and display a very large image when rendering directly into an MTKView, even when the processing happens at the original input size. Here is the whole testing code:

import CoreImage
import MetalKit
import UIKit

class ViewController: UIViewController {

    var device: MTLDevice!
    var commandQueue: MTLCommandQueue!
    var context: CIContext!
    let filter = CIFilter(name: "CIGaussianBlur")!
    let testImage = UIImage(named: "test10")! // 10 MB, 40 MP image
    @IBOutlet weak var metalView: MTKView!

    override func viewDidLoad() {
        super.viewDidLoad()

        self.device = MTLCreateSystemDefaultDevice()
        self.commandQueue = self.device.makeCommandQueue()

        self.context = CIContext(mtlDevice: self.device)

        self.metalView.delegate = self
        self.metalView.device = self.device
        self.metalView.isPaused = true
        self.metalView.enableSetNeedsDisplay = true
        self.metalView.framebufferOnly = false
    }

}

extension ViewController: MTKViewDelegate {

    func draw(in view: MTKView) {
        guard let currentDrawable = view.currentDrawable,
              let commandBuffer = self.commandQueue.makeCommandBuffer() else { return }

        let input = CIImage(image: self.testImage)!

        self.filter.setValue(input.clampedToExtent(), forKey: kCIInputImageKey)
        self.filter.setValue(100.0, forKey: kCIInputRadiusKey)
        let output = self.filter.outputImage!.cropped(to: input.extent)

        let drawableSize = view.drawableSize

        // Scale image to aspect-fit view.
        // NOTE: This is a benchmark scenario. Usually you would scale the image to a reasonable processing size
        //       (i.e. close to your output size) _before_ applying expensive filters.
        let scaleX = drawableSize.width / output.extent.width
        let scaleY = drawableSize.height / output.extent.height
        let scale = min(scaleX, scaleY)
        let scaledOutput = output.transformed(by: CGAffineTransform(scaleX: scale, y: scale))

        let destination = CIRenderDestination(mtlTexture: currentDrawable.texture, commandBuffer: commandBuffer)
        // BONUS: You can Quick Look the `task` in Xcode to see what Core Image is actually going to do on the GPU.
        let task = try! self.context.startTask(toRender: scaledOutput, to: destination)

        commandBuffer.present(currentDrawable)
        commandBuffer.commit()

        // BONUS: No need to wait, but you can Quick Look the `info` to see what was actually done during rendering
        //        and to get performance metrics, like the actual number of pixels processed.
        DispatchQueue.global(qos: .background).async {
            let info = try! task.waitUntilCompleted()
        }
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

}

For the 10 MB test image (40 megapixels!) the memory spiked very briefly to 800 MB during rendering, which is to be expected. I even tried the 30 MB (~74 megapixels!!) image, and it went through without problems, using at most 1.3 GB of memory.

When I scaled the image down to the destination size before applying the filter, the memory stayed at ~60 MB the whole time. So this is really what you should be doing in any case. But note that you need to change the radius of the Gaussian blur accordingly to achieve the same result.

If you need the rendering result for more than just displaying, I guess you could use the createCGImage API of CIContext instead of rendering into the MTKView's drawable, and get the same memory usage.
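
A quick sketch of that combination (scale first, reduce the blur radius proportionally, then render eagerly with createCGImage); the helper and its parameters are assumptions, not part of the benchmark above:

func blurredDownsampled(_ input: CIImage, targetWidth: CGFloat, radius: Double, context: CIContext) -> CGImage? {

    // Downscale first so the expensive blur runs on far fewer pixels.
    let scale = targetWidth / input.extent.width
    let small = input.transformed(by: CGAffineTransform(scaleX: scale, y: scale))

    // Scale the radius with the image so the result looks the same.
    let filter = CIFilter(name: "CIGaussianBlur", parameters: [
        kCIInputImageKey: small.clampedToExtent(),
        kCIInputRadiusKey: radius * Double(scale)])

    guard let output = filter?.outputImage?.cropped(to: small.extent) else {
        return nil
    }

    // Renders eagerly; the returned CGImage is a plain bitmap.
    return context.createCGImage(output, from: output.extent)
}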

I hope this is applicable to your scenario.

Frank Rupprecht
  • Interesting example; I experienced about the same memory spikes rendering the blurred image in my update #3 in the question, but even when scaling to a 300x300px image the spike was a few hundred MB. Are you sure there was no spike for the scaled image in your example, or was the spike so short that the profiler did not record it? – Manuel Aug 11 '19 at 13:09
  • Interestingly, there was a very short ~200 MB spike when rendering the 10 MB image, but _not_ when rendering the 30 MB image... – Frank Rupprecht Aug 11 '19 at 13:27
  • My guess is that, due to its sampling latency, the profiler sometimes does not record spikes that last only a fraction of a second. – Manuel Aug 11 '19 at 13:37
  • It should, though. My guess is that the difference occurs due to the implementation of the gaussian filter. Different scaling and other optimization techniques are used depending on the input size and radius parameter. Those likely have different runtime and memory requirements. – Frank Rupprecht Aug 11 '19 at 13:41

This appears to be a simple threading issue. CIFilter is not thread-safe. You cannot build a chain of filters on one thread and then render the resulting CIImage on another thread. You should confine yourself to small images, do everything on the main thread, and render explicitly using the GPU. That's what Core Image is all about.
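
For illustration, here is a minimal sketch of confining the whole pipeline, building the filter chain and rendering, to a single thread with an explicit CIContext, so that a CIImage-backed UIImage never reaches UIImageView. The class and its serial queue are assumptions, not code from the question; the same idea works on the main thread:

import CoreImage
import UIKit

final class Blurrer {

    // One serial queue so the filter chain is built and rendered on the same thread.
    private let queue = DispatchQueue(label: "blur.queue")

    // Reuse the context; creating one per render is expensive.
    private let context = CIContext()

    func blur(_ image: UIImage, radius: CGFloat, completion: @escaping (UIImage?) -> Void) {
        queue.async {
            guard let input = CIImage(image: image) else {
                DispatchQueue.main.async { completion(nil) }
                return
            }

            let filter = CIFilter(name: "CIGaussianBlur", parameters: [
                kCIInputImageKey: input.clampedToExtent(),
                kCIInputRadiusKey: radius])

            guard let output = filter?.outputImage?.cropped(to: input.extent),
                  let cgImage = self.context.createCGImage(output, from: output.extent) else {
                DispatchQueue.main.async { completion(nil) }
                return
            }

            // Fully rendered here; the image view receives plain pixels.
            DispatchQueue.main.async { completion(UIImage(cgImage: cgImage)) }
        }
    }
}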

matt
  • I added the thread info to update #4 in the question. In my understanding the image is resized and blurred on `DispatchQueue.global(qos: DispatchQoS.QoSClass.userInitiated).async`. I think the `CIFilter` is only relevant until the blurred image is drawn in the `UIGraphicsImageContext`, because after that it's stored in a var as a `UIImage`? – Manuel Aug 17 '19 at 10:28
  • http://sealiesoftware.com/blog/archive/2008/09/22/objc_explain_So_you_crashed_in_objc_msgSend.html – matt Aug 17 '19 at 12:13
  • Also, using Core Image in the background is a terrible idea, because you can't render on the GPU, which is the whole point. See https://stackoverflow.com/questions/25431317/coreimage-very-high-memory-usage of which I'm more and more inclined to think this is a duplicate. A never-ending duplicate. – matt Aug 17 '19 at 16:52
  • Then my question is simple: how can I create a blurred image? I cannot do it on the main thread; it would block the UI. So I do it on a background thread, maybe using the CPU instead of the GPU, and it doesn't matter to me if it takes longer. The image is drawn in a context on a background thread and then stored in a var. What is the alternative? – Manuel Aug 17 '19 at 17:09