
How can I use GLKView in SwiftUI? I'm using CIFilter but would like to apply filters through GLKit / OpenGL. Any ideas?

struct ContentView: View {
    @State private var image: Image?

    var body: some View {
        VStack {
            image?
                .resizable()
                .scaledToFit()
        }
        .onAppear(perform: loadImage)
    }

    func loadImage() {
        guard let inputImage = UIImage(named: "squirrel"),
              let ciImage = CIImage(image: inputImage) else {
            return
        }

        let context = CIContext()
        let blur = CIFilter.gaussianBlur()
        blur.inputImage = ciImage
        blur.radius = 20

        guard let outputImage = blur.outputImage else {
            return
        }
        if let cgImg = context.createCGImage(outputImage, from: ciImage.extent) {
            let uiImg = UIImage(cgImage: cgImg)
            image = Image(uiImage: uiImg)
        }
    }
}
Aнгел
  • I've done it by using a `UIViewRepresentable`. Basically, my `UIKit` apps all use a subclassed `GLKView` (for reasons like creating a scaled `UIImage` when needed, etc.) and it actually was pretty trivial to turn that into a `UIViewRepresentable`. If you want, I'll dig out the code - I decided that SwiftUI was way too new for a full production app back in Beta 5 days - and try to post it after this weekend. –  Jan 19 '20 at 07:35
  • One additional comment - keep in mind that *all* things OpenGL/GLKit were deprecated as of iOS 12. I'm decently hopeful that they'll still work come iOS 14, but.... Apple wants you to use Metal, in this case MetalKit and `MTKView`, instead. (My production apps do not - yet.) –  Jan 19 '20 at 07:38

2 Answers


Here's a working GLKView in SwiftUI using UIViewControllerRepresentable.

A few things to keep in mind.

  • GLKit was deprecated with the release of iOS 12, nearly 2 years ago. While I hope Apple won't kill it anytime soon (way too many apps still use it), they recommend using Metal or an MTKView instead. Most of the technique here is still the way to go for SwiftUI.

  • I worked with SwiftUI in hopes of making my next CoreImage app a "pure" SwiftUI app, until I ran into too many UIKit needs. I stopped working on this around Beta 6. The code works but is clearly not production-ready. The repo for this is here.

  • I'm more comfortable working with models instead of putting code for things like using a CIFilter directly in my views. I'll assume you know how to create a view model and make it an EnvironmentObject. If not, look at my code in the repo.

  • Your code references a SwiftUI Image view - I never found any documentation that suggests it uses the GPU (as a GLKView does) so you won't find anything like that in my code. If you are looking for real-time performance when changing attributes, I found this to work very well.

Starting with a GLKView, here's my code:

class ImageView: GLKView {

    var renderContext: CIContext
    var myClearColor: UIColor!
    var rgb: (Int?, Int?, Int?)!

    public var image: CIImage! {
        didSet {
            setNeedsDisplay()
        }
    }
    public var clearColor: UIColor! {
        didSet {
            myClearColor = clearColor
        }
    }

    public init() {
        let eaglContext = EAGLContext(api: .openGLES2)
        renderContext = CIContext(eaglContext: eaglContext!)
        super.init(frame: CGRect.zero)
        context = eaglContext!
    }

    override public init(frame: CGRect, context: EAGLContext) {
        renderContext = CIContext(eaglContext: context)
        super.init(frame: frame, context: context)
        enableSetNeedsDisplay = true
    }

    public required init?(coder aDecoder: NSCoder) {
        let eaglContext = EAGLContext(api: .openGLES2)
        renderContext = CIContext(eaglContext: eaglContext!)
        super.init(coder: aDecoder)
        context = eaglContext!
    }

    override public func draw(_ rect: CGRect) {
        if let image = image {
            let imageSize = image.extent.size
            var drawFrame = CGRect(x: 0, y: 0, width: CGFloat(drawableWidth), height: CGFloat(drawableHeight))
            let imageAR = imageSize.width / imageSize.height
            let viewAR = drawFrame.width / drawFrame.height
            // letterbox the image so it renders as scaleAspectFit
            if imageAR > viewAR {
                drawFrame.origin.y += (drawFrame.height - drawFrame.width / imageAR) / 2.0
                drawFrame.size.height = drawFrame.width / imageAR
            } else {
                drawFrame.origin.x += (drawFrame.width - drawFrame.height * imageAR) / 2.0
                drawFrame.size.width = drawFrame.height * imageAR
            }
            rgb = myClearColor.rgb()
            glClearColor(Float(rgb.0!)/255.0, Float(rgb.1!)/255.0, Float(rgb.2!)/255.0, 0.0)
            glClear(0x00004000)       // GL_COLOR_BUFFER_BIT
            // set the blend mode to "source over" so that CI will use that
            glEnable(0x0BE2)          // GL_BLEND
            glBlendFunc(1, 0x0303)    // GL_ONE, GL_ONE_MINUS_SRC_ALPHA
            renderContext.draw(image, in: drawFrame, from: image.extent)
        }
    }

}

This is very old production code, taken from objc.io issue 21, dated February 2015! Of note: it encapsulates a CIContext, needs its own clear color defined before using its draw method, and renders an image as scaleAspectFit. If you try using this in UIKit, it'll likely work perfectly.
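One piece the class above assumes but doesn't show is the `UIColor` extension behind `myClearColor.rgb()`. A minimal sketch of what that helper might look like - the name and the `(Int?, Int?, Int?)` tuple shape come from the code above, but the implementation is my assumption:

```swift
import UIKit

extension UIColor {
    /// Assumed helper: returns the color's RGB components on a 0-255 scale,
    /// or nils if the color can't be converted to RGB.
    func rgb() -> (Int?, Int?, Int?) {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        guard getRed(&r, green: &g, blue: &b, alpha: &a) else {
            return (nil, nil, nil)
        }
        return (Int(r * 255), Int(g * 255), Int(b * 255))
    }
}
```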

Next, a "wrapper" UIViewController:

class ImageViewVC: UIViewController {
    var model: Model!
    var imageView = ImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        view = imageView
        NotificationCenter.default.addObserver(self, selector: #selector(updateImage), name: .updateImage, object: nil)
    }
    override func viewDidLayoutSubviews() {
        imageView.setNeedsDisplay()
    }
    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        if traitCollection.userInterfaceStyle == .light {
            imageView.clearColor = UIColor.white
        } else {
            imageView.clearColor = UIColor.black
        }
    }
    @objc func updateImage() {
        imageView.image = model.ciFinal
        imageView.setNeedsDisplay()
    }
}

I did this for a few reasons - pretty much adding up to the fact that I'm not a Combine expert.

First, note that the view model (model) cannot access the EnvironmentObject directly. That's a SwiftUI object and UIKit doesn't know about it. I think an ObservableObject may work, but I never found the right way to do it.

Second, note the use of NotificationCenter. I spent a week last year trying to get Combine to "just work" - particularly in the opposite direction of having a UIButton tap notify my model of a change - and found that this is really the easiest way. It's even easier than using delegate methods.
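For `.updateImage` to compile, the custom notification name has to be declared somewhere. A one-line extension like this is all it takes (the raw-value string is my assumption):

```swift
import Foundation

extension Notification.Name {
    /// Custom notification telling the view controller to redraw;
    /// the raw-value string here is an assumption.
    static let updateImage = Notification.Name("updateImage")
}
```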

Next, exposing the VC as a representable:

struct GLKViewerVC: UIViewControllerRepresentable {
    @EnvironmentObject var model: Model
    let glkViewVC = ImageViewVC()

    func makeUIViewController(context: Context) -> ImageViewVC {
        return glkViewVC
    }
    func updateUIViewController(_ uiViewController: ImageViewVC, context: Context) {
        glkViewVC.model = model
    }
}

The only thing of note is that here's where I set the model variable in the VC. I'm sure it's possible to get rid of the VC entirely and have a UIViewRepresentable, but I'm more comfortable with this setup.
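For comparison, a `UIViewRepresentable` that skips the view controller might look something like this - an untested sketch, and it loses the trait-collection and NotificationCenter plumbing the VC provides:

```swift
import SwiftUI

// Sketch only: wraps the ImageView (GLKView subclass) directly,
// without the intermediate UIViewController.
struct GLKImageView: UIViewRepresentable {
    @EnvironmentObject var model: Model

    func makeUIView(context: Context) -> ImageView {
        let view = ImageView()
        view.clearColor = .white
        return view
    }
    func updateUIView(_ uiView: ImageView, context: Context) {
        // Setting image triggers setNeedsDisplay() via its didSet
        uiView.image = model.ciFinal
    }
}
```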

Next, my model:

class Model : ObservableObject {
    var objectWillChange = PassthroughSubject<Void, Never>()

    var uiOriginal:UIImage?
    var ciInput:CIImage?
    var ciFinal:CIImage?

    init() {
        uiOriginal = UIImage(named: "vermont.jpg")
        uiOriginal = uiOriginal!.resizeToBoundingSquare(640)
        ciInput = CIImage(image: uiOriginal!)?.rotateImage()
        let filter = CIFilter(name: "CIPhotoEffectNoir")
        filter?.setValue(ciInput, forKey: "inputImage")
        ciFinal = filter?.outputImage
    }
}

Nothing to see here at all, but understand that in SceneDelegate, where you instantiate this, it will trigger the init and set up the filtered image.
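For completeness, the relevant part of SceneDelegate is roughly the standard SwiftUI boilerplate of that era - creating the model here is what runs its init and builds the filtered image:

```swift
func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
           options connectionOptions: UIScene.ConnectionOptions) {
    // Instantiating the model runs its init, which builds the filtered CIImage
    let model = Model()
    let contentView = ContentView().environmentObject(model)

    if let windowScene = scene as? UIWindowScene {
        let window = UIWindow(windowScene: windowScene)
        window.rootViewController = UIHostingController(rootView: contentView)
        self.window = window
        window.makeKeyAndVisible()
    }
}
```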

Finally, ContentView:

struct ContentView: View {
    @EnvironmentObject var model: Model
    var body: some View {
        VStack {
            GLKViewerVC()
            Button(action: {
                self.showImage()
            }) {
                VStack {
                    Image(systemName:"tv").font(Font.body.weight(.bold))
                    Text("Show image").font(Font.body.weight(.bold))
                }
                .frame(width: 80, height: 80)
            }
        }
    }
    func showImage() {
        NotificationCenter.default.post(name: .updateImage, object: nil, userInfo: nil)
    }
}

SceneDelegate instantiates the view model, which now has the altered CIImage, and the button beneath the GLKView (an instance of GLKViewerVC, which is just a SwiftUI View) will send a notification to update the image.

  • thanks @dfd, I'm checking the code now :) Prior to SwiftUI, I had a similar implementation to yours (subclassed GLKView), but with the new paradigm in SwiftUI it's a bit confusing how to do things now - like using GLKit or Metal :/ It's so complicated – Aнгел Jan 20 '20 at 23:01
  • Two key things I learned. (1) With representables, everything is a View - until it isn't. The connection between bindables, the UIKit view and view controller lifecycle, etc. forced me to back off the deeper I dug into things. (2) I ended up *needing* at least three representables - UIImagePickerController, UIActivityViewController, and GLKView. None of these exist in SwiftUI. While I agree with you about the paradigm shift, these really are unfinished pieces that Apple may be years away from porting to SwiftUI. –  Jan 21 '20 at 02:54

Apple's WWDC 2022 contained a tutorial/video entitled "Display EDR Content with Core Image, Metal, and SwiftUI" which describes how to blend Core Image with Metal and SwiftUI. It points to some new sample code entitled "Generating an Animation with a Core Image Render Destination" (here).

While it doesn't address your question about using GLKView, it does provide some elegant, clean, Apple-sanctioned code for using Metal within SwiftUI.

This sample project is very CoreImage-centric (which matches your background with CIFilter), but I wish Apple would post more sample-code examples showing Metal integrated with SwiftUI.
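Reduced to its SwiftUI-facing skeleton, the pattern in Apple's sample looks roughly like this - the names here are illustrative, not Apple's, and the Core Image rendering itself is elided:

```swift
import SwiftUI
import MetalKit

// Sketch: an MTKView wrapped for SwiftUI, with a coordinator as its delegate.
struct MetalView: UIViewRepresentable {
    func makeCoordinator() -> Renderer { Renderer() }

    func makeUIView(context: Context) -> MTKView {
        let view = MTKView()
        view.device = MTLCreateSystemDefaultDevice()
        view.framebufferOnly = false    // Core Image needs to write into the drawable
        view.delegate = context.coordinator
        return view
    }
    func updateUIView(_ uiView: MTKView, context: Context) {}
}

class Renderer: NSObject, MTKViewDelegate {
    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}
    func draw(in view: MTKView) {
        // Render a CIImage into view.currentDrawable via CIContext.render(...)
    }
}
```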

KeithB