
All I want to do is take the basic ARKit view and turn it into a black and white view. Right now the view renders normally and I have no idea how to add the filter. Ideally, when taking a screenshot, the black and white filter would be applied to the screenshot as well.

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.showsStatistics = true
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    @IBAction func changeTextColour(){
        let snapShot = sceneView.snapshot()
        UIImageWriteToSavedPhotosAlbum(snapShot, self, #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)
    }
}
Lësha Turkowski

4 Answers


If you want to apply the filter in real time, the best way to achieve that is to use SCNTechnique. Techniques are used for post-processing and allow us to render SCNView content in several passes – exactly what we need (first render the scene, then apply an effect to it).

Here's the example project.


Plist setup

First, we need to describe a technique in a .plist file.

Here's a screenshot of a plist that I've come up with (for better visualization):

[Screenshot: plist describing an SCNTechnique]

And here's its source:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>sequence</key>
    <array>
        <string>apply_filter</string>
    </array>
    <key>passes</key>
    <dict>
        <key>apply_filter</key>
        <dict>
            <key>metalVertexShader</key>
            <string>scene_filter_vertex</string>
            <key>metalFragmentShader</key>
            <string>scene_filter_fragment</string>
            <key>draw</key>
            <string>DRAW_QUAD</string>
            <key>inputs</key>
            <dict>
                <key>scene</key>
                <string>COLOR</string>
            </dict>
            <key>outputs</key>
            <dict>
                <key>color</key>
                <string>COLOR</string>
            </dict>
        </dict>
    </dict>
</dict>
</plist>

The topic of SCNTechniques is quite broad and I will only quickly cover the things we need for the case at hand. To get a real understanding of what they are capable of, I recommend reading Apple's comprehensive documentation on techniques.

Technique description

passes is a dictionary containing description of passes that you want an SCNTechnique to perform.

sequence is an array that specifies an order in which these passes are going to be performed using their keys.

You do not specify the main render pass here (meaning whatever is rendered without applying SCNTechniques) – it is implied, and its resulting color can be accessed using the COLOR constant (more on it in a bit).

So the only "extra" pass (besides the main one) that we are going to perform is apply_filter, which converts the colors into black and white (it can be named whatever you want, just make sure it has the same key in passes and sequence).

Now to the description of the apply_filter pass itself.

Render pass description

metalVertexShader and metalFragmentShader – names of Metal shader functions that are going to be used for drawing.

draw defines what the pass is going to render. DRAW_QUAD stands for:

Render only a rectangle covering the entire bounds of the view. Use this option for drawing passes that process image buffers output by earlier passes.

which means, roughly speaking, that we are going to be rendering a plain "image" with our render pass.

inputs specifies input resources that we will be able to use in shaders. As I previously said, COLOR refers to the color data provided by the main render pass.

outputs specifies outputs. It can be color, depth or stencil, but we only need a color output. The COLOR value means that, simply put, we are going to be rendering "directly" to the screen (as opposed to rendering into intermediate targets, for example).
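As a side note, if you'd rather not maintain a separate .plist, the same description can be built as an in-code dictionary and handed to SCNTechnique(dictionary:). Here's a minimal sketch mirroring the plist above (the variable names are just illustrative):

let applyFilterPass: [String: Any] = [
    "metalVertexShader": "scene_filter_vertex",
    "metalFragmentShader": "scene_filter_fragment",
    "draw": "DRAW_QUAD",
    "inputs": ["scene": "COLOR"],
    "outputs": ["color": "COLOR"]
]

let techniqueDescription: [String: Any] = [
    "sequence": ["apply_filter"],
    "passes": ["apply_filter": applyFilterPass]
]

// Same keys as the plist, just built in code.
let inlineTechnique = SCNTechnique(dictionary: techniqueDescription)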


Metal shader

Create a .metal file with following contents:

#include <metal_stdlib>
using namespace metal;
#include <SceneKit/scn_metal>

struct VertexInput {
    float4 position [[ attribute(SCNVertexSemanticPosition) ]];
    float2 texcoord [[ attribute(SCNVertexSemanticTexcoord0) ]];
};

struct VertexOut {
    float4 position [[position]];
    float2 texcoord;
};

// metalVertexShader
vertex VertexOut scene_filter_vertex(VertexInput in [[stage_in]])
{
    VertexOut out;
    out.position = in.position;
    out.texcoord = float2((in.position.x + 1.0) * 0.5 , (in.position.y + 1.0) * -0.5);
    return out;
}

// metalFragmentShader
fragment half4 scene_filter_fragment(VertexOut vert [[stage_in]],
                                    texture2d<half, access::sample> scene [[texture(0)]])
{
    constexpr sampler samp = sampler(coord::normalized, address::repeat, filter::nearest);
    constexpr half3 weights = half3(0.2126, 0.7152, 0.0722);

    half4 color = scene.sample(samp, vert.texcoord);
    color.rgb = half3(dot(color.rgb, weights));

    return color;
}

Notice that the vertex and fragment function names must be the same names that are specified in the pass descriptor in the plist file.

To get a better understanding of what VertexInput and VertexOut structures mean, refer to the SCNProgram documentation.

The given vertex function can be used pretty much in any DRAW_QUAD render pass. It basically gives us normalized coordinates of the screen space (that are accessed with vert.texcoord in the fragment shader).

The fragment function is where all the "magic" happens. There, you can manipulate the texture that you've got from the main pass. Using this setup you can potentially implement a ton of filters/effects and more.

In our case, I used a basic desaturation (zero saturation) formula to get the black and white colors.
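For reference, those weights are the Rec. 709 luma coefficients, so every output pixel ends up as luminance = 0.2126·R + 0.7152·G + 0.0722·B, written back into all three channels.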


Swift setup

Now, we can finally use all of this in the ARKit/SceneKit.

let plistName = "SceneFilterTechnique" // the name of the plist you've created

guard let url = Bundle.main.url(forResource: plistName, withExtension: "plist") else {
    fatalError("\(plistName).plist does not exist in the main bundle")
}

guard let dictionary = NSDictionary(contentsOf: url) as? [String: Any] else {
    fatalError("Failed to parse \(plistName).plist as a dictionary")
}

guard let technique = SCNTechnique(dictionary: dictionary) else {
    fatalError("Failed to initialize a technique using \(plistName).plist")
}

and just set it as technique of the ARSCNView.

sceneView.technique = technique

That's it. Now the whole scene is going to be rendered in grayscale including when taking snapshots.
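Put together with the view controller from the question, the wiring could look roughly like this (a sketch that assumes the plist is named SceneFilterTechnique and the .metal file above is part of the app target):

override func viewDidLoad() {
    super.viewDidLoad()
    sceneView.delegate = self
    sceneView.showsStatistics = true

    // Load the technique description from the bundled plist and attach it to the view.
    if let url = Bundle.main.url(forResource: "SceneFilterTechnique", withExtension: "plist"),
       let dictionary = NSDictionary(contentsOf: url) as? [String: Any],
       let technique = SCNTechnique(dictionary: dictionary) {
        sceneView.technique = technique
    }
}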


Lësha Turkowski

Filter ARSCNView Snapshot: If you want to create a black and white screenshot of your ARSCNView, you can do something like the following, which returns a grayscale UIImage (here augmentedRealityView refers to an ARSCNView):

/// Converts A UIImage To A High Contrast GrayScaleImage
///
/// - Returns: UIImage
func highContrastBlackAndWhiteFilter() -> UIImage?
{
    //1. Convert It To A CIImage
    guard let convertedImage = CIImage(image: self) else { return nil }

    //2. Set The Filter Parameters
    let filterParameters = [kCIInputBrightnessKey: 0.0,
                            kCIInputContrastKey:   1.1,
                            kCIInputSaturationKey: 0.0]

    //3. Apply The Basic Filter To The Image
    let imageToFilter = convertedImage.applyingFilter("CIColorControls", parameters: filterParameters)

    //4. Set The Exposure
    let exposure =  [kCIInputEVKey: NSNumber(value: 0.7)]

    //5. Process The Image With The Exposure Setting
    let processedImage = imageToFilter.applyingFilter("CIExposureAdjust", parameters: exposure)

    //6. Create A CG GrayScale Image
    guard let grayScaleImage = CIContext().createCGImage(processedImage, from: processedImage.extent) else { return nil }

    return UIImage(cgImage: grayScaleImage, scale: self.scale, orientation: self.imageOrientation)
}

An example of using this therefore could be like so:

 override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {

    //1. Create A UIImageView Dynamically
    let imageViewResult = UIImageView(frame: CGRect(x: 0, y: 0, width: self.view.bounds.width, height: self.view.bounds.height))
    self.view.addSubview(imageViewResult)

    //2. Create The Snapshot & Get The Black & White Image
    guard let snapShotImage = self.augmentedRealityView.snapshot().highContrastBlackAndWhiteFilter() else { return }
    imageViewResult.image = snapShotImage

    //3. Remove The ImageView After A Delay Of 5 Seconds
    DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
        imageViewResult.removeFromSuperview()
    }

}

Which will yield a result something like this:


In order to make your code reusable you could also create an extension of UIImage:

//------------------------
//MARK: UIImage Extensions
//------------------------

extension UIImage
{

    /// Converts A UIImage To A High Contrast GrayScaleImage
    ///
    /// - Returns: UIImage
    func highContrastBlackAndWhiteFilter() -> UIImage?
    {
        //1. Convert It To A CIImage
        guard let convertedImage = CIImage(image: self) else { return nil }

        //2. Set The Filter Parameters
        let filterParameters = [kCIInputBrightnessKey: 0.0,
                                kCIInputContrastKey:   1.1,
                                kCIInputSaturationKey: 0.0]

        //3. Apply The Basic Filter To The Image
        let imageToFilter = convertedImage.applyingFilter("CIColorControls", parameters: filterParameters)

        //4. Set The Exposure
        let exposure =  [kCIInputEVKey: NSNumber(value: 0.7)]

        //5. Process The Image With The Exposure Setting
        let processedImage = imageToFilter.applyingFilter("CIExposureAdjust", parameters: exposure)

        //6. Create A CG GrayScale Image
        guard let grayScaleImage = CIContext().createCGImage(processedImage, from: processedImage.extent) else { return nil }

        return UIImage(cgImage: grayScaleImage, scale: self.scale, orientation: self.imageOrientation)
    }

}

Which you can then use easily like so:

guard let snapShotImage = self.augmentedRealityView.snapshot().highContrastBlackAndWhiteFilter() else { return }

Remember that you should place your extension outside of (e.g. above) your class declaration:

extension UIImage{

}

class ViewController: UIViewController, ARSCNViewDelegate {

}

So based on the code provided in your question you would have something like this:

/// Creates A Black & White ScreenShot & Saves It To The Photo Album
@IBAction func changeTextColour(){

    //1. Create A Snapshot
    guard let snapShotImage = self.augmentedRealityView.snapshot().highContrastBlackAndWhiteFilter() else { return }

    //2. Save It To The Photos Album
    UIImageWriteToSavedPhotosAlbum(snapShotImage, self, #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)

}

/// Callback To Check Whether The Image Has Been Saved
@objc func image(_ image: UIImage, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {

    if let error = error {
        print("Error Saving ARKit Scene \(error)")
    } else {
        print("ARKit Scene Successfully Saved")
    }
}

Live Rendering In Black & White: Using this brilliant answer here by diviaki, I was also able to get the entire camera feed to render in black and white using the following steps:

1st. Register as the ARSession's delegate like so:

 augmentedRealitySession.delegate = self

2nd. Then in the following delegate callback add the following:

 //-----------------------
 //MARK: ARSessionDelegate
 //-----------------------

 extension ViewController: ARSessionDelegate{

    func session(_ session: ARSession, didUpdate frame: ARFrame) {

        /*
        Full Credit To https://stackoverflow.com/questions/45919745/reliable-access-and-modify-captured-camera-frames-under-scenekit
        */

        //1. Get The Captured Frame's Pixel Buffer (YCbCr - Plane 1 Holds The Chroma Data)
        let pixelBuffer = frame.capturedImage

        //2. Lock The Buffer Before Touching Its Memory & Unlock It When We Are Done
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

        guard let chromaPlaneAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1) else { return }

        //3. Overwrite The Chroma Plane With 128 (Neutral), Leaving Only The Luma i.e. Black & White
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1)
        let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1)
        memset(chromaPlaneAddress, 128, height * bytesPerRow)
    }

 }

Which successfully renders the camera feed Black & White:


Filtering Elements Of An SCNScene In Black & White:

As @Confused rightly said, if you decide that you want the camera feed to be in colour but the contents of your AR experience to be in black & white, you can apply a filter directly to an SCNNode using its filters property, which is simply:

An array of Core Image filters to be applied to the rendered contents of the node.

Let's say, for example, that we dynamically create 3 SCNNodes with a sphere geometry; we can apply a Core Image filter to these directly like so:

/// Creates 3 Objects And Adds Them To The Scene (Rendering Them In GrayScale)
func createObjects(){

    //1. Create An Array Of UIColors To Set As The Geometry Colours
    let colours = [UIColor.red, UIColor.green, UIColor.yellow]

    //2. Create An Array Of The X Positions Of The Nodes
    let xPositions: [CGFloat] = [-0.3, 0, 0.3]

    //3. Create The Nodes & Add Them To The Scene
    for i in 0 ..< 3{

        let sphereNode = SCNNode()
        let sphereGeometry = SCNSphere(radius: 0.1)
        sphereGeometry.firstMaterial?.diffuse.contents = colours[i]
        sphereNode.geometry = sphereGeometry
        sphereNode.position = SCNVector3( xPositions[i], 0, -1.5)
        augmentedRealityView.scene.rootNode.addChildNode(sphereNode)

        //a. Create A Black & White Filter & Apply It Directly To The Node
        guard let blackAndWhiteFilter = CIFilter(name: "CIColorControls", withInputParameters: [kCIInputSaturationKey: 0.0]) else { return }
        blackAndWhiteFilter.name = "bw"
        sphereNode.filters = [blackAndWhiteFilter]
    }

}

Which will yield a result something like the following:


For a full list of these filters you can refer to the following: CoreImage Filter Reference

Example Project: Here is a complete Example Project which you can download and explore for yourself.

Hope it helps...

BlackMirrorz
  • I am having trouble saving the image. From you example project the image is not saving even though all of the code looks right. –  May 15 '18 at 20:19
  • I have now updated the code, and the example project. Please also mark my answer as correct :) – BlackMirrorz May 16 '18 at 00:26
  • Great coding. If i wanted to keep the original colors of the 3 objects when taking a photo. How would I do that? black and white background with 3 colored balls over it. –  May 16 '18 at 04:33
  • The live rendering technique doesn't work for me because `session(_:didUpdate:)` is only called every few seconds – zakdances Aug 03 '19 at 11:06
  • @BlackMirrorz Hi, I want to integrate a color filter in ARKit using CIFilter. I have already posted my question on the site, please have a look: https://stackoverflow.com/questions/58501761/live-camera-is-getting-stretched-while-rendering-using-cifilter-swift-4 If you have any better idea of how to integrate it please share with me, thank you. – Anand Nanavaty Oct 23 '19 at 05:47
  • For the Live Rendering In Black & White example, for me only some frames are black and white, not all frames, resulting in a "flickering" effect. – JCutting8 Jun 08 '20 at 05:34
  • @BlackMirrorz Hi, I am still trying to obtain 2D image coordinates for 3D face tracking vertices. I obtained some 2D points using project point, however, but they do not seem to match with the MATLAB pixel coordinates. Could you please take a look and give me a hand if you have time and I could not find much info on this https://stackoverflow.com/questions/67305259/projectpoint-for-getting-2d-image-coordinates-in-arkit – swiftlearneer Apr 30 '21 at 05:28

The snapshot object is a UIImage. Apply filters to this UIImage by importing the Core Image framework and then applying Core Image filters to it. You should adjust the exposure and colour control values on the image. For more implementation details check this answer. From iOS 6 you can also use the CIColorMonochrome filter to achieve the same effect.
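For example, a minimal sketch of the CIColorMonochrome route (my own illustration, not code from the linked answer) could look like this:

import CoreImage
import UIKit

extension UIImage {

    /// Returns a grayscale copy of the image using CIColorMonochrome.
    func monochrome() -> UIImage? {
        guard let input = CIImage(image: self) else { return nil }

        // A white monochrome colour at full intensity gives a plain grayscale result.
        let output = input.applyingFilter("CIColorMonochrome",
                                          parameters: [kCIInputColorKey: CIColor(color: .white),
                                                       kCIInputIntensityKey: 1.0])

        guard let cgImage = CIContext().createCGImage(output, from: output.extent) else { return nil }
        return UIImage(cgImage: cgImage, scale: scale, orientation: imageOrientation)
    }
}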

Here is the Apple documentation for all the available filters. Click on each filter to see the visual effect it has on an image when applied.

Here is the Swift 4 code.

func imageBlackAndWhite() -> UIImage?
{
    if let beginImage = CoreImage.CIImage(image: self)
    {
        let paramsColor: [String : Double] = [kCIInputBrightnessKey: 0.0,
                                              kCIInputContrastKey:   1.1,
                                              kCIInputSaturationKey: 0.0]
        let blackAndWhite = beginImage.applyingFilter("CIColorControls", parameters: paramsColor)

        let paramsExposure: [String : AnyObject] = [kCIInputEVKey: NSNumber(value: 0.7)]
        let output = blackAndWhite.applyingFilter("CIExposureAdjust", parameters: paramsExposure)

        guard let processedCGImage = CIContext().createCGImage(output, from: output.extent) else {
            return nil
        }

        return UIImage(cgImage: processedCGImage, scale: self.scale, orientation: self.imageOrientation)
    }
    return nil
}
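Note that because the function refers to self it needs to live in a UIImage extension (as in the answer above). Assuming that, usage could be as simple as:

// Hypothetical usage: imageBlackAndWhite() is declared in a UIImage extension
// and sceneView is your ARSCNView.
let blackAndWhiteSnapshot = sceneView.snapshot().imageBlackAndWhite()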
Arun Balakrishnan

This might be the easiest and fastest way to do this:

Apply a CoreImage Filter to the Scene:

https://developer.apple.com/documentation/scenekit/scnnode/1407949-filters

This filter gives a very good impression of a black and white photograph, with good transitions through grays: https://developer.apple.com/library/content/documentation/GraphicsImaging/Reference/CoreImageFilterReference/index.html#//apple_ref/doc/filter/ci/CIPhotoEffectMono

You could also use this one, which also makes it easy to shift the result's hue:

https://developer.apple.com/library/content/documentation/GraphicsImaging/Reference/CoreImageFilterReference/index.html#//apple_ref/doc/filter/ci/CIColorMonochrome
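A minimal sketch of applying one of these filters through a node's filters property (my own assumption of the wiring, not code from the linked pages); note that node filters affect the rendered SceneKit content, not the camera background of an ARSCNView:

// Apply CIPhotoEffectMono to everything under the scene's root node.
// Assumes sceneView is your ARSCNView.
if let mono = CIFilter(name: "CIPhotoEffectMono") {
    sceneView.scene.rootNode.filters = [mono]
}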

And here, in Japanese, is proof of Core Image filters and SceneKit/ARKit working together: http://appleengine.hatenablog.com/entry/advent20171215

Confused