ARKit Camera Feed to Black & White

I am new to ARKit (and iOS development)... I've built a very simple AR experience, importing a 3D model into Xcode via Reality Composer and Reality Converter. All is working fine apart from one thing...

I would like to apply a Black & White Filter to the camera feed in an AR session.

I found this link, but it looks like the code is deprecated...

https://github.com/BlackMirrorz/ARKitCameraFeedFilter

I also found this link in the developer docs, which looks like it would do the trick, but I have no idea how to implement it. (Also, it looks like it's in Metal, which is over my head at the moment!)

https://developer.apple.com/documentation/scenekit/scntechnique

Any help would be greatly appreciated!

Hello @zissou,

There are a number of ways to accomplish this. Could you clarify, are you trying to filter only the camera feed (while still leaving virtual objects in color), or are you trying to filter both?

It doesn't matter, as my 3D model has a greyscale texture. Whatever is easier to implement (please note: this is all very new to me, so a walkthrough would be amazing!)

Hello @zissou_,

That definitely simplifies things if you can apply the filter to everything. In that case, and since it appears that you are using RealityKit, I recommend that you make use of the post-processing render callback and Core Image to implement this. Here is a short example:

import UIKit
import RealityKit
import CoreImage.CIFilterBuiltins

class ViewController: UIViewController {
    
    @IBOutlet var arView: ARView!
        
    // Initialize once and re-use to avoid expensive operations in the post-processing callback.
    let ciContext = CIContext()
    
    override func viewDidLoad() {
        super.viewDidLoad()
        
        // Load the "Box" scene from the "Experience" Reality File
        let boxAnchor = try! Experience.loadBox()
        
        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)
        
        arView.renderCallbacks.postProcess = { [unowned self] postProcessingContext in
            
            // A filter that applies a mono style to an image.
            let monoFilter = CIFilter.photoEffectMono()
            
            // Make a CIImage from the rendered frame buffer.
            let source = CIImage(mtlTexture: postProcessingContext.sourceColorTexture)!
                .oriented(.downMirrored) // This orientation is essential to make sure that CoreImage interprets the texture contents correctly.
            
            // Set the source image as the input to the mono filter.
            monoFilter.inputImage = source
            
            // Request the filtered output image.
            let filteredSource = monoFilter.outputImage!
            
            // Render the filtered output image to the target color texture (this is the texture that ultimately gets displayed).
            do {
                let renderTask = try ciContext.startTask(toRender: filteredSource, to: .init(mtlTexture: postProcessingContext.targetColorTexture, commandBuffer: nil))
                
                // You must waitUntilCompleted here. RealityKit is expecting all post-processing work to be finished by the end of this closure.
                try renderTask.waitUntilCompleted()
            } catch {
                fatalError(error.localizedDescription)
            }
            
        }
    }
}
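
As a side note, if blocking on waitUntilCompleted every frame ever becomes a concern, an alternative (a rough, untested sketch below) is to encode the Core Image render onto the frame's own command buffer via CIRenderDestination(mtlTexture:commandBuffer:) and postProcessingContext.commandBuffer, so RealityKit schedules the work with the rest of the frame. The orientation handling may need adjusting in that variant (for example via the destination's isFlipped property) depending on how the source image is oriented.

// Sketch of a non-blocking variant, using the same monoFilter / filteredSource as above.
// Assumes the Core Image work is encoded onto the context's command buffer instead of waiting on a task.
let destination = CIRenderDestination(
    mtlTexture: postProcessingContext.targetColorTexture,
    commandBuffer: postProcessingContext.commandBuffer
)
destination.isFlipped = false // May need tweaking together with (or instead of) the .oriented(.downMirrored) step.
_ = try? ciContext.startTask(toRender: filteredSource, to: destination)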

Yay!!! You saved the day ...and made my show :) https://www.instagram.com/stories/soundvisionlibrary/2984800947469031034/
