Removing image backgrounds in iOS 16 and macOS 13

Is this accessible from Swift directly?

Visual Look Up: Lift subject from background

Lift the subject from an image or isolate the subject by removing the background. This works in Photos, Screenshot, Quick Look, Safari, and more.

Source: macOS Ventura Preview - New Features - Apple

I see that Shortcuts now has a native Remove Background action that wasn't there in iOS 15 or macOS 12. Is there any way to call that from Swift besides x-callback-url schemes?
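
For context, the x-callback route looks roughly like this. This is a sketch only: the shortcut name, the clipboard hand-off, and the myapp:// return scheme are all assumptions about how your own shortcut is set up.

    import UIKit

    // Sketch: run a user-created shortcut (one that uses the new Remove
    // Background action) via the Shortcuts run-shortcut URL scheme.
    // "Remove Background" is a placeholder shortcut name.
    func runRemoveBackgroundShortcut(with image: UIImage) {
        // Hand the image to the shortcut through the clipboard.
        UIPasteboard.general.image = image
        var components = URLComponents(string: "shortcuts://x-callback-url/run-shortcut")!
        components.queryItems = [
            URLQueryItem(name: "name", value: "Remove Background"),
            URLQueryItem(name: "input", value: "clipboard"),
            // Hypothetical custom scheme for getting control back afterwards.
            URLQueryItem(name: "x-success", value: "myapp://shortcut-done")
        ]
        if let url = components.url {
            UIApplication.shared.open(url)
        }
    }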

I'm keen to work this out too, and I don't think I'd want to use the x-callback route. I've looked into VNGenerateObjectnessBasedSaliencyImageRequest, but it only returns a 64x64 mask, and even then it doesn't produce the same results.
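
For reference, the saliency approach looks roughly like this (a sketch; cgImage stands in for your input image):

    import Vision

    let request = VNGenerateObjectnessBasedSaliencyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    do {
        try handler.perform([request])
        if let observation = request.results?.first as? VNSaliencyImageObservation {
            // The mask comes back as a low-resolution pixel buffer that you
            // have to upscale yourself, so the edges don't come close to the
            // system's "lift subject" quality.
            let mask: CVPixelBuffer = observation.pixelBuffer
            // Normalized bounding boxes of the detected salient objects.
            let salientObjects = observation.salientObjects ?? []
        }
    } catch {
        print("Saliency request failed: \(error)")
    }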

The only way I can think of to get the result without leaving your app is to present your image in a WKWebView, which allows using "select subject" in the context menu. Not perfect, but a possible workaround while we wait for a more direct solution. I'm going to look into https://developer.apple.com/documentation/webkit/viewing_desktop_or_mobile_web_content_using_a_web_view to present my image. Is there a better way? Not sure; I can't find anything out there yet.
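
Roughly what I have in mind, inside a view controller (a sketch; "photo.jpg" is a placeholder for your own bundled image):

    import UIKit
    import WebKit

    // Sketch: display a bundled image in a WKWebView so the system's
    // long-press context menu becomes available on it.
    let webView = WKWebView(frame: view.bounds)
    view.addSubview(webView)

    if let imageURL = Bundle.main.url(forResource: "photo", withExtension: "jpg") {
        webView.loadFileURL(imageURL, allowingReadAccessTo: imageURL.deletingLastPathComponent())
    }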

I am really hoping we will see something in this space at WWDC. The silence from Apple on using this feature programmatically is deafening. Apple, please let us know!

Seriously, this is all I want from WWDC this year. Alternatively, maybe the developers of Pixelmator Pro can be persuaded to open-source or license their implementation of this feature...

In iOS 17 you get great results with VNGenerateForegroundInstanceMaskRequest - convert your image to a CVPixelBuffer, then feed it into something like this:

    import Vision

    // pixelBuffer is your input image as a CVPixelBuffer.
    let maskRequest = VNGenerateForegroundInstanceMaskRequest()
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    do {
        try handler.perform([maskRequest])
        if let observation = maskRequest.results?.first {
            // All detected foreground instances, as an IndexSet.
            let allInstances = observation.allInstances
            do {
                // Returns a CVPixelBuffer of the input with the background removed.
                let maskedBuffer = try observation.generateMaskedImage(ofInstances: allInstances, from: handler, croppedToInstancesExtent: false)
                let maskedImage = imageFromCVPixelBuffer(maskedBuffer)
                sceneView.scene?.background.contents = maskedImage
            } catch {
                print("Error: \(error.localizedDescription)")
            }
        }
    } catch {
        print("Failed to perform Vision request: \(error)")
    }

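For completeness, imageFromCVPixelBuffer above is just a small helper; one way to write it is via Core Image (a sketch, adapt as needed):

    import CoreImage
    import UIKit

    // Converts the masked CVPixelBuffer back into a UIImage via Core Image.
    func imageFromCVPixelBuffer(_ pixelBuffer: CVPixelBuffer) -> UIImage? {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let context = CIContext()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
            return nil
        }
        return UIImage(cgImage: cgImage)
    }
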
Thanks - I'm busy converting a project from Core Data to SwiftData, then I'm going to tackle this next. Needless to say, I was thrilled to see they've added it, and I'm glad to hear you've got it working.
