How to produce an image with semanticSegmentationSkyMatteImage information on iOS?

I'm trying to create a sky mask for pictures taken with my iPhone. I've seen in the documentation that Core Image supports semantic segmentation for sky, alongside the person-related types (skin, hair, etc.).

So far, I haven't found the proper workflow to use it.

First, I watched https://developer.apple.com/videos/play/wwdc2019/225/

I understood that images must be captured with semantic segmentation enabled, using this kind of code:

photoSettings.enabledSemanticSegmentationMatteTypes = self.photoOutput.availableSemanticSegmentationMatteTypes
photoSettings.embedsSemanticSegmentationMattesInPhoto = true
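
For context, here is roughly how I configure the capture (a minimal sketch; `photoOutput` is an AVCapturePhotoOutput already attached to a running session, and `photoCaptureDelegate` is my own delegate object):

import AVFoundation

// Minimal sketch: enable matte delivery on the output first, then request it per photo.
photoOutput.enabledSemanticSegmentationMatteTypes = photoOutput.availableSemanticSegmentationMatteTypes

let photoSettings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
photoSettings.enabledSemanticSegmentationMatteTypes = photoOutput.availableSemanticSegmentationMatteTypes
photoSettings.embedsSemanticSegmentationMattesInPhoto = true

// Kick off the capture; the mattes arrive embedded in the HEIC delivered to the delegate.
photoOutput.capturePhoto(with: photoSettings, delegate: photoCaptureDelegate)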

I capture the image on my iPhone, save it in HEIC format, and later I try to load the matte like this:

let skyMatte = CIImage(contentsOf: imageURL, options: [.auxiliarySemanticSegmentationSkyMatte: true])
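
Concretely, the loading part looks like this (a sketch; `imageURL` points at the saved HEIC, and `.auxiliarySemanticSegmentationSkyMatte` requires iOS 14.1 or later):

import Foundation
import CoreImage

// Hypothetical URL of the HEIC saved earlier.
let imageURL = URL(fileURLWithPath: "photo.heic")

// Ask Core Image for the embedded sky matte instead of the primary image.
if let skyMatte = CIImage(contentsOf: imageURL,
                          options: [.auxiliarySemanticSegmentationSkyMatte: true]) {
    // The matte is a lower-resolution grayscale image; scale it to the main
    // image size before using it as a mask.
    print("Sky matte extent: \(skyMatte.extent)")
} else {
    print("No sky matte found in this file")
}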

Unfortunately, self.photoOutput.availableSemanticSegmentationMatteTypes always gives me a list of person-related types only, never a sky type.

In fact, AVSemanticSegmentationMatte.MatteType only offers [.hair, .skin, .teeth, .glasses]... no sky!
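
For what it's worth, this is how I check what's actually offered (a sketch; run after the session is configured):

// Log every matte type this device/output combination reports.
// On my devices this only ever prints the person-related types.
for matteType in photoOutput.availableSemanticSegmentationMatteTypes {
    print("Available matte type: \(matteType.rawValue)")
}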

So, how am I supposed to use semanticSegmentationSkyMatteImage? Is there any simple workaround?

Replies

I got an answer on LinkedIn, and it seems it's not possible without private libraries.