Video: CIFilters applied in an AVAsynchronousCIImageFilteringRequest block come out muddled

I have a .mov file that loops every 30 seconds: H.264, 1920x1080, 25 fps. It has a black background that I'd like to make transparent; in fact I'd like every pixel to have some transparency. So I created an AVPlayer to play it, but nothing I did made the video transparent. Then I tripped over a StackOverflow post that said I needed to add this setting:


playerLayer.pixelBufferAttributes = [(kCVPixelBufferPixelFormatTypeKey as String): kCVPixelFormatType_32BGRA]


Now, I could use a CIColorCube to make every black pixel transparent. Progress!
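For reference, a color cube that maps near-black pixels to transparent can be built along these lines (a sketch only; the cube size and threshold here are arbitrary choices of mine):

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Build a 64x64x64 color cube that zeroes out near-black pixels.
// `threshold` (arbitrary) controls how dark a pixel must be before
// it becomes fully transparent.
func makeBlackToTransparentCube(size: Int = 64, threshold: Float = 0.1) -> CIFilter & CIColorCube {
    var cube = [Float]()
    cube.reserveCapacity(size * size * size * 4)
    // Core Image expects the cube ordered blue-major, red-minor.
    for b in 0..<size {
        for g in 0..<size {
            for r in 0..<size {
                let rf = Float(r) / Float(size - 1)
                let gf = Float(g) / Float(size - 1)
                let bf = Float(b) / Float(size - 1)
                // Treat pixels whose brightest channel is below the
                // threshold as background.
                let alpha: Float = max(rf, gf, bf) < threshold ? 0 : 1
                // Cube entries are premultiplied RGBA.
                cube.append(contentsOf: [rf * alpha, gf * alpha, bf * alpha, alpha])
            }
        }
    }
    let filter = CIFilter.colorCube()
    filter.cubeDimension = Float(size)
    filter.cubeData = cube.withUnsafeBufferPointer { Data(buffer: $0) }
    return filter
}
```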


But nothing I do lets me give all the pixels partial transparency. I made the background view black, and when I apply an alpha mask generated from the source image I do get an image back, but its brightness is greatly reduced.


lazy var alphaFilter: CIMaskToAlpha  = CIFilter.maskToAlpha()
lazy var blendFilter: CIBlendWithMask = CIFilter.blendWithAlphaMask()

playerItem.videoComposition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in

  let inputImage = request.sourceImage
  self.blendFilter.inputImage = inputImage
  self.alphaFilter.inputImage = inputImage
  self.blendFilter.maskImage = self.alphaFilter.outputImage!
  let output = self.blendFilter.outputImage!

  request.finish(with: output, context: nil)
})


I've searched high and low and can't find anything on this. One StackOverflow post said that the sourceImage is in sRGB format, so I tried converting to linear before processing and back to sRGB afterwards, and also the reverse; everything I do just makes the output video look worse.
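If color management is the issue, one knob worth noting: finish(with:context:) accepts a CIContext instead of nil, and that context's working space can be set explicitly. A sketch (whether this helps here is unverified; the space choices are assumptions):

```swift
import CoreImage

// A CIContext with an explicit linear working space, to pass to
// request.finish(with: output, context: linearContext). Whether this
// helps depends on how the composition pipeline tags the source image.
let linearContext = CIContext(options: [
    .workingColorSpace: CGColorSpace(name: CGColorSpace.linearSRGB)!,
    .outputColorSpace: CGColorSpace(name: CGColorSpace.sRGB)!
])
```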


I even tried writing my own CIFilter (in Metal), with the same result. The video looks the same in the Simulator (on Mojave) as on an iPad running iOS 13.2.


Obviously I'm missing something but just don't know what!


PS: the complete class:


import UIKit
import AVFoundation
import CoreImage.CIFilterBuiltins

final class ViewController: UIViewController {

    var backgroundView: UIView!
    var player: AVQueuePlayer!

    var playerLooper: AVPlayerLooper!
    lazy var alphaFilter: CIMaskToAlpha  = CIFilter.maskToAlpha()
    lazy var blendFilter: CIBlendWithMask = CIFilter.blendWithAlphaMask()

    override func viewDidLoad() {
        super.viewDidLoad()

        self.view.backgroundColor = .black

        if let videoURL = Bundle.main.url(forResource: "seamlessly-looped-colored-flui_H264HD1080", withExtension: "mov") {
            player = AVQueuePlayer()
            let playerLayer = AVPlayerLayer(player: player)

            let asset = AVAsset(url: videoURL)
            let playerItem = AVPlayerItem(asset: asset)

            playerItem.videoComposition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
                let inputImage = request.sourceImage
                self.blendFilter.inputImage = inputImage
                self.alphaFilter.inputImage = inputImage
                self.blendFilter.maskImage = self.alphaFilter.outputImage!
                let output = self.blendFilter.outputImage!
                request.finish(with: output, context: nil)
            })

            let scale = CMTimeScale(1)
            let timeRange = CMTimeRange(start: CMTime.zero, duration: CMTime(seconds: 30.0, preferredTimescale: scale))
            playerLooper = AVPlayerLooper(player: player, templateItem: playerItem, timeRange: timeRange)

            let vSize = CGSize(width: 1920/2, height: 1080/2)
            backgroundView = UIView(frame: CGRect(origin: CGPoint(x: 0, y: 0), size: vSize))
            backgroundView.backgroundColor =  .clear // .yellow

            self.view.addSubview(backgroundView)
            backgroundView.center = CGPoint(x: self.view.frame.midX, y: self.view.frame.midY)

            playerLayer.frame = backgroundView.bounds //bounds of the view in which AVPlayer should be displayed
            playerLayer.videoGravity = .resizeAspect
            playerLayer.backgroundColor = UIColor.clear.cgColor // UIColor.red.cgColor

            // stackoverflow.com: /a/39694351/1633251
            playerLayer.pixelBufferAttributes = [(kCVPixelBufferPixelFormatTypeKey as String): kCVPixelFormatType_32BGRA]
            self.backgroundView.layer.addSublayer(playerLayer)
        }
        player.play()
        player.rate = 1
    }
}

Accepted Reply

Two updates:

- I realized that this reduces the alpha doubly: the original pixels already have a fractional alpha, which is then multiplied again by the alpha in the mask.


- I was able to do what I wanted by writing my own CIFilter with a Metal kernel, converting each pixel from RGBA with black mixed in to a transparent RGBA value that, when drawn over a black background, looks exactly the same.
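The un-blend math is: treat each source pixel as its true color composited over black, take alpha = max(r, g, b), and un-premultiply. A sketch of that kernel, here written with the older CIKernel Language string API rather than a compiled Metal kernel (names are mine, untested verbatim):

```swift
import CoreImage

// Sketch: un-blend "color over black" into straight-alpha RGBA.
// The brightest channel bounds how visible the pixel was, so use it
// as alpha and divide it back out of the color.
let blackToAlphaKernel = CIColorKernel(source: """
    kernel vec4 blackToAlpha(__sample s) {
        float alpha = max(s.r, max(s.g, s.b));
        if (alpha == 0.0) { return vec4(0.0); }
        return vec4(s.rgb / alpha, alpha);
    }
    """)!

func makeTransparent(_ image: CIImage) -> CIImage {
    blackToAlphaKernel.apply(extent: image.extent, arguments: [image])!
}
```

Drawn over black, `rgb / alpha` scaled back by `alpha` reproduces the original pixel exactly, which is why the result matches the source when the background is black.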
