      Latest reply on Nov 6, 2019 9:19 AM by dhoerl
      dhoerl Level 1 (0 points)

        I have a .mov file that loops every 30 seconds: H.264, 1920x1080, 25 fps. It has a black background that I'd like to make transparent - in fact, I'd like every pixel to have some transparency. So I created an AVPlayer to play it, but nothing I did made the video transparent. Then I tripped over a StackOverflow post that said I needed to add this setting:

        playerLayer.pixelBufferAttributes = [(kCVPixelBufferPixelFormatTypeKey as String): kCVPixelFormatType_32BGRA]


        Now, I could use a CIColorCube to make every black pixel transparent. Progress!
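
        As a sketch of the color-cube idea (not the original code - the 16-entry cube dimension and the 0.08 luminance cutoff below are illustrative values), the lookup data for CIColorCube could be built like this:

```swift
import Foundation

// Sketch: CIColorCube lookup data that maps near-black pixels to
// transparent. The dimension (16) and luminance cutoff (0.08) are
// illustrative assumptions, not values from the original post.
let size = 16
var cube = [Float]()
cube.reserveCapacity(size * size * size * 4)
for b in 0..<size {            // CIColorCube ordering: blue outermost,
    for g in 0..<size {        // red varies fastest
        for r in 0..<size {
            let rf = Float(r) / Float(size - 1)
            let gf = Float(g) / Float(size - 1)
            let bf = Float(b) / Float(size - 1)
            // Rec. 709 luma; anything darker than the cutoff goes fully transparent.
            let luma = 0.2126 * rf + 0.7152 * gf + 0.0722 * bf
            let alpha: Float = luma < 0.08 ? 0 : 1
            // CIColorCube expects premultiplied RGBA entries.
            cube.append(rf * alpha)
            cube.append(gf * alpha)
            cube.append(bf * alpha)
            cube.append(alpha)
        }
    }
}
// On iOS the data would then be handed to the filter, e.g.:
// let data = cube.withUnsafeBufferPointer { Data(buffer: $0) }
// let colorCube = CIFilter(name: "CIColorCube",
//                          parameters: ["inputCubeDimension": size,
//                                       "inputCubeData": data])
```

        A hard 0/1 cutoff like this only knocks out black; it can't give every pixel partial transparency, which is where the next problem comes in.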


        But nothing I do lets me give all the pixels some transparency. I made the background view black, and when I apply an alpha mask generated from the source image with Core Image I do get an image back, but the brightness is greatly reduced.

        lazy var alphaFilter: CIMaskToAlpha  = CIFilter.maskToAlpha()
        lazy var blendFilter: CIBlendWithMask = CIFilter.blendWithAlphaMask()
        
        playerItem.videoComposition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
        
          let inputImage = request.sourceImage
          self.blendFilter.inputImage = inputImage
          self.alphaFilter.inputImage = inputImage
          self.blendFilter.maskImage = self.alphaFilter.outputImage!
          let output = self.blendFilter.outputImage!
        
          request.finish(with: output, context: nil)
        })

        I've searched high and low and can't find anything on this. One post on StackOverflow said that the sourceImage is in sRGB format, so I tried converting to linear before processing and reversing it afterward, and also the converse - everything I do just makes the output video look worse.
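
        For concreteness, the sRGB/linear conversion in question is the standard transfer-function pair below (Core Image's built-in CISRGBToneCurveToLinear and CILinearToSRGBToneCurve filters apply these curves per channel); this plain-Swift sketch of the math shows why the working space visibly shifts brightness:

```swift
import Foundation

// The standard sRGB transfer-function pair (breakpoint constants are
// from the sRGB spec). Core Image's CISRGBToneCurveToLinear and
// CILinearToSRGBToneCurve filters apply these curves per channel.
func srgbToLinear(_ s: Double) -> Double {
    s <= 0.04045 ? s / 12.92 : pow((s + 0.055) / 1.055, 2.4)
}
func linearToSRGB(_ l: Double) -> Double {
    l <= 0.0031308 ? 12.92 * l : 1.055 * pow(l, 1.0 / 2.4) - 0.055
}
// A mid-gray of 0.5 in sRGB is roughly 0.214 linear, so filtering in
// one space and displaying in the other noticeably darkens or washes
// out the result.
```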


        I even tried my own CIFilter (in Metal) - same result. I've looked at the video in the Simulator (on Mojave) as well as on an iPad running iOS 13.2; same appearance.

        Obviously I'm missing something but just don't know what!


        PS: the complete class:


        import UIKit
        import AVFoundation
        import CoreImage.CIFilterBuiltins

        final class ViewController: UIViewController {
        
            var backgroundView: UIView!
            var player: AVQueuePlayer!
        
            var playerLooper: AVPlayerLooper!
            lazy var alphaFilter: CIMaskToAlpha  = CIFilter.maskToAlpha()
            lazy var blendFilter: CIBlendWithMask = CIFilter.blendWithAlphaMask()
        
            override func viewDidLoad() {
                super.viewDidLoad()
        
                self.view.backgroundColor = .black
        
                if let videoURL = Bundle.main.url(forResource: "seamlessly-looped-colored-flui_H264HD1080", withExtension: "mov") {
                    player = AVQueuePlayer()
                    let playerLayer = AVPlayerLayer(player: player)
        
                    let asset = AVAsset(url: videoURL)
                    let playerItem = AVPlayerItem(asset: asset)
        
                    playerItem.videoComposition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
                        let inputImage = request.sourceImage
                        self.blendFilter.inputImage = inputImage
                        self.alphaFilter.inputImage = inputImage
                        self.blendFilter.maskImage = self.alphaFilter.outputImage!
                        let output = self.blendFilter.outputImage!
                        request.finish(with: output, context: nil)
                    })
        
                    let scale = CMTimeScale(1)
                    let timeRange = CMTimeRange(start: CMTime.zero, duration: CMTime(seconds: 30.0, preferredTimescale: scale))
                    playerLooper = AVPlayerLooper(player: player, templateItem: playerItem, timeRange: timeRange)
        
                    let vSize = CGSize(width: 1920/2, height: 1080/2)
                    backgroundView = UIView(frame: CGRect(origin: CGPoint(x: 0, y: 0), size: vSize))
                    backgroundView.backgroundColor =  .clear // .yellow
        
                    self.view.addSubview(backgroundView)
                    backgroundView.center = CGPoint(x: self.view.frame.midX, y: self.view.frame.midY)
        
                    playerLayer.frame = backgroundView.bounds //bounds of the view in which AVPlayer should be displayed
                    playerLayer.videoGravity = .resizeAspect
                    playerLayer.backgroundColor = UIColor.clear.cgColor // UIColor.red.cgColor
        
                    // stackoverflow.com: /a/39694351/1633251
                    playerLayer.pixelBufferAttributes = [(kCVPixelBufferPixelFormatTypeKey as String): kCVPixelFormatType_32BGRA]
                    self.backgroundView.layer.addSublayer(playerLayer)
                    // Start playback only when the asset actually loaded;
                    // player is implicitly unwrapped and would crash otherwise.
                    player.play()
                    player.rate = 1
                }
            }
        }