HLS HEVC+Alpha doesn't work as SceneKit material

Hi, when I use a local .mp4 video file encoded in HEVC with an alpha channel as the material of an SCNNode (played via AVPlayer), the transparency is rendered correctly, just as if I had used a .png image with transparency.

The issue: when I encode this same .mp4 file into an HLS stream using mediafilesegmenter and play it the same way, as an SCNNode material through AVPlayer, the transparency is not rendered; instead the transparent zones are filled with opaque black. (The HLS stream itself has correct transparency, as verified by opening its URL in Safari.)
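
One way to check what AVFoundation itself reports for each variant is to inspect the video tracks for the containsAlphaChannel media characteristic. A minimal diagnostic sketch (the function name is mine; it uses the iOS 15 async loading API, and note that an HLS asset may not expose its tracks until the player item is ready):

import AVFoundation

func logAlphaCharacteristic(of asset: AVAsset) async {
    do {
        // Note: HLS assets often report an empty track array here; check
        // AVPlayerItem.tracks once the item reaches .readyToPlay instead.
        let videoTracks = try await asset.load(.tracks).filter { $0.mediaType == .video }
        for track in videoTracks {
            let hasAlpha = track.hasMediaCharacteristic(.containsAlphaChannel)
            print("track \(track.trackID) containsAlphaChannel: \(hasAlpha)")
        }
    } catch {
        print("failed to load tracks: \(error)")
    }
}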

Sample Test:

import UIKit
import SceneKit
import ARKit
import AVFoundation

class ViewController: UIViewController {
    private var arView: ARSCNView!

    lazy var sphere: SCNNode = {
        // Inward-facing sphere: front faces are culled so the video renders on the inside.
        let geometry = SCNSphere(radius: 5)
        geometry.isGeodesic = false
        geometry.segmentCount = 64
        geometry.firstMaterial?.lightingModel = .constant
        geometry.firstMaterial?.diffuse.contents = UIColor.clear
        geometry.firstMaterial?.cullMode = .front
        return SCNNode(geometry: geometry)
    }()
    
    private var avPlayer: AVPlayer!
    
    override func viewDidLoad() {
        super.viewDidLoad()
        setupArView()
        setupArSession()
        setupButton()
    }

    private func setupButton() {
        let button = UIButton()
        button.setTitle("START", for: .normal)
        button.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(button)
        NSLayoutConstraint.activate([
            button.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            button.centerYAnchor.constraint(equalTo: view.centerYAnchor)
        ])
        button.addTarget(self, action: #selector(createSphere), for: .touchUpInside)
    }
    
    @objc func createSphere() {
        guard avPlayer == nil else { return }
        addSphere()
    }
}

extension ViewController {
    private func setupArView() {
        arView = ARSCNView()
        arView.backgroundColor = .black
        arView.translatesAutoresizingMaskIntoConstraints = false
        view.insertSubview(arView, at: 0)
        NSLayoutConstraint.activate([
            arView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            arView.topAnchor.constraint(equalTo: view.topAnchor),
            arView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            arView.bottomAnchor.constraint(equalTo: view.bottomAnchor)
        ])
        arView.preferredFramesPerSecond = 60
    }

    private func setupArSession() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.worldAlignment = .gravityAndHeading
        configuration.environmentTexturing = .none
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            configuration.frameSemantics.insert(.personSegmentationWithDepth)
        }
        if ARWorldTrackingConfiguration.supportsUserFaceTracking {
            configuration.userFaceTrackingEnabled = true
        }
        arView.session.run(configuration)
    }

    private func addSphere() {
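        // Swap these two lines to reproduce both scenarios (HLS stream vs. local .mp4).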
//        let asset = AVURLAsset(url: URL(string: "https://SOMECLOUDSTORAGE.com/hls-bug/prog_index.m3u8")!)
        let asset = AVURLAsset(url: Bundle.main.url(forResource: "puppets", withExtension: "mp4")!)
        let playerItem = AVPlayerItem(asset: asset)
        playerItem.addObserver(self, forKeyPath: #keyPath(AVPlayerItem.status), options: [.old, .new], context: nil)
        playerItem.isAudioSpatializationAllowed = true
        playerItem.allowedAudioSpatializationFormats = .monoStereoAndMultichannel
        avPlayer = AVPlayer()
        sphere.position = SCNVector3(0, 0, 0)
        arView.scene.rootNode.addChildNode(sphere)
        avPlayer.replaceCurrentItem(with: playerItem)
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == #keyPath(AVPlayerItem.status) {
            let status: AVPlayerItem.Status
            if let statusNumber = change?[.newKey] as? NSNumber {
                status = AVPlayerItem.Status(rawValue: statusNumber.intValue)!
            } else {
                status = .unknown
            }

            switch status {
            case .readyToPlay:
                DispatchQueue.main.async {
                    self.avPlayer.playImmediately(atRate: 1)
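                    // Attach the player as the material's contents only once the item is ready.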
                    self.sphere.geometry?.firstMaterial?.diffuse.contents = self.avPlayer
                }
            case .failed, .unknown:
                break
            @unknown default:
                break
            }
        } else {
            super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        }
    }
}

The video file used is the puppets_with_alpha_hevc.mov file from Apple's HEVC-with-alpha demo, which I re-muxed into an .mp4 container using ffmpeg.

To reproduce both scenarios, replace the AVURLAsset URL with either the local .mp4 file or the HLS stream URL.

Issue reproduced on an iPhone 11 Pro running iOS 15.

This issue has remained unresolved for quite some time now, although I have tried everything to get attention on it: an unsuccessful TSI ticket, a Feedback Assistant bug report that went unanswered, and I even discussed this bug during WWDC 2021 with Shiva Sundar, who is in charge of HEVC development and said it would be looked into.

Hoping this finally gets some attention.

Also, I can confirm that an HEVC + alpha HLS stream works as expected in SpriteKit as an SKVideoNode, despite not working in SceneKit as an SCNMaterial.

One dirty workaround is to use an SKVideoNode embedded in an SKScene as the SCNMaterial (see the sketch below). However, this is not a suitable solution: it adds 30-40% CPU load on an iPhone 11 Pro, pushing my ARKit application well above 100% CPU, whereas the AVPlayer-as-SCNMaterial approach was very CPU efficient.
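
For reference, a minimal sketch of that workaround, assuming the same avPlayer and sphere as in the sample above (the texture size and the yScale flip are assumptions; SpriteKit content usually comes out vertically flipped when used as SceneKit material contents):

import SpriteKit

func applySpriteKitWorkaround() {
    // Host the video in a SpriteKit scene and use that scene as the material contents.
    let sceneSize = CGSize(width: 1024, height: 1024) // assumed texture size
    let skScene = SKScene(size: sceneSize)
    skScene.backgroundColor = .clear

    let videoNode = SKVideoNode(avPlayer: avPlayer)
    videoNode.size = sceneSize
    videoNode.position = CGPoint(x: sceneSize.width / 2, y: sceneSize.height / 2)
    videoNode.yScale = -1 // SpriteKit content renders flipped as a SceneKit texture
    skScene.addChild(videoNode)

    sphere.geometry?.firstMaterial?.diffuse.contents = skScene
    videoNode.play()
}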

Also reproduced on iOS 15.2.1.
