I have an app which plays a video chosen from the user's library. What I intend for the app to be able to do eventually is render an overlay onto the video (whilst it is playing) and then output the result to a new media file. In order to do this, I need to capture the decoded frames so that I can render this overlay and write the result out to a file once playback has ended.

This is my first app using AVFoundation, and I have spent a day or two trying to find out how to achieve this through Google and the Apple documentation. I thought I had found the answer in AVPlayerItemVideoOutput; however, the delegate callback is never executed.

I discovered that the AVPlayerItemVideoOutput must be created *after* the AVPlayerItem has reached the readyToPlay status. So, in the initialiser for my PlayerUIView I add an observer to the AVPlayerItem in order to watch its status:

```swift
init(frame: CGRect, url: Binding<URL?>) {
    _url = url

    // Setup the player
    player = AVPlayer(url: url.wrappedValue!)

    super.init(frame: frame)

    playerLayer.player = player
    playerLayer.videoGravity = .resizeAspect
    layer.addSublayer(playerLayer)

    //displayLink = CADisplayLink()

    // Setup looping
    player.actionAtItemEnd = .none
    NotificationCenter.default.addObserver(self,
                                           selector: #selector(playerItemDidReachEnd(notification:)),
                                           name: .AVPlayerItemDidPlayToEndTime,
                                           object: player.currentItem)

    // Watch the item's status so the video output can be added once it is readyToPlay
    player.currentItem?.addObserver(self,
                                    forKeyPath: #keyPath(AVPlayerItem.status),
                                    options: [.old, .new],
                                    context: nil)

    // Start the movie
    player.play()
}
```

I have a CADisplayLink being created in the middle (commented out) because I saw that it could somehow be used for this, but I'm not exactly sure how, or what it's supposed to do. I'm also concerned, from the name, that it gets frames from the displayed video rather than from the actual decoded video frames, which is what I want.

When the status is set to readyToPlay for the first time, I create and add the AVPlayerItemVideoOutput:

```swift
override func observeValue(forKeyPath keyPath: String?,
                           of object: Any?,
                           change: [NSKeyValueChangeKey : Any]?,
                           context: UnsafeMutableRawPointer?) {
    if let item = object as? AVPlayerItem {
        if item.status == AVPlayerItem.Status.readyToPlay && item.outputs.count == 0 {
            let settings = [String(kCVPixelBufferPixelFormatTypeKey): kCVPixelFormatType_24RGB]
            let output = AVPlayerItemVideoOutput(pixelBufferAttributes: settings)
            output.setDelegate(PlayerOutput(output: output), queue: DispatchQueue(label: ""))
            player.currentItem?.add(output)
        }
    }
}
```

On the delegate, PlayerOutput, I am expecting to be notified when new frames are available, at which point I would go to the AVPlayerItemVideoOutput object to access the pixel buffer:

```swift
class PlayerOutput: NSObject, AVPlayerItemOutputPullDelegate {

    func outputMediaDataWillChange(_ sender: AVPlayerItemOutput) {
        let videoOutput = sender as! AVPlayerItemVideoOutput
        let newPixelBuff = videoOutput.hasNewPixelBuffer(forItemTime: CMTime(seconds: 1, preferredTimescale: 10))
    }
}
```

However, this callback is never made. I set a breakpoint in it and it is never hit. From the naming, and from similar code elsewhere in AVFoundation, I assumed it would be called for every new frame so that I could access the frame in the buffer, but I'm not seeing anything happen.
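In case it clarifies what I'm attempting, here is roughly how I imagined the delegate route being wired up. I don't know whether I also need to keep a strong reference to the delegate, or to call requestNotificationOfMediaDataChange(withAdvanceInterval:) before the callback will fire; this is just an untested sketch, and the property name, queue label and interval are placeholders:

```swift
import AVFoundation

// Untested sketch: how I think the output and its delegate should be attached once
// the item is readyToPlay. `player` and `playerOutput` would be properties of my view.
func attachVideoOutput() {
    let settings = [String(kCVPixelBufferPixelFormatTypeKey): kCVPixelFormatType_24RGB]
    let output = AVPlayerItemVideoOutput(pixelBufferAttributes: settings)

    // Keep the delegate in a property, in case setDelegate(_:queue:) only holds it weakly.
    playerOutput = PlayerOutput(output: output)
    output.setDelegate(playerOutput, queue: DispatchQueue(label: "video-output"))

    // As far as I can tell, outputMediaDataWillChange is only delivered after
    // explicitly asking to be notified of media data changes.
    output.requestNotificationOfMediaDataChange(withAdvanceInterval: 0.03)

    player.currentItem?.add(output)
}
```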
Is there something I am missing or doing wrong? I have a feeling I'm not quite using or understanding these classes and what they are for correctly. The naming is similar to the AVCaptureVideoDataOutput etc. classes, which I have managed to implement successfully elsewhere in the app, but they just don't seem to work quite the same way. It has been very difficult to find any examples doing what I want to do with AVPlayer.
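For reference, this is the kind of per-frame pull loop I imagined the commented-out CADisplayLink being used for, in case my mental model of AVPlayerItemVideoOutput as a pull-style output is the problem. Again, only an untested sketch; startDisplayLink, displayLinkDidFire and videoOutput are my own placeholder names:

```swift
import UIKit
import AVFoundation

// Sketch only: these would be methods on my PlayerUIView, and `videoOutput` the
// AVPlayerItemVideoOutput that has already been added to the player item.
extension PlayerUIView {

    func startDisplayLink() {
        let link = CADisplayLink(target: self, selector: #selector(displayLinkDidFire(_:)))
        link.add(to: .main, forMode: .common)
    }

    @objc func displayLinkDidFire(_ link: CADisplayLink) {
        // Convert the host clock time to the item's timeline.
        let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())

        // Pull the decoded frame, if a new one is available for this refresh.
        if videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
           let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) {
            // This is where I would composite the overlay and hand the frame to an AVAssetWriter.
            _ = pixelBuffer
        }
    }
}
```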
Posted by LukeTim.