AVPlayerItemVideoOutput stops delivering frames after seek with video composition

I want to play a video and process it using Core Image filters. I have the following setup (a rough code sketch follows the list):
  • the video as an AVAsset

  • an AVPlayer for controlling the video playback

  • the AVAsset is passed via AVPlayerItem to the player

  • the Core Image filters are applied using an AVMutableVideoComposition that was initialized with init(asset:applyingCIFiltersWithHandler:) and assigned to the player item

  • an AVPlayerItemVideoOutput is used to get the processed frames using copyPixelBuffer(forItemTime:itemTimeForDisplay:)

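Roughly, the setup looks like this. This is a minimal sketch, not my exact code; the file URL and the CIPhotoEffectNoir filter are just placeholders for my actual asset and filter pipeline:

import AVFoundation
import CoreImage

let videoURL = URL(fileURLWithPath: "/path/to/video.mov") // placeholder
let asset = AVAsset(url: videoURL)

// Video composition that applies the Core Image filter chain per frame.
let composition = AVMutableVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
    let filtered = request.sourceImage.applyingFilter("CIPhotoEffectNoir") // stand-in filter
    request.finish(with: filtered, context: nil)
})

let playerItem = AVPlayerItem(asset: asset)
playerItem.videoComposition = composition

// Output for pulling the processed frames back out of the player item.
let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])
playerItem.add(videoOutput)

let player = AVPlayer(playerItem: playerItem)
player.play()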
This works as expected. However, I'm observing strange behavior when seeking backwards on the player with seek(to:completionHandler:): the AVPlayerItemVideoOutput suddenly stops delivering new pixel buffers, and hasNewPixelBuffer(forItemTime:) returns false from then on. This only happens when the filters applied in the video composition are somewhat expensive; with a very simple filter, or no filter at all, seeking works as expected.
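For context, this is roughly how I pull frames, driven by a CADisplayLink (the display link setup and the rendering step are omitted, and the method name is just a placeholder):

import AVFoundation
import QuartzCore

// Called on every display link tick; videoOutput comes from the setup above.
@objc func displayLinkDidFire(_ link: CADisplayLink) {
    let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
    // After seeking backwards with an expensive filter chain, this check starts
    // returning false and no further pixel buffers ever arrive.
    guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
          let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                        itemTimeForDisplay: nil) else {
        return
    }
    // ... hand pixelBuffer to the rendering / further processing step ...
    _ = pixelBuffer
}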

Inspired by this technical note, I found a workaround: re-assigning the composition to the player item after the seek finishes. But this feels very hacky, especially since seeking works fine for simple filter pipelines without it.
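The workaround looks roughly like this (the seek target is just an example; composition and player are the objects from the setup sketch):

let targetTime = CMTime(seconds: 2, preferredTimescale: 600) // example seek target
player.seek(to: targetTime) { [weak playerItem] finished in
    guard finished, let item = playerItem else { return }
    // Hack: re-assigning the (unchanged) composition kicks the output
    // back into delivering pixel buffers.
    item.videoComposition = composition
}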

Did anybody encounter this before? Is there anything I need to configure differently?

Please report this behavior via the Feedback Assistant and include a project we can build and run to reproduce the issue.