I want to play a video and process it using Core Image filters. I have the following setup (a rough code sketch follows the list):
- the video as an `AVAsset`
- an `AVPlayer` for controlling the video playback
- the `AVAsset` is passed to the player via an `AVPlayerItem`
- the Core Image filters are applied using an `AVMutableVideoComposition` that was initialized with `init(asset:applyingCIFiltersWithHandler:)` and assigned to the player item
- an `AVPlayerItemVideoOutput` is used to get the processed frames via `copyPixelBuffer(forItemTime:itemTimeForDisplay:)`
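For reference, this is roughly what the setup looks like in code. It is a minimal sketch: `videoURL`, the sepia filter, and the display-link callback are placeholders standing in for my real pipeline, and error handling is omitted:

```swift
import AVFoundation
import CoreImage
import QuartzCore

// Minimal sketch of the setup above; `videoURL` is a placeholder.
let asset = AVAsset(url: videoURL)
let playerItem = AVPlayerItem(asset: asset)

// Core Image filters are applied per frame through the composition's handler.
let composition = AVMutableVideoComposition(asset: asset) { request in
    // Placeholder filter; the real pipeline is more involved.
    let filtered = request.sourceImage.applyingFilter(
        "CISepiaTone", parameters: [kCIInputIntensityKey: 0.8])
    request.finish(with: filtered, context: nil)
}
playerItem.videoComposition = composition

// Output used to pull the processed frames.
let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])
playerItem.add(videoOutput)

let player = AVPlayer(playerItem: playerItem)

// Called from a CADisplayLink to grab the latest processed frame.
func displayLinkFired(_ link: CADisplayLink) {
    let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
    guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
          let pixelBuffer = videoOutput.copyPixelBuffer(
              forItemTime: itemTime, itemTimeForDisplay: nil) else { return }
    // ... render / further process pixelBuffer ...
    _ = pixelBuffer
}
```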
Inspired by this technical note, I found a workaround by re-assigning the composition to the player item after the seek finished, but this feels very hacky, especially since it works for simple filter pipelines.
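The workaround looks roughly like this, building on the sketch above. `targetTime` is a placeholder, and the intermediate nil assignment is my assumption about what forces the composition to be re-evaluated:

```swift
// Hacky workaround: once the seek completes, re-assign the video
// composition so the filter pipeline is set up again.
player.seek(to: targetTime, toleranceBefore: .zero, toleranceAfter: .zero) { finished in
    guard finished, let item = player.currentItem else { return }
    let composition = item.videoComposition
    item.videoComposition = nil          // detach the stale composition
    item.videoComposition = composition  // re-attach to force re-evaluation
}
```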
Has anybody encountered this before? Is there anything I need to configure differently?