How to convert local video into CMSampleBuffer for unit tests

Hi,

I'm working on an app that uses the live camera output to perform text recognition.

To improve performance, I want to write some unit tests that check how well the text recognition works over time. I want to load a video from my app bundle and use that as the 'camera output', so that I can write tests around what should and should not get recognized.

I'm trying to find out how to generate a CMSampleBuffer from a video so I can feed that into my Vision request. Any ideas? Thanks!
Answered by Media Engineer
Another option, if you don't need the frames in real-time, is to use AVAssetReader.
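In rough terms, that route looks something like the sketch below (untested; the helper name, pixel format, and error handling are my own assumptions, not taken from any sample code):

import AVFoundation

// Reads decoded video frames from a local file and hands each one over as a CMSampleBuffer.
func readSampleBuffers(from url: URL, handler: (CMSampleBuffer) -> Void) throws {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let settings: [String: Any] = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(output)
    reader.startReading()

    // copyNextSampleBuffer() returns nil once the track has been fully read.
    while let sampleBuffer = output.copyNextSampleBuffer() {
        handler(sampleBuffer)
    }
}

Each buffer can then be fed to your Vision request in the test. Because this runs as fast as the decoder allows rather than in real time, it tends to suit unit tests well.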
Take a look at the Action & Vision sample code. The CameraViewController class supports two modes: live camera capture and reading from a video file.
The idea is to use AVPlayerItemVideoOutput together with CADisplayLink - this will closely simulate the live camera capture scenario.
The actual conversion of CVPixelBuffer to CMSampleBuffer happens inside the CADisplayLink callback.
See the handleDisplayLink(_ displayLink: CADisplayLink) function.
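The conversion step itself boils down to creating a format description for the pixel buffer and then wrapping it in a sample buffer. A rough, untested sketch (the helper name and timing choices are placeholders of mine, not taken from the sample):

import CoreMedia
import CoreVideo

// Wraps a CVPixelBuffer (e.g. one copied from AVPlayerItemVideoOutput) in a CMSampleBuffer.
func makeSampleBuffer(from pixelBuffer: CVPixelBuffer, presentationTime: CMTime) -> CMSampleBuffer? {
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescriptionOut: &formatDescription
    )
    guard let format = formatDescription else { return nil }

    // Duration and decode timestamp can be left invalid for this purpose.
    var timing = CMSampleTimingInfo(
        duration: .invalid,
        presentationTimeStamp: presentationTime,
        decodeTimeStamp: .invalid
    )

    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescription: format,
        sampleTiming: &timing,
        sampleBufferOut: &sampleBuffer
    )
    return sampleBuffer
}

// In a CADisplayLink callback it would be used roughly like this:
// let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
// if videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
//    let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) {
//     let sampleBuffer = makeSampleBuffer(from: pixelBuffer, presentationTime: itemTime)
//     // feed sampleBuffer to the text recognition request
// }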


Accepted Answer
Another option, if you don't need the frames in real-time, is to use AVAssetReader.
Thanks to both of you for the suggestions! Got a first version up and running!
Oops, I wanted to choose the other answer as 'the answer'. Can't undo it now, sorry!
Could you please share how you ended up mocking the buffers in your tests? Did you use handleDisplayLink or AVAssetReader?