We have a custom video composition that alters frames, and everything works fine, except that on some videos [AVAsynchronousVideoCompositionRequest sourceFrameByTrackID] returns a nil CVPixelBuffer. This happens only at specific times in those videos, usually at the start; all other processing goes well, and sourceFrameByTrackID returns normal buffers with pixel data in them. At the same time, these videos are totally fine and play well in the Photos app (for example). What could cause such strange behaviour?
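For context, the request handler looks roughly like this: a minimal Swift sketch of a custom AVVideoCompositing compositor (the class name and pass-through logic are placeholders, not the actual project code), showing where the nil frame appears and how it can at least be failed cleanly:

```swift
import AVFoundation
import CoreVideo

// Hypothetical minimal compositor; names and the pass-through body are
// placeholders for illustration only.
final class MyVideoCompositor: NSObject, AVVideoCompositing {
    var sourcePixelBufferAttributes: [String: Any]? {
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    }
    var requiredPixelBufferAttributesForRenderContext: [String: Any] {
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    }

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        guard let trackID = request.sourceTrackIDs.first,
              let sourceBuffer = request.sourceFrame(byTrackID: trackID.int32Value) else {
            // This is the failure in question: sourceFrame(byTrackID:) returns
            // nil at certain composition times, so fail the request instead of
            // dereferencing a missing buffer.
            request.finish(with: NSError(domain: "MyVideoCompositor", code: -1))
            return
        }
        // Frame alteration would happen here; this sketch just passes through.
        request.finish(withComposedVideoFrame: sourceBuffer)
    }
}
```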
I'm trying to create a "virtual microphone" that works "in front of" the default input device/microphone. When the user selects the "virtual microphone" as input (in Audacity, for example) and starts recording, Audacity should receive samples from the virtual device driver, which the driver in turn takes from the real/default microphone. So the "virtual microphone" is a kind of proxy device for the real (default/built-in/etc.) one. This is needed for later on-the-fly processing of microphone input.

So far I have created a virtual HAL device (based on the NullAudio.c driver example from Apple), and I can generate procedural sound for Audacity, but I still cannot figure out how to read data from the real microphone (using its deviceID) from inside the driver. Is it OK to use normal recording as in a usual app (via AudioUnits/AURemoteIO/AUHAL, etc.)? Or should something like IOServices be used?

The documentation states:

"An AudioServerPlugIn operates in a limited environment. First and foremost, an AudioServerPlugIn may not make any calls to the client HAL API in the CoreAudio.framework. This will result in undefined (but generally bad) behavior."

But it is not clear which API counts as the "client" API and which does not, with regard to reading microphone data. What kind of API can/should be used from a virtual device driver to access real microphone data in real time?
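To make the "normal recording as in a usual app" option concrete, here is a Swift sketch of capturing the real microphone with AVAudioEngine; forwardToDriver is a hypothetical handoff into the plug-in (e.g. via a shared ring buffer), not an existing API:

```swift
import AVFoundation

// Hypothetical handoff point: in a real setup this might write into a ring
// buffer shared with the HAL plug-in. Placeholder only.
func forwardToDriver(_ buffer: AVAudioPCMBuffer, at time: AVAudioTime) {
    // ... push samples toward the virtual device ...
}

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

// Tap the input node to receive microphone samples as they arrive.
input.installTap(onBus: 0, bufferSize: 512, format: format) { buffer, time in
    forwardToDriver(buffer, at: time)
}

do {
    try engine.start()
} catch {
    print("Failed to start capture engine: \(error)")
}
```

The open question is exactly whether capture like this may live inside the AudioServerPlugIn itself, or whether it has to run in a separate user-space companion process that feeds the driver.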