I'm appending CVPixelBufferRefs to an AVAssetWriterInputPixelBufferAdaptor which is connected to an AVAssetWriterInput to write to a file using AVAssetWriter.
So, I'm calling
[pixelAdaptor appendPixelBuffer:pxbuffer withPresentationTime:someCMTime];
and it works as it should.
Now, I'm doing this sequentially: at kCMTimeZero I append the first frame, at 0.5 s I append another frame, at 1 s another, and so on.
I'm wondering: do I have to append sequentially, or can I append in a random order as well? For the sake of argument, the other way around: append a buffer at 1 s first, then at 0.5 s, and then at 0 s?
The reason I'm asking is that I've tried it and it crashes, but I'm not sure whether it crashes because I appended at non-sequential times, or for some other reason (like asynchronous appending).
I'm only appending when the input tells me to (readyForMoreMediaData), on a serial dispatch queue.
The question is: Should non-sequential appending work, or is it by design that it doesn't?
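For reference, here is roughly how I drive the appends. This is a minimal sketch; writerInput, pixelAdaptor, frameSource and nextPixelBuffer are placeholders, not my actual code:

#import <AVFoundation/AVFoundation.h>

// Serial queue on which the input calls back when it's ready for more data.
dispatch_queue_t writeQueue = dispatch_queue_create("com.example.framewriting", DISPATCH_QUEUE_SERIAL);
__block int64_t frameIndex = 0;

[writerInput requestMediaDataWhenReadyOnQueue:writeQueue usingBlock:^{
    while (writerInput.isReadyForMoreMediaData) {
        CVPixelBufferRef pxbuffer = [frameSource nextPixelBuffer]; // placeholder frame source
        if (pxbuffer == NULL) {
            [writerInput markAsFinished];
            break;
        }
        // One frame every 0.5 s, with strictly increasing presentation times.
        CMTime pts = CMTimeMake(frameIndex, 2);
        [pixelAdaptor appendPixelBuffer:pxbuffer withPresentationTime:pts];
        CVPixelBufferRelease(pxbuffer);
        frameIndex++;
    }
}];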
Thank you,
Matthias
Hi,
when I try to push to my git repository using Xcode 12 beta, I get the following error:
An unknown error occurred
username does not match previous requests (-1)
It works correctly in Xcode 11, with the same working copy / files on disk.
Is this a known issue?
Thank you,
Matthias
The LinkPresentation framework, specifically LPMetadataProvider, makes my app crash after I call -startFetchingMetadataForURL:completionHandler: and before the completionHandler is called, which suggests it's a bug within the framework, not my app.
It crashes in ***:WebKit something on macOS. On iOS, the framework performs nicely.
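For context, the call itself is nothing unusual; it's along these lines (the URL here is just an example):

#import <LinkPresentation/LinkPresentation.h>

LPMetadataProvider *provider = [[LPMetadataProvider alloc] init];
NSURL *url = [NSURL URLWithString:@"https://www.apple.com"]; // example URL
[provider startFetchingMetadataForURL:url completionHandler:^(LPLinkMetadata *metadata, NSError *error) {
    // On macOS, the crash happens before this block is ever called.
    if (error != nil) {
        NSLog(@"Metadata fetch failed: %@", error);
        return;
    }
    NSLog(@"Fetched title: %@", metadata.title);
}];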
Hello, everyone.
My name is Matt, I'm the developer of Eternal Storms Software. I recently released a freeware app, SiriMote, outside of the Mac App Store.
The reason I could not release it on the Mac App Store is that it uses CGEventPost to simulate keypresses (for example, when the play/pause button is pressed on the Siri Remote, the Play/Pause media key (on the F8 key) of the Mac's keyboard is pressed), and CGEventPost is ignored inside the sandbox.
So I was wondering - is there a way to do this inside the sandbox?
I guess an alternative would be using the Scripting Bridge, but then I would have to specifically communicate with particular apps. The nice thing about CGEventPost is that any app that responds to the media keys can be used with SiriMote. I'd love a more open approach, like CGEventPost.
Any hints appreciated!
Thank you kindly,
Matt
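To illustrate the approach in question, here is a simplified sketch of posting a Play/Pause media-key event with CGEventPost (NX_KEYTYPE_PLAY comes from IOKit's ev_keymap.h; this is a generic sketch of the technique, not SiriMote's actual code):

#import <Cocoa/Cocoa.h>
#import <IOKit/hidsystem/ev_keymap.h> // NX_KEYTYPE_PLAY

// Posts a Play/Pause media-key press (key down followed by key up) system-wide.
// This kind of event posting is what gets ignored inside the sandbox.
static void PostPlayPauseKey(void) {
    for (int down = 1; down >= 0; down--) {
        NSEventModifierFlags flags = down ? 0xA00 : 0xB00;
        NSInteger data1 = ((NX_KEYTYPE_PLAY << 16) | ((down ? 0xA : 0xB) << 8));
        NSEvent *event = [NSEvent otherEventWithType:NSEventTypeSystemDefined
                                             location:NSZeroPoint
                                        modifierFlags:flags
                                            timestamp:0
                                         windowNumber:0
                                              context:nil
                                              subtype:8
                                                data1:data1
                                                data2:-1];
        CGEventPost(kCGHIDEventTap, event.CGEvent);
    }
}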