Posts

Post not yet marked as solved
1 Reply
Still waiting for an answer here.
Post marked as solved
2 Replies
Here is the link to download the test project from GitHub to reproduce the issue. Feel free to point out what is wrong and needs to be fixed.
Post not yet marked as solved
2 Replies
@galad87 The plist itself is of type CFData in this case. As I said, nothing works, including the second code sample I posted in the question, where I simply copy and paste the attachment from the source sample buffer. Feel free to modify and post the sample code you think should work.
Post not yet marked as solved
2 Replies
@Jason It seems AVAssetWriter is crashing with error -12743 when appending NSData for kCVImageBufferAmbientViewingEnvironmentKey. Please describe how exactly to copy this attachment to the new pixel buffer. Here is my code:

var ambientViewingEnvironment: CMFormatDescription.Extensions.Value?
var ambientViewingEnvironmentData: NSData?

ambientViewingEnvironment = sampleBuffer.formatDescription?.extensions[.ambientViewingEnvironment]
let plist = ambientViewingEnvironment?.propertyListRepresentation
ambientViewingEnvironmentData = plist as? NSData

And then attaching this data:

CVBufferSetAttachment(renderedPixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, ambientViewingEnvironmentData! as CFData, .shouldPropagate)
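For reference, here is a minimal sketch of the kind of attachment copy I would expect to work, copying the attachment value straight from the source pixel buffer instead of round-tripping through the format description. The function name and setup are mine, not from any Apple sample; it assumes the source buffer actually carries the attachment (HDR capture on supported devices).

```swift
import CoreVideo

// Hedged sketch: copy the ambient viewing environment attachment from a
// source pixel buffer onto a freshly rendered one.
func copyAmbientViewingEnvironment(from source: CVPixelBuffer,
                                   to destination: CVPixelBuffer) {
    // CVBufferCopyAttachment returns the attachment value (CFData here),
    // or nil when the source buffer has no such attachment.
    if let value = CVBufferCopyAttachment(source,
                                          kCVImageBufferAmbientViewingEnvironmentKey,
                                          nil) {
        CVBufferSetAttachment(destination,
                              kCVImageBufferAmbientViewingEnvironmentKey,
                              value,
                              .shouldPropagate)
    }
}
```

If this also trips error -12743 in AVAssetWriter, that would suggest the problem is the payload itself rather than how it is attached.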
Post not yet marked as solved
4 Replies
Dear Core Image Engineering Team, I spent a DTS credit to get this answered, and all I was advised to do was debug using CI_PRINT_TREE and look for "optimizations" where none are possible (the code I submitted with the bug report contained a minimal CIImage rendering pipeline). The DTS engineer acknowledged the significant difference in CPU usage between the Metal and Core Image pipelines but was unable to offer anything beyond recommending that I file a bug report. I have already filed FB12619176 and am looking to hear directly from the Core Image engineering team, because high CPU usage rules out Core Image for video pipelines. It might be acceptable for photo editing, but not for live video or video editing tasks.
Post not yet marked as solved
1 Reply
I tried setting activeColorSpace to .appleLog whenever device.activeFormat.supportedColorSpaces contains .appleLog, but somehow it doesn't work.
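For comparison, this is roughly the shape of code I would expect to work (a sketch only; the function is mine, and error handling is simplified). One assumption worth checking: as I understand it, the session's automatic wide-color configuration has to be turned off first, otherwise the session manages activeColorSpace itself.

```swift
#if os(iOS)
import AVFoundation

// Sketch: opt in to Apple Log when the active format supports it (iOS 17+).
// Assumption: automaticallyConfiguresCaptureDeviceForWideColor must be
// disabled before setting activeColorSpace manually.
@available(iOS 17.0, *)
func enableAppleLog(on device: AVCaptureDevice,
                    in session: AVCaptureSession) throws {
    session.automaticallyConfiguresCaptureDeviceForWideColor = false
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    if device.activeFormat.supportedColorSpaces.contains(.appleLog) {
        device.activeColorSpace = .appleLog
    }
}
#endif
```

Note that .appleLog also requires selecting an activeFormat whose supportedColorSpaces actually lists it; many formats do not.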
Post not yet marked as solved
4 Replies
I adopted Core Image instead of Metal because the Core Image team recommended it in various WWDC videos (such as WWDC 2020 session 10008), only to realize that my UI is getting slower due to very high CPU usage by Core Image (as much as 3x-4x compared to raw Metal). It seems I need to rewrite the entire filter chain in plain Metal unless there is a workaround for the high CPU usage. I have submitted a feedback request to DTS with fully reproducible sample code. Really looking forward to answers from the Core Image team here.
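For anyone comparing against their own setup: a minimal sketch of the context configuration usually recommended for video pipelines (one long-lived Metal-backed CIContext, intermediate caching disabled since every frame differs). This is the baseline my measurements assume; the option names are standard CIContextOption values, nothing DTS-specific.

```swift
import CoreImage
import Metal

// Sketch: one shared, Metal-backed CIContext reused across all frames.
// Recreating contexts per frame is a known source of CPU overhead.
func makeSharedVideoCIContext() -> CIContext? {
    guard let device = MTLCreateSystemDefaultDevice(),
          let queue = device.makeCommandQueue() else { return nil }
    return CIContext(mtlCommandQueue: queue, options: [
        // For video, each frame is new, so caching intermediates only costs memory.
        .cacheIntermediates: false,
        .name: "video.pipeline.context"   // hypothetical label for Instruments traces
    ])
}
```

Even with this configuration the CPU-usage gap versus raw Metal persists in my measurements, which is why I filed the feedback.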
Post not yet marked as solved
4 Replies
Can someone from Apple please respond to this?
Post not yet marked as solved
2 Replies
How do I attach ambientViewingEnvironment to a new pixel buffer as an attachment? I tried many things but it fails. Update: It seems the attachment needs to be copied as NSData, for example:

CVBufferSetAttachment(pixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, ambientViewingEnvironmentData! as NSData, .shouldPropagate)
Post not yet marked as solved
27 Replies
Getting the same error on Apple Watch SE and iPhone 14 Pro.
Post not yet marked as solved
1 Reply
Dear @AVFoundation Engineers, any input on this? Still waiting to hear.
Post not yet marked as solved
1 Reply
This again turns out to be another bug in iOS 16.0, where sample buffers received via AVCaptureAudioDataOutput continue to carry a stale/incorrect formatDescription after an external mic is plugged in.
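The workaround I am experimenting with is to stop trusting any cached format and instead compare the format description on every incoming buffer, reconfiguring downstream consumers when it changes. A rough sketch (the class and the comparison, limited to sample rate and channel count, are mine):

```swift
import AVFoundation

// Sketch of a stale-format workaround: detect an audio format change by
// inspecting every sample buffer instead of caching the format once.
final class AudioFormatWatcher {
    private var currentASBD: AudioStreamBasicDescription?

    // Returns true when the incoming buffer's format differs from the last seen one
    // (or when this is the first buffer observed).
    func formatDidChange(in sampleBuffer: CMSampleBuffer) -> Bool {
        guard let desc = CMSampleBufferGetFormatDescription(sampleBuffer),
              let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(desc)?.pointee
        else { return false }
        defer { currentASBD = asbd }
        guard let cached = currentASBD else { return true }
        return cached.mSampleRate != asbd.mSampleRate ||
               cached.mChannelsPerFrame != asbd.mChannelsPerFrame
    }
}
```

Whether the formatDescription itself is trustworthy mid-route-change on iOS 16.0 is exactly the open question, so treat this as mitigation, not a fix.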
Post not yet marked as solved
4 Replies
@UIKit Engineers We desperately need your help here. We have to use a lot of workarounds (which are not good) to cover up this iOS 16 bug. For now I have introduced a delay of 0.5 seconds before invoking autorotation, which works on iPhone 13 Pro at the very least. However, this adds extra latency at app startup and is not guaranteed to work on every iOS device (particularly slower ones like the iPhone X or older). Please help us.

func autoRotateNotification() {
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
        /*
         * HELP: This code does something only when debugging directly from Xcode,
         * not when launching the app directly on the device!
         */
        self.disableAutoRotation = false
        if #available(iOS 16.0, *) {
            UIView.performWithoutAnimation {
                self.setNeedsUpdateOfSupportedInterfaceOrientations()
            }
        } else {
            // Fallback on earlier versions
            UIViewController.attemptRotationToDeviceOrientation()
        }
    }
}