That would require modifying hundreds of CI kernel methods, which is not a practical workaround.
Dear Apple Engineers/DTS,
Is there any answer to this question?
I am facing the same issue and have no idea how to fix it.
Still waiting for an answer here.
Here is the link to download the test project from GitHub to reproduce the issue. Feel free to point out what is wrong and needs to be fixed.
@galad87 The plist itself is of type CFData in this case. Nothing works, as I said, including the second code sample I posted in the question where I simply copy and paste the attachment from the source sample buffer. Feel free to modify and post the sample code you think should work.
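For context, here is roughly what that copy-and-paste attempt looks like. This is a sketch only; propagateAttachments and the buffer names are placeholders, not the exact code from the question:
import CoreMedia
import CoreVideo

// Propagate every .shouldPropagate attachment from the source sample
// buffer's image buffer onto a freshly rendered pixel buffer.
func propagateAttachments(from sampleBuffer: CMSampleBuffer, to renderedPixelBuffer: CVPixelBuffer) {
    guard let sourcePixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    if let attachments = CVBufferCopyAttachments(sourcePixelBuffer, .shouldPropagate) {
        CVBufferSetAttachments(renderedPixelBuffer, attachments, .shouldPropagate)
    }
}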
@Jason It seems AVAssetWriter is crashing with error -12743 when appending NSData for kCVImageBufferAmbientViewingEnvironmentKey. Please describe exactly how to copy this attachment to the new pixel buffer. Here is my code:
// Read the ambient viewing environment extension from the source sample
// buffer's format description and bridge its property list to NSData.
var ambientViewingEnvironment: CMFormatDescription.Extensions.Value?
var ambientViewingEnvironmentData: NSData?
ambientViewingEnvironment = sampleBuffer.formatDescription?.extensions[.ambientViewingEnvironment]
let plist = ambientViewingEnvironment?.propertyListRepresentation
ambientViewingEnvironmentData = plist as? NSData
And then attaching this data:
CVBufferSetAttachment(renderedPixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, ambientViewingEnvironmentData! as CFData, .shouldPropagate)
Dear Core Image Engineering Team,
I wasted a DTS credit to get this answered, and all I was advised to do was debug using CI_PRINT_TREE and look for "optimisations" where none are possible (the code I submitted with the bug report contained a minimal CIImage rendering pipeline). The DTS engineer acknowledged the significant difference in CPU usage between the Metal and Core Image pipelines but was unable to offer anything beyond recommending that I file a bug report. I have already filed FB12619176 and am looking to hear directly from the Core Image engineering team, because high CPU usage rules out Core Image for video pipelines. It might be acceptable for photo editing, but not for live video or video editing tasks.
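For reference, the pipeline in the sample project is essentially as minimal as the sketch below; the filter and parameter here are illustrative stand-ins, not the exact chain from FB12619176:
import CoreImage
import CoreVideo
import Metal

let metalDevice = MTLCreateSystemDefaultDevice()!
let ciContext = CIContext(mtlDevice: metalDevice, options: [.cacheIntermediates: false])

// One CIFilter per frame, rendered straight back into a pixel buffer
// through a Metal-backed CIContext.
func render(_ source: CVPixelBuffer, to destination: CVPixelBuffer) {
    let image = CIImage(cvPixelBuffer: source)
        .applyingFilter("CIColorControls", parameters: [kCIInputSaturationKey: 1.2])
    ciContext.render(image, to: destination)
}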
I tried setting activeColorSpace to .appleLog whenever device.activeFormat.supportedColorSpaces contains .appleLog, but somehow it doesn't work.
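A minimal sketch of what I am doing, assuming `session` is the running AVCaptureSession and `device` its video AVCaptureDevice; the automaticallyConfiguresCaptureDeviceForWideColor line is my assumption about what might be overriding the color space, not confirmed behavior:
import AVFoundation

func enableAppleLogIfSupported(session: AVCaptureSession, device: AVCaptureDevice) throws {
    // Assumption: wide-color auto-configuration may reset the color space,
    // so turn it off before picking .appleLog manually.
    session.automaticallyConfiguresCaptureDeviceForWideColor = false
    guard device.activeFormat.supportedColorSpaces.contains(.appleLog) else { return }
    try device.lockForConfiguration()
    device.activeColorSpace = .appleLog
    device.unlockForConfiguration()
}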
I adopted Core Image instead of Metal because the Core Image team recommended it in various WWDC videos such as wwdc2020-10008, only to realize that my UI is getting slower due to very high CPU usage by Core Image (as much as 3x-4x compared to raw Metal). It seems I need to rewrite the entire filter chain in plain Metal unless there is a workaround for the high CPU usage. I have submitted a feedback request to DTS with fully reproducible sample code.
Really looking forward to answers from the Core Image team here.
Can someone from Apple please respond to this?
Same issue with Xcode 14.3.1 and CocoaPods 1.12.2.
How do I attach ambientViewingEnvironment to a new pixel buffer as an attachment? I tried many things, but it fails.
Update: It seems the attachment needs to be copied as NSData, for example:
CVBufferSetAttachment(pixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, ambientViewingEnvironmentData! as NSData, .shouldPropagate)
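Putting both halves together, an end-to-end sketch under the same assumption, where sampleBuffer is the source and pixelBuffer the destination:
if let value = sampleBuffer.formatDescription?.extensions[.ambientViewingEnvironment],
   let data = value.propertyListRepresentation as? NSData {
    // Attach the HDR metadata so it propagates through to AVAssetWriter.
    CVBufferSetAttachment(pixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, data, .shouldPropagate)
}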
Getting the same error on Apple Watch SE and iPhone 14 Pro.
Dear @AVFoundation Engineers,
Any input on this? Still waiting to hear back.