@DTS Engineer I spent some time and enabled complete concurrency checking in the compiler options. I was able to fix two main sources of warnings (which are errors in Swift 6) by changing the definitions of two functions as follows:
func capturePhoto(with features: PhotoFeatures, isolation: isolated (any Actor)? = #isolation) async throws -> Photo
func stopRecording(_ isolation: isolated (any Actor)? = #isolation) async throws -> Movie
Effectively, I forced these functions to be called from the isolation domain of an actor (actor CaptureService in the source code). I believe this should be okay, as this is the only place where these functions are invoked.
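For readers unfamiliar with the pattern, here is a minimal, self-contained sketch of the inheritance-of-isolation technique (SE-0420). CaptureService's internals and the method names here are illustrative stand-ins, not the project's actual code:

```swift
import Foundation

actor CaptureService {
    private var photoCount = 0

    // Called from inside the actor, so capturePhoto inherits
    // CaptureService's isolation and no cross-domain send occurs.
    func takePhoto() async throws -> Int {
        _ = try await capturePhoto()
        photoCount += 1
        return photoCount
    }

    // The defaulted #isolation parameter makes this function run in the
    // caller's isolation domain instead of hopping to a global executor.
    private func capturePhoto(
        isolation: isolated (any Actor)? = #isolation
    ) async throws -> String {
        "photo"
    }
}
```

Because the isolated parameter defaults to `#isolation`, call sites don't change; the function simply stops crossing an isolation boundary, which is what silences the `sending` diagnostics.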
However, there is one place in the code where I still need help:
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
    switch keyPath {
    case systemPreferredKeyPath:
        // Update the observer's system-preferred camera value.
        let newDevice = change?[.newKey] as? AVCaptureDevice
        continuation?.yield(newDevice)
    default:
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
    }
}
The line continuation?.yield(newDevice) produces the warning: "Task-isolated 'newDevice' is passed as a 'sending' parameter; Uses in callee may race with later task-isolated uses."
Is this because AVCaptureDevice is not Sendable?
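Yes: AVCaptureDevice does not conform to Sendable, and AsyncStream.Continuation.yield takes its element as `sending`, so the compiler can't prove the device won't be touched again in the observer's isolation after it is handed off. One commonly used (but use-at-your-own-risk) workaround is to yield the device inside an `@unchecked Sendable` box. The sketch below uses a placeholder FakeDevice class, since an AVCaptureDevice can't be constructed in a standalone snippet:

```swift
import Foundation

// @unchecked Sendable: safe only if the wrapped reference is never
// mutated concurrently. Capture devices are framework-managed objects,
// which is why this pattern is sometimes accepted, with caution.
struct UncheckedSendableBox<T>: @unchecked Sendable {
    let value: T
}

// Placeholder standing in for AVCaptureDevice.
final class FakeDevice {
    let uniqueID = "cam-0"
}

let (stream, continuation) = AsyncStream.makeStream(of: UncheckedSendableBox<FakeDevice>?.self)

// In observeValue, yield the boxed device instead of the raw reference.
let newDevice = FakeDevice()
continuation.yield(UncheckedSendableBox(value: newDevice))
continuation.finish()
```

The consumer then unwraps `box.value` on its own side. This only shifts the safety proof from the compiler to you, so it is appropriate only because the KVO handler does not use newDevice after yielding it.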
That requires modifying hundreds of CI kernel methods, which is not a good workaround.
Dear Apple Engineers/DTS,
Is there any answer to this question?
I am facing the same issue, no idea how to fix it.
Still waiting for an answer here.
Here is the link to download the test project from GitHub to reproduce the issue. Feel free to point out what is wrong and needs to be fixed.
@galad87 The plist itself is of type CFData in this case. Nothing works, as I said, including the second code sample I posted in the question where I simply copy and paste the attachment from the source sample buffer. Feel free to modify and post the sample code you think should work.
@Jason It seems AVAssetWriter is crashing with error -12743 when appending NSData for kCVImageBufferAmbientViewingEnvironmentKey. Please describe how exactly to copy this attachment to the new pixel buffer. Here is my code:
var ambientViewingEnvironment: CMFormatDescription.Extensions.Value?
var ambientViewingEnvironmentData: NSData?

ambientViewingEnvironment = sampleBuffer.formatDescription?.extensions[.ambientViewingEnvironment]
let plist = ambientViewingEnvironment?.propertyListRepresentation
ambientViewingEnvironmentData = plist as? NSData
And then attaching this data:
CVBufferSetAttachment(renderedPixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, ambientViewingEnvironmentData! as CFData, .shouldPropagate)
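A possibly simpler route, sketched below under the assumption that both buffers are plain CVPixelBuffers, is to copy the attachment value directly with CVBufferCopyAttachment, skipping the propertyListRepresentation round trip entirely (the buffer names are placeholders):

```swift
import CoreVideo
import Foundation

// Copy the ambient viewing environment attachment verbatim from the
// source buffer to the rendered one, preserving its original CF type.
func copyAmbientViewingEnvironment(from sourceBuffer: CVPixelBuffer,
                                   to renderedBuffer: CVPixelBuffer) {
    guard let value = CVBufferCopyAttachment(
        sourceBuffer,
        kCVImageBufferAmbientViewingEnvironmentKey,
        nil
    ) else { return }
    CVBufferSetAttachment(renderedBuffer,
                          kCVImageBufferAmbientViewingEnvironmentKey,
                          value,
                          .shouldPropagate)
}
```

Since the value never leaves the CF world, whatever type the capture pipeline originally attached (CFData here) is what the writer sees, which removes one source of type-mismatch errors.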
Dear Core Image Engineering Team,
I wasted a DTS credit getting this answered, and all I was advised to do was debug using CI_PRINT_TREE and look for "optimisations" where none are possible (the code I submitted with the bug report contained a minimal CIImage rendering pipeline). The DTS engineer acknowledged the significant difference in CPU usage between the Metal and Core Image pipelines but was unable to offer anything except recommending filing a bug report. I have already filed FB12619176 and am looking to hear directly from the Core Image engineering team, because high CPU usage is a no-no for using Core Image in video pipelines. It might be okay for photo editing, but not for live video or video editing tasks.
I tried setting activeColorSpace to .appleLog whenever device.activeFormat.supportedColorSpaces contains .appleLog. But somehow it doesn't work.
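For reference, a minimal sketch of the configuration sequence I would expect, assuming (not confirmed by this thread) that the session's automaticallyConfiguresCaptureDeviceForWideColor setting is the culprit; when it is left true, the session can silently override activeColorSpace after you set it:

```swift
import AVFoundation

// Sketch only: activeColorSpace changes require lockForConfiguration,
// and automaticallyConfiguresCaptureDeviceForWideColor should be false,
// or the session may reset the color space on its own.
func enableAppleLog(on device: AVCaptureDevice,
                    in session: AVCaptureSession) throws {
    session.automaticallyConfiguresCaptureDeviceForWideColor = false
    guard device.activeFormat.supportedColorSpaces.contains(.appleLog) else {
        return
    }
    try device.lockForConfiguration()
    device.activeColorSpace = .appleLog
    device.unlockForConfiguration()
}
```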
I adopted Core Image instead of Metal because it was recommended by the Core Image team in various WWDC videos, such as wwdc2020-10008, only to realize that my UI is getting slower due to very high CPU usage by Core Image (as much as 3x-4x compared to raw Metal). It seems I need to rewrite the entire filter chain in plain Metal unless there is a workaround for the high CPU usage. I have submitted a feedback request to DTS with fully reproducible sample code.
Really looking forward to answers by Core Image team here.
Can someone from Apple please respond to this?
Same issue with Xcode 14.3.1 and CocoaPods 1.12.2.
How do I attach ambientViewingEnvironment to the new pixel buffer as an attachment? I tried many things but it fails.
Update: It seems the attachment needs to be copied as NSData, for example:
CVBufferSetAttachment(pixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, ambientViewingEnvironmentData! as NSData, .shouldPropagate)
Getting the same error on Apple Watch SE and iPhone 14 Pro.