Correction: in the `outputImage` code, it should be:

return CIMetalRedColorKernel.kernel.apply(extent: dod, roiCallback: { index, rect in
    return rect
}, arguments: [inputImage])
@MediaEngineer. Filed bug FB9877123 with sample code containing all the information needed to reproduce, including a demo video. As you can see in the sample code, the problem is not reproducible if you comment out the following line:
playerItem.seekingWaitsForVideoCompositionRendering = true
Please suggest an appropriate workaround.
This has become rampant on iOS 15 devices as well, particularly the iPhone X. Not sure what the fix is!
Did you manage to figure this out? I also need to know what kind of spline function CIToneCurve uses.
FB9709363 filed!
Well, I meant editing a video rather than recording one (in which case you get recommended settings from AVCaptureVideoDataOutput for writing with AVAssetWriter). If you want to edit a video using AVMutableComposition and export it using an AVAssetReader/AVAssetWriter combination, then I am not sure what should go in the compression dictionary (AVVideoCompressionPropertiesKey) for ProRes video.
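For what it's worth, here is a minimal sketch of what I would try for the writer input, assuming ProRes 422 and that no compression dictionary is needed at all (ProRes is roughly constant-quality, so there may be no bitrate to set; that is my assumption, not confirmed guidance, and the width/height values are placeholders for the composition's render size):

```swift
import AVFoundation

// Sketch: AVAssetWriter input for ProRes 422 output.
let outputSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.proRes422,
    AVVideoWidthKey: 1920,   // placeholder: use the composition's render size
    AVVideoHeightKey: 1080
    // No AVVideoCompressionPropertiesKey: as far as I can tell ProRes has no
    // bitrate knob here, and passing an irrelevant dictionary may cause an error.
]
let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: outputSettings)
```

It would still be good to hear from Apple whether any compression properties are valid for ProRes at all.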
@gchiste I already tried the following code, but the compiler gives an error saying it cannot observe a class property.
AVCaptureDevice.observe(\.isCenterStageEnabled, options: [.new, .old]) { controller, value in
    NSLog("Value \(value)")
}
Instance member 'observe' cannot be used on type 'AVCaptureDevice'; did you mean to use a value of this type instead?
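The error appears because `isCenterStageEnabled` is a class property, and Swift's `observe(_:options:changeHandler:)` only works on instances. An untested idea, since Objective-C class objects are themselves objects, is to fall back to classic string-based KVO aimed at the metatype (the "centerStageEnabled" key path is my guess at the Objective-C name; verify before relying on this):

```swift
import AVFoundation

// Untested sketch: string-based KVO on the AVCaptureDevice class object,
// since the Swift key-path observe API cannot target a class property.
final class CenterStageObserver: NSObject {
    func start() {
        (AVCaptureDevice.self as AnyObject).addObserver(
            self,
            forKeyPath: "centerStageEnabled", // assumed ObjC key path
            options: [.old, .new],
            context: nil)
    }

    override func observeValue(forKeyPath keyPath: String?,
                               of object: Any?,
                               change: [NSKeyValueChangeKey: Any]?,
                               context: UnsafeMutableRawPointer?) {
        NSLog("Center Stage enabled changed: \(String(describing: change?[.newKey]))")
    }
}
```

If that key path is wrong or class-level KVO is unsupported here, an official answer on how to observe this property would be appreciated.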
Yes, it would be very helpful to know what caused the error. It took me hours to figure out that the number of audio channels passed in the audio compression settings was wrong.
Ok, it works. The issue was that I was modifying an info.plist I had opened by browsing the source tree. But when I selected the target in Xcode and opened the info.plist via the 'Info' tab of the target, that info.plist was a different file. No idea how that happened. How to identify the real info.plist actually in use by Xcode may be a separate question altogether.
The size of the image is 800x800, but that's not the point, I guess. The main goal is understanding how the extent is influenced by a transform and how exactly the Core Image coordinate system works. Why is the origin of the new image formed by rotation negative and not (0, 0)? Without this understanding, it is impossible to develop anything serious and complex.
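To make the question concrete (my own numbers, not from any documentation): rotation happens about the origin of the working coordinate space, so the axis-aligned bounding box of the rotated corners generally has a negative origin. Rotating the corners of an 800x800 rect by 45° shows this:

```swift
import Foundation

// Rotate the four corners of an 800x800 rect by 45 degrees about the origin
// and compute the axis-aligned bounding box, mimicking what the extent
// reports after applying a rotation transform.
let theta = Double.pi / 4
let corners: [(Double, Double)] = [(0, 0), (800, 0), (800, 800), (0, 800)]
let rotated = corners.map { (x, y) -> (Double, Double) in
    (x * cos(theta) - y * sin(theta), x * sin(theta) + y * cos(theta))
}
let minX = rotated.map { $0.0 }.min()!
let minY = rotated.map { $0.1 }.min()!
let maxX = rotated.map { $0.0 }.max()!
let maxY = rotated.map { $0.1 }.max()!
print("origin: (\(minX), \(minY)), size: \(maxX - minX) x \(maxY - minY)")
```

The corner (0, 800) lands at roughly (-565.7, 565.7), so the bounding box origin is about (-565.7, 0) rather than (0, 0). That matches the negative extent origin I am seeing, but I would still like confirmation of how Core Image defines this.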
Looks like this bug still persists in iOS 15 as well. I see it when I have three videos with a transition between the first two. The frame returned on seek is incorrect, and the only way to get the player to display the current frame is by calling AVPlayer.play(). I wonder how such basic bugs persist even 10 years after the AVFoundation framework's release!
Dear Apple Engineers,
Can you please shed more light on why sourceFrame(byTrackID:) sometimes returns nil while seeking with the AVPlayer seek tolerance set to .zero? Is it a bug? I built my project using Xcode 13 beta 5 but am running it on an iOS 14 device.
I have this issue as well.
Here is sample code that reproduces the bug. It works flawlessly on Xcode 12 but not on Xcode 13 beta:
https://github.com/JoshuaSullivan/MetalKernel
I already filed bug FB9393169, and its status has been updated to "Resolution: Potential fix identified - In iOS 15". Not sure what this means. If it will be fixed in the main iOS 15 release, then I don't need to do anything. But if it will not be fixed in iOS 15.0, I will need to update the app immediately and disable stereo capture, or I risk a lot of 1-star reviews.