I don't exactly know why you are getting all zeros here, but there is a much simpler way to access the data using CIContext.render(_:toBitmap:...).
Please check out the implementation - https://github.com/DigitalMasterpieces/CoreImageExtensions/blob/main/Sources/CIContext%2BValueAccess.swift in this small helper package - https://github.com/DigitalMasterpieces/CoreImageExtensions that I wrote; it contains useful Core Image extensions.
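For reference, here is a rough sketch of reading a single pixel value that way (assuming image is your CIImage; position, format, and color space are just example values):

import CoreImage

let context = CIContext()
// Read one RGBA8 pixel at the given position into a small buffer.
var pixel = [UInt8](repeating: 0, count: 4)
context.render(image,
               toBitmap: &pixel,
               rowBytes: 4,
               bounds: CGRect(x: 10, y: 20, width: 1, height: 1),
               format: .RGBA8,
               colorSpace: CGColorSpaceCreateDeviceRGB())
// pixel now contains the R, G, B, and A values (0...255) of the pixel at (10, 20).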
Just wanted to cross-reference StackOverflow - https://stackoverflow.com/questions/66914332/in-swift-filter-ciareaminmax-provide-incorrect-output here.
In your kernel, colors are usually normalized to [0.0 ... 1.0], based on the underlying color space. So even if values are stored as 10-bit integers in a texture, your shader will get them as normalized floats.
I emphasized the color space above because it is used when translating the colors from the source into those normalized values. When you are using the default sRGB color space, the wide gamut from the HDR source doesn't fit into the sRGB [0.0 ... 1.0] range. That's why you may get values outside that range in your kernel. This is actually useful in most cases because most filter operations that are designed for sRGB still work then. The color invert example above, however, does not.
You have two options here that I know of:
You can change the workingColorSpace of the CIContext you are using to the HDR color space of the input: let ciContext = CIContext(options: [.workingColorSpace: CGColorSpace(name: CGColorSpace.itur_2020)!])
Then all color values should fall within [0.0 ... 1.0] in your kernel, where 0.0 is the darkest HDR color value and 1.0 is the brightest. You can safely perform the inversion with 1.0 - x then. However, keep in mind that some other filters will then not produce the correct result because they assume the input to be in (linear) sRGB, Core Image's default.
The second option is to convert ("color match") the input into the correct color space before passing it into your kernel and to convert it back to working space again before returning: return kernelOutput.matchedToWorkingSpace(from: colorSpace). See the sketch below.
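A minimal sketch of that second option (the kernel, input image, and color space names are placeholders, assuming a CIColorKernel that performs the inversion):

import CoreImage

func invertHDR(_ input: CIImage, colorSpace: CGColorSpace, kernel: CIColorKernel) -> CIImage? {
    // Match from Core Image's working space into the image's own (HDR) color space...
    let matchedInput = input.matchedFromWorkingSpace(to: colorSpace) ?? input
    // ...run the kernel on the values normalized in that space...
    guard let kernelOutput = kernel.apply(extent: matchedInput.extent,
                                          arguments: [matchedInput]) else { return nil }
    // ...and match the result back to working space before returning it.
    return kernelOutput.matchedToWorkingSpace(from: colorSpace)
}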
@IanOllmann There seems to be a bug in the AppleEXR encoder, causing a BAD_ACCESS when encoding images with height 16.
Could you please have a look? (FB9080694)
Thanks!
I'm not sure if you are able to access sub-images using NSImage. However, you should be able to do so with a CGImageSource:
import AppKit
import ImageIO

if let source = CGImageSourceCreateWithURL(newURL as CFURL, nil) {
    let numSubImages = CGImageSourceGetCount(source)
    for i in 0..<numSubImages {
        if let subImage = CGImageSourceCreateImageAtIndex(source, i, nil) {
            // subImage is a CGImage; you can convert it to an NSImage if you prefer:
            let nsImage = NSImage(cgImage: subImage, size: NSZeroSize)
            // handle image...
        }
    }
}
Have you tried setting the magnificationFilter of the view's layer to nearest?
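For example (assuming imageView is the view displaying the upscaled content):

imageView.layer.magnificationFilter = .nearest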
Do built-in filters like CIGaussianBlur still work?
I encountered a very similar bug and I found a workaround. Can you try context.heifRepresentation(of: image.settingAlphaOne(in: image.extent), …) and see if that works?
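For reference, a sketch of how the full call could look with that workaround (format and color space are just example values):

let heifData = context.heifRepresentation(of: image.settingAlphaOne(in: image.extent),
                                          format: .RGBA8,
                                          colorSpace: CGColorSpaceCreateDeviceRGB(),
                                          options: [:])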
Found it:
I'm getting the attachments (metadata) of the incoming sample buffers and accidentally leaked the dictionary due to wrong bridging.
So instead of this
NSDictionary* attachments = (__bridge NSDictionary* _Nullable)CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentModeShouldPropagate);
I should have done this
NSDictionary* attachments = (__bridge_transfer NSDictionary* _Nullable)CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentModeShouldPropagate);
Interestingly, this leak caused the capture session to stop delivering new sample buffers after 126 frames—without any warning, error, or notification.
A CIFilter is usually just a very small wrapper around a CIKernel, which is basically a bit of (intermediate) code that can be run on the GPU. When you apply a filter (chain), Core Image will optimize, potentially concatenate, and finally compile the code and send it to the GPU. The Core Image runtime will probably perform some caching of those kernels, but I'm not sure.
To summarize, CIFilters should have a very small memory footprint and you probably benefit more from caching when you re-use them instead of instantiating them on-the-fly.
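For example, a sketch of re-using a single filter instance instead of re-creating it per frame (assuming a Gaussian blur; names are placeholders):

import CoreImage
import CoreImage.CIFilterBuiltins

// Create the filter once and keep it around...
let blurFilter = CIFilter.gaussianBlur()

func blurred(_ image: CIImage, radius: Float) -> CIImage? {
    // ...and only update its inputs for each new image.
    blurFilter.inputImage = image
    blurFilter.radius = radius
    return blurFilter.outputImage
}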
I'm afraid the front camera doesn't support capturing in RAW.
Interesting, I've never seen this type. It seems to be an old name; you should just use PKDrawing instead now.
Could you please post some code? How do you set up the CGContext?
Maybe also consider using the (newish) UIGraphicsImageRenderer API instead of setting up CGContext manually. This should be more reliable with consistent results across different devices.
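A minimal sketch of that approach (size and drawing commands are just placeholders):

import UIKit

let renderer = UIGraphicsImageRenderer(size: CGSize(width: 200, height: 200))
let image = renderer.image { context in
    // The renderer sets up and manages the underlying CGContext for you.
    UIColor.red.setFill()
    context.fill(CGRect(x: 0, y: 0, width: 200, height: 200))
}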
(And please also tag [Core Graphics] in your question instead of [Core Image].)
You are right. I decided to use the newest SDKs and SwiftUI in order to learn how to best integrate Core Image workflows with them.
And yes, it should work on a Mac, but that needs to run Big Sur and I haven't tested it yet. I tested on my iPad with iOS 14. Will check macOS soon.
However, all the relevant APIs (especially MTKView) have been there for a while and should work the same way in older versions and in Objective-C. The important part is the setup of the MTKView and the draw method. If you follow this path, you should be good:
AVCaptureDeviceInput → AVCaptureVideoDataOutput → AVCaptureVideoDataOutputSampleBufferDelegate → CVPixelBuffer → CIImage → applying CIFilters → CIImage → render into MTKView using a CIContext.
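The rendering step at the end could look roughly like this (a sketch; filteredImage, ciContext, and commandQueue are placeholders for objects you set up elsewhere):

import CoreImage
import MetalKit

func draw(in view: MTKView) {
    guard let image = filteredImage,                      // output of your CIFilter chain
          let drawable = view.currentDrawable,
          let commandBuffer = commandQueue.makeCommandBuffer() else { return }

    // Render the CIImage into the view's drawable texture.
    ciContext.render(image,
                     to: drawable.texture,
                     commandBuffer: commandBuffer,
                     bounds: CGRect(origin: .zero, size: view.drawableSize),
                     colorSpace: CGColorSpaceCreateDeviceRGB())

    commandBuffer.present(drawable)
    commandBuffer.commit()
}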
Looking forward to your report! 🙂
Hi Kent,
The filters you can add to a CALayer will apply to the layer's content, so they can't be used to display your own content (your CIImage) on the layer.
Instead, you could convert your CIImage into a UIImage and display that in a UIImageView. But for a video stream this is not the best approach, since UIImageView is not made for handling that many frames per second.
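For occasional still frames, that conversion could look like this (a sketch, assuming a shared CIContext):

import CoreImage
import UIKit

let ciContext = CIContext()

func uiImage(from ciImage: CIImage) -> UIImage? {
    // Render the CIImage into a CGImage first, then wrap it in a UIImage.
    guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}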
The most performant way is to render the CIImage into a MTKView (or a CAMetalLayer) using a CIContext. I created a project on GitHub - https://github.com/frankschlegel/core-image-by-example to show how this can be done.
Hope this helps!