I think the whole RAW development is now handled by the new CIRAWFilter.
It doesn't have an activeKeys property anymore. Instead, you can check for each property individually whether it's supported, for instance with isContrastSupported or isMoireReductionSupported.
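For illustration, a minimal sketch of that pattern (assuming a hypothetical `rawFileURL` pointing to a RAW file; the property names are from the CIRAWFilter API):

```swift
import CoreImage

// Create the RAW filter from a file URL (hypothetical URL).
if let rawFilter = CIRAWFilter(imageURL: rawFileURL) {
    // Only set a value if the current RAW file/pipeline supports it.
    if rawFilter.isContrastSupported {
        rawFilter.contrastAmount = 0.5
    }
    if rawFilter.isMoireReductionSupported {
        rawFilter.moireReductionAmount = 0.3
    }
    let developedImage = rawFilter.outputImage
}
```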
I also filed a corresponding Feedback (FB9949181). Thanks for looking into it!
You can use the QR code as a mask and blend it with a solid color to effectively colorize it. So you can, for example, create a black and a white version and use the register(...) method of the image's UIImageAsset to "bundle" them both into one dynamic image:
```swift
let qrCode = filter.outputImage!.transformed(by: transform)

// Use the QR code as a mask for blending with a color.
// Note that we need to invert the code for that, so the actual code becomes white
// and the background becomes black, because white = let color through, black = transparent.
let maskFilter = CIFilter.blendWithMask()
maskFilter.maskImage = qrCode.applyingFilter("CIColorInvert")

// Create a version of the code with a black foreground...
// (The color images are cropped to the code's extent, since CIImage(color:)
// has infinite extent and couldn't be rendered otherwise.)
maskFilter.inputImage = CIImage(color: .black).cropped(to: qrCode.extent)
let blackCIImage = maskFilter.outputImage!
// ... and one with a white foreground.
maskFilter.inputImage = CIImage(color: .white).cropped(to: qrCode.extent)
let whiteCIImage = maskFilter.outputImage!

// Render both images.
let blackImage = context.createCGImage(blackCIImage, from: blackCIImage.extent).map(UIImage.init)!
let whiteImage = context.createCGImage(whiteCIImage, from: whiteCIImage.extent).map(UIImage.init)!

// Use the black version in light mode...
let qrImage = blackImage
// ... and register the white version to be used in dark mode.
qrImage.imageAsset?.register(whiteImage, with: UITraitCollection(userInterfaceStyle: .dark))
return qrImage
```
I recommend using Core Image to read, write, and modify EXR images.
Core Image can already natively open PNG and EXR files and write PNG data and files. However, there is no convenient way for writing EXR data or files.
That's why we wrote extensions for doing exactly that. You can find them over at GitHub.
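To illustrate the built-in part, here is a sketch of reading an EXR file and writing it back out as PNG using CIContext (the file paths are hypothetical placeholders):

```swift
import CoreImage

// Reading an EXR file works out of the box (hypothetical input path).
let exrImage = CIImage(contentsOf: URL(fileURLWithPath: "/path/to/input.exr"))!

// Writing PNG files is built into CIContext (hypothetical output path).
let context = CIContext()
try context.writePNGRepresentation(
    of: exrImage,
    to: URL(fileURLWithPath: "/path/to/output.png"),
    format: .RGBA8,
    colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!
)
```

The missing piece, as mentioned above, is the EXR-writing counterpart, which is what the linked extensions provide.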
Hmm, your code seems ok so far. I guess the issue is caused by something else... Are you sure the texture you create here is released properly? Also, can you please try passing nil as commandBuffer in [context render:...]?
You can simply initialize a CIImage with the texture and pass that to the kernel:
```swift
// The initializer is failable, so the result needs to be unwrapped.
let ciImage = CIImage(mtlTexture: texture)!
```
The documentation also mentions what you need to do to let Core Image render into a Metal texture.
If you want to incorporate a Metal processing step into a Core Image pipeline instead, I recommend you check out CIImageProcessorKernel.
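For the rendering direction, a minimal sketch of rendering a CIImage into an existing Metal texture (assuming `ciImage` and `texture` already exist; the texture needs the `.shaderWrite` usage flag):

```swift
import CoreImage
import Metal

// Create a CIContext backed by the Metal device.
let device = MTLCreateSystemDefaultDevice()!
let context = CIContext(mtlDevice: device)

// Render the image into the texture. Passing nil as commandBuffer
// lets Core Image manage its own command buffer internally.
context.render(ciImage,
               to: texture,
               commandBuffer: nil,
               bounds: ciImage.extent,
               colorSpace: CGColorSpaceCreateDeviceRGB())
```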
Does it still exist after you released the CIContext you used for rendering? The context usually caches some intermediate images and objects to speed up consecutive render calls.
This is also filed as FB9706029. Thanks for looking into it!
It should work when you remove the -I $MTL_HEADER_SEARCH_PATHS part from your custom build rule.
Though it is mentioned in the WWDC video, it actually causes problems when MTL_HEADER_SEARCH_PATHS is empty. See this answer to a related question.
Usually, you don't need that parameter if you don't have a complicated file graph or external dependencies.
Hmm, interesting. I wonder if the filter is not designed to be re-used. Can you please try with rawFilter.previewImage instead of outputImage?
Also filed as FB9498614.
I'm sorry, I meant #include. I'm still getting fatal error: 'CoreImage/CoreImage.h' file not found when calling device.makeLibrary(source: ...).
What do you mean by "factory image"?
I'm not totally sure, but I think it's not possible to just add some random data to the end of an image file like this. Photos has its own database for storing the images and I guess they perform some kind of sanitizing/cleaning when adding new entries.
You can add custom data to a PHAsset that is stored alongside the image using PHAdjustmentData, but this is meant for storing "a description of the edits made to an asset's photo, video, or Live Photo content, which allows your app to reconstruct or revert the effects of prior editing sessions." So you would be able to read this data back, but only in an app that understands it. It won't be accessible when you just export the image out of Photos as a JPEG, for instance. And the amount of data you can store this way is also limited.
However, you might be able to store the data in the image's (EXIF) metadata somewhere. That seems like the more appropriate place to me.
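As a sketch of that approach, you could copy the image with ImageIO while adding a custom EXIF user comment (the `inputURL`/`outputURL` and the payload string are hypothetical):

```swift
import ImageIO
import UniformTypeIdentifiers

// Open the original image and create a destination for the copy
// (hypothetical inputURL and outputURL).
let source = CGImageSourceCreateWithURL(inputURL as CFURL, nil)!
let destination = CGImageDestinationCreateWithURL(outputURL as CFURL,
                                                  UTType.jpeg.identifier as CFString,
                                                  1, nil)!

// Attach custom data via the EXIF user comment field.
let properties: [CFString: Any] = [
    kCGImagePropertyExifDictionary: [
        kCGImagePropertyExifUserComment: "my custom data"
    ]
]

// Copy the image data and merge in the new metadata.
CGImageDestinationAddImageFromSource(destination, source, 0, properties as CFDictionary)
CGImageDestinationFinalize(destination)
```

Note that Photos may still strip or rewrite some metadata fields on import, so this would need testing for your specific use case.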
The problem is that NSImage doesn't have the .cgImage accessor that UIImage has. Instead, it has the cgImage(forProposedRect:context:hints:) method, which takes optional parameters. So to get the CGImage from an NSImage, you need to do this instead:
```swift
let cgImage = self.cgImage(forProposedRect: nil, context: nil, hints: nil)
```