Displaying a UIImage from CIImage in a UIImageView

Why does WWDC 2015 video 510 claim that it works to put a UIImage generated by calling UIImage(CIImage:) in a UIImageView? This has _never_ worked, and it still doesn't. The code they use is like this:


imageview.image = UIImage(CIImage: ciimage)


That displays nothing. The only way to display a UIImage based on a CIImage is to _render_ it, _explicitly_. A UIImageView is not going to magically do this for you. For example, in my own code, where the CIImage is called `blendimage`, this works fine:


            UIGraphicsBeginImageContextWithOptions(myextent.size, false, 0)
            UIImage(CIImage: blendimage).drawInRect(myextent)
            let im = UIGraphicsGetImageFromCurrentImageContext()
            UIGraphicsEndImageContext()
            self.iv.image = im


But this does nothing: the image view is blank:


            self.iv.image = UIImage(CIImage: blendimage)


The video surely would not make such an explicit, radical claim if it weren't true. But it isn't true. So why do they make it? Is it _supposed_ to work but failing? Or have I just been missing something all these years that would make that line of code magically work?
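(For reference, the explicit render doesn't have to go through UIGraphics; an equivalent sketch uses a CIContext directly. This assumes the same CIImage called `blendimage` and image view `self.iv` as above, written in current Swift syntax:)

```swift
// Render the CIImage into a CGImage explicitly, then wrap that.
// A CGImage-backed UIImage displays in a UIImageView without surprises.
let context = CIContext()
if let cgImage = context.createCGImage(blendimage, from: blendimage.extent) {
    self.iv.image = UIImage(cgImage: cgImage)
}
```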

Hey Matt. The following displays an image for me (at least on iOS 9.3). Oddly, the image view's content mode doesn't seem to be respected when using a CIImage-backed UIImage.

        let stripesFilter = CIFilter(name: "CIStripesGenerator")!
        stripesFilter.setValue(CIColor(red: 1, green: 0, blue: 0), forKey: "inputColor0")
        stripesFilter.setValue(CIColor(red: 1, green: 1, blue: 0), forKey: "inputColor1")
        stripesFilter.setValue(NSNumber(int: 10), forKey: "inputWidth")
       
        let cropFilter = CIFilter(name: "CICrop")!
        cropFilter.setValue(stripesFilter.outputImage!, forKey: "inputImage")
        cropFilter.setValue(CIVector(x: 0, y: 0, z: 200, w: 200), forKey: "inputRectangle")
       
        let output = cropFilter.outputImage!
       
        let image = UIImage(CIImage: output)
       
        imageView.image = image
This was asked a long time ago, but I ran into it myself, and found that the following does work:

    import CoreImage
    import CoreImage.CIFilterBuiltins

    func getCheckWithColor(color: CIColor) -> UIImage {
        let baseImage = UIImage(named: "CheckWhite.png")!
        let bgImage = CIImage(image: baseImage)
        let colorImage = CIImage(color: color)
        let filter = CIFilter.multiplyCompositing()
        filter.backgroundImage = bgImage
        filter.inputImage = colorImage
        let multiplied = filter.outputImage!
        return UIImage(ciImage: multiplied).resizableImage(withCapInsets: .zero, resizingMode: .stretch)
    }


The above example does the same thing as:

    let baseImage = UIImage(named: "CheckWhite.png")!
    return baseImage.withTintColor(UIColor(red: 0.8, green: 0.4, blue: 0.8, alpha: 0.75), renderingMode: .automatic).resizableImage(withCapInsets: .zero, resizingMode: .stretch)


After which I use it like so:

    let checkImageView = UIImageView(image: finalImage)
    checkImageView.contentMode = .scaleToFill
    checkImageView.frame = checkFrame


(So that was a fairly trivial example, but of course you can do many more operations with CIFilter than just UIImage's withTintColor method. It's lovely that, in 2021, we no longer have to pass strings into CIFilter.)

More generally, whether you need to render the image into a context explicitly (as in the original question) seems to depend on the CIImage's extent (roughly analogous to a CGImage's bounding rect).

If the extent is infinite (CGRect.infinite), you cannot simply wrap the CIImage in a UIImage and display it. That's because a CIImage isn't always an actual bitmap; it's a recipe for producing one. In the example above, colorImage is an infinitely large image of a single color until it's composited against the finite check image.

So, to answer the original poster's question: it can be done, but you need to check the extent property of the CIImage first.
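A sketch of that check, as a hypothetical `display(_:)` helper (the helper name and the fallback extent are my own, not from any answer above):

```swift
import UIKit
import CoreImage

extension UIImageView {
    // Display a CIImage, rendering explicitly through a CIContext.
    // If the extent is infinite (e.g. a generator or solid-color image),
    // crop to a caller-supplied fallback rect, since an infinite image
    // cannot be rasterized as-is.
    func display(_ ciImage: CIImage,
                 fallbackExtent: CGRect = CGRect(x: 0, y: 0, width: 200, height: 200)) {
        let extent = ciImage.extent.isInfinite ? fallbackExtent : ciImage.extent
        let context = CIContext()
        if let cgImage = context.createCGImage(ciImage, from: extent) {
            self.image = UIImage(cgImage: cgImage)
        }
    }
}
```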