AVCapturePhotoOutput image

I've asked a Face Detection question on Stack Overflow.

The face detection function works fine when I read the image from Assets.xcassets. However, the same face detection function fails on input from AVCapturePhotoOutput.


In WWDC 2016 session 501, it is mentioned that the capture pipeline produces a JPEG image and a preview image (a decompressed, downscaled image):


             | -> JPEG -> NAND
    Camera = |
             | -> Decompressed, downscaled image -> Display


Here's my question: what does jpegPhotoDataRepresentation return?

jpegPhotoDataRepresentation(forJPEGSampleBuffer JPEGSampleBuffer: CMSampleBuffer, previewPhotoSampleBuffer: CMSampleBuffer?) -> Data?


When previewPhotoSampleBuffer is nil, it returns Data containing the bits of a file in the JPEG format.

When previewPhotoSampleBuffer is not nil, it returns Data containing the bits of a file in the preview's format.

Am I right?
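One way I could test this hypothesis is to look at the leading bytes of the returned Data: a JPEG file always begins with the magic bytes 0xFF 0xD8. This is just a diagnostic sketch I would run inside the capture delegate callback (the function name and surrounding setup are assumed, not from my actual project):

    import AVFoundation
    import Foundation

    // Diagnostic sketch (hypothetical helper): inspect the first two bytes
    // of the returned Data to see whether it is a JPEG container.
    func inspectRepresentation(jpegSampleBuffer: CMSampleBuffer,
                               previewSampleBuffer: CMSampleBuffer?) {
        guard let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(
            forJPEGSampleBuffer: jpegSampleBuffer,
            previewPhotoSampleBuffer: previewSampleBuffer) else { return }

        // JPEG files start with the SOI marker 0xFF 0xD8.
        let isJPEG = data.prefix(2) == Data([0xFF, 0xD8])
        print("byte count: \(data.count), has JPEG magic bytes: \(isJPEG)")
    }

Running this once with previewPhotoSampleBuffer nil and once with it set should show whether the container actually changes.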


A further question: why does the face detection function fail on the Data returned by jpegPhotoDataRepresentation when previewPhotoSampleBuffer is nil?

    if let sampleBuffer = photoSampleBuffer,
       let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: nil) {
        self.imageView.image = UIImage(data: dataImage)
        self.imageView.isHidden = false
        self.previewLayer?.isHidden = true
        self.findFace(img: self.imageView.image!)
    }
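For context, here is a minimal CIDetector-based sketch of the kind of thing findFace does (the body below is an assumed reconstruction, not my exact code). One thing I noticed while writing it: UIImage(data:) keeps the EXIF orientation in imageOrientation rather than rotating the pixels, so the detector has to be told the orientation via CIDetectorImageOrientation, which is unnecessary for images loaded from Assets.xcassets:

    import UIKit
    import CoreImage

    // Assumed sketch of findFace; maps UIImage orientation to the
    // EXIF orientation value (1-8) that CIDetector expects.
    func findFace(img: UIImage) {
        guard let ciImage = CIImage(image: img) else { return }

        let exifOrientation: Int
        switch img.imageOrientation {
        case .up:            exifOrientation = 1
        case .upMirrored:    exifOrientation = 2
        case .down:          exifOrientation = 3
        case .downMirrored:  exifOrientation = 4
        case .leftMirrored:  exifOrientation = 5
        case .right:         exifOrientation = 6
        case .rightMirrored: exifOrientation = 7
        case .left:          exifOrientation = 8
        @unknown default:    exifOrientation = 1
        }

        let detector = CIDetector(ofType: CIDetectorTypeFace,
                                  context: nil,
                                  options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
        let faces = detector?.features(in: ciImage,
                                       options: [CIDetectorImageOrientation: exifOrientation]) ?? []
        print("found \(faces.count) face(s)")
    }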