Extract face images from a large number of photos in Photo Library using CIDetector

I am trying to iterate over the images in the Photo Library and extract faces using CIDetector. The images need to keep their original resolution. To do so, I take the following steps:

1- Getting assets given a date interval (usually more than a year)

func loadAssets(from fromDate: Date, to toDate: Date, completion: @escaping ([PHAsset]) -> Void) {
    fetchQueue.async {
        let authStatus = PHPhotoLibrary.authorizationStatus()
        if authStatus == .authorized || authStatus == .limited {
            // Fetch image assets created within the interval, newest first
            let options = PHFetchOptions()
            options.predicate = NSPredicate(format: "creationDate >= %@ && creationDate <= %@", fromDate as CVarArg, toDate as CVarArg)
            options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
            let result: PHFetchResult = PHAsset.fetchAssets(with: .image, options: options)
            var _assets = [PHAsset]()
            result.enumerateObjects { object, count, stop in
                _assets.append(object)
            }
            completion(_assets)
        } else {
            completion([])
        }
    }
}

where:

let fetchQueue = DispatchQueue.global(qos: .background)
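
For context, I call it roughly like this (the date values are placeholders, not my exact code):

// Hypothetical call site, only to show how loadAssets(from:to:completion:) is driven.
let toDate = Date()
let fromDate = Calendar.current.date(byAdding: .year, value: -1, to: toDate) ?? toDate

loadAssets(from: fromDate, to: toDate) { assets in
    print("Fetched \(assets.count) assets")
}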

2- Extracting faces

I then extract face images using:

func detectFaces(in image: UIImage, accuracy: String = CIDetectorAccuracyLow, completion: @escaping ([UIImage]) -> Void) {
    faceDetectionQueue.async {
        var faceImages = [UIImage]()
        let outputImageSize: CGFloat = 200.0 / image.scale
        guard let ciImage = CIImage(image: image),
              let faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: [CIDetectorAccuracy: accuracy]) else { completion(faceImages); return }

        let faces = faceDetector.features(in: ciImage) // Crash happens here

        let group = DispatchGroup()

        for face in faces {
            group.enter()
            if let face = face as? CIFaceFeature {
                // Crop the face with a 20% margin and scale the crop to the output size
                let faceBounds = face.bounds
                let offset: CGFloat = floor(min(faceBounds.width, faceBounds.height) * 0.2)
                let inset = UIEdgeInsets(top: -offset, left: -offset, bottom: -offset, right: -offset)
                let rect = faceBounds.inset(by: inset)
                let croppedFaceImage = ciImage.cropped(to: rect)
                let scaledImage = croppedFaceImage
                    .transformed(by: CGAffineTransform(scaleX: outputImageSize / croppedFaceImage.extent.width,
                                                       y: outputImageSize / croppedFaceImage.extent.height))
                faceImages.append(UIImage(ciImage: scaledImage))
                group.leave()
            } else {
                group.leave()
            }
        }

        group.notify(queue: self.faceDetectionQueue) {
            completion(faceImages)
        }
    }
}
    

where:

private let faceDetectionQueue = DispatchQueue(label: "face detection queue",
                                               qos: DispatchQoS.background,
                                               attributes: [],
                                               autoreleaseFrequency: DispatchQueue.AutoreleaseFrequency.workItem,
                                               target: nil)
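
A single call then looks roughly like this (the asset value and the result handling are placeholders, not my exact code):

// Hypothetical call site showing how one image is passed to detectFaces(in:completion:).
let image = asset.image // full-resolution image via the PHAsset extension below
detectFaces(in: image) { faceImages in
    print("Found \(faceImages.count) faces")
}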
   

I use the following extension to get the image from assets:

extension PHAsset {
    var image: UIImage {
        autoreleasepool {
            let manager = PHImageManager.default()
            let options = PHImageRequestOptions()
            var thumbnail = UIImage()

            let rect = CGRect(x: 0, y: 0, width: pixelWidth, height: pixelHeight)
            options.isSynchronous = true
            options.deliveryMode = .highQualityFormat
            options.resizeMode = .exact
            options.normalizedCropRect = rect
            options.isNetworkAccessAllowed = true

            // Synchronous request for the asset at its full pixel size
            manager.requestImage(for: self, targetSize: rect.size, contentMode: .aspectFit, options: options, resultHandler: { (result, info) -> Void in
                if let result = result {
                    thumbnail = result
                } else {
                    thumbnail = UIImage()
                }
            })

            return thumbnail
        }
    }
}
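
Putting the pieces together, the iteration over the library is essentially this (a simplified sketch, not my code verbatim):

// Simplified sketch of the whole pipeline: fetch assets, load each full-size image, detect faces.
loadAssets(from: fromDate, to: toDate) { assets in
    for asset in assets {
        let image = asset.image // synchronous, full-resolution request
        detectFaces(in: image) { faceImages in
            // collect the cropped face images for this asset
        }
    }
}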

The code works fine for a few (usually fewer than 50) assets, but for a larger number of images it crashes at:

let faces = faceDetector.features(in: ciImage) // Crash happens here

I get this error:

validateComputeFunctionArguments:858: failed assertion `Compute Function(ciKernelMain): missing sampler binding at index 0 for [0].'

If I reduce the size of the image fed to detectFaces(in:), e.g. to 400 px, I can analyze a few hundred images (usually fewer than 1,000), but as I mentioned, using the asset's image at its original size is a requirement. My guess is that this is a memory issue that occurs when I try to extract faces with CIDetector. Any idea what this error means and how I can fix the issue?
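
For reference, the downscaling I use for that 400 px test is along these lines (a sketch; the helper is not part of the pipeline above, and UIGraphicsImageRenderer is just one way to do it):

// Test-only downscaling: aspect-fit the image into a 400 x 400 box before face detection.
func downscaled(_ image: UIImage, toMaxDimension maxDimension: CGFloat = 400) -> UIImage {
    let scale = min(maxDimension / image.size.width, maxDimension / image.size.height, 1)
    let newSize = CGSize(width: image.size.width * scale, height: image.size.height * scale)
    let renderer = UIGraphicsImageRenderer(size: newSize)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: newSize))
    }
}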
