I am fetching moments from the local device and looping through the results to append what is found into 2 arrays:
- the 1st array contains the moment "metadata" (all the alphanumeric data types), declared as `var moments = [Moment]()`
- the 2nd array contains an array of `UIImage` of all photos per moment, declared as `var images = [[UIImage]]()`
My current code seems to be producing an extra "moment" in the `images` array, which I can see manually in the Photos app. I just don't understand why, within the same loop, one variable includes this "moment" and the other does not.
---
func gatherLocalMoments(fromDate: Date?) { // TODO: add an options parameter (all, last 2 weeks, user selection, etc.)
    let photoOptions = PHFetchOptions()
    photoOptions.sortDescriptors = [NSSortDescriptor(key: "startDate", ascending: false)]
    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "startDate", ascending: true)]

    if let date = fromDate {
        // A valid date was passed in; only fetch moments that started after it
        photoOptions.predicate = NSPredicate(format: "startDate > %@", date as NSDate)
    }

    let imageHandler = PHImageManager.default()
    // fetchAssetCollections(with:subtype:options:) returns a non-optional
    // PHFetchResult, so no conditional binding is needed
    let allMoments = PHAssetCollection.fetchAssetCollections(with: .moment, subtype: .any, options: photoOptions)

    for i in 0..<allMoments.count {
        // object(at:) also returns a non-optional PHAssetCollection
        let moment = allMoments.object(at: i)

        // GET MOMENT METADATA
        if let lon = moment.approximateLocation?.coordinate.longitude,
           let lat = moment.approximateLocation?.coordinate.latitude {
            let thisMoment = Moment(localizedNames: moment.localizedLocationNames,
                                    startDate: moment.startDate,
                                    longitude: lon,
                                    latitude: lat,
                                    estimatedCount: moment.estimatedAssetCount)
            moments.append(thisMoment)
        }

        // GET MOMENT IMAGES
        let fetchResult = PHAsset.fetchAssets(in: moment, options: nil)
        if fetchResult.count > 0 {
            var momentImages = [UIImage]()
            for j in 0..<fetchResult.count {
                imageHandler.requestImage(for: fetchResult.object(at: j),
                                          targetSize: CGSize(width: 100, height: 100),
                                          contentMode: .aspectFill,
                                          options: nil) { image, info in
                    if let image = image {
                        print("i: \(i), j: \(j)")
                        momentImages.append(image)
                    }
                }
            }
            images.append(momentImages)
        }
    }
}
---
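To see which fetched collection is the odd one out, I also dumped each collection's location names, start date, and `approximateLocation`. This is just a throwaway diagnostic sketch (it assumes photo library access has already been granted and uses the same fetch as above):

```swift
import Photos

// Diagnostic only: print one line per fetched "moment" collection so the
// extra entry can be identified by eye.
func dumpMoments() {
    let options = PHFetchOptions()
    options.sortDescriptors = [NSSortDescriptor(key: "startDate", ascending: false)]
    let collections = PHAssetCollection.fetchAssetCollections(with: .moment,
                                                              subtype: .any,
                                                              options: options)
    collections.enumerateObjects { collection, index, _ in
        let names = collection.localizedLocationNames.joined(separator: ", ")
        // approximateLocation is optional; show a marker when it is missing
        let location = collection.approximateLocation
            .map { "(\($0.coordinate.latitude), \($0.coordinate.longitude))" }
            ?? "no approximateLocation"
        print("[\(index)] names: [\(names)] "
            + "start: \(String(describing: collection.startDate)) "
            + "location: \(location) "
            + "estimated assets: \(collection.estimatedAssetCount)")
    }
}
```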
Running this, I had 6 "moments". These are the contents of `moments`:
- Moment 0 = 10 images
- Moment 1 = 49 images
- Moment 2 = 3 images
- Moment 3 = 17 images
- Moment 4 = 3 images
- Moment 5 = 20 images
However, the loop is actually creating an extra "moment" in the `images` array. These are the contents of `images`:
- Moment 0 = 10 images
- Moment 1 = 49 images
- Moment 2 = 3 images
- Moment 3 = 3 images // <-- this is the extra moment
- Moment 4 = 17 images
- Moment 5 = 3 images
- Moment 6 = 20 images
If I go into my Photos app, I can see these extra images. Their "header/sub-header" is listed differently than for other photos, so I am not sure if it's technically a "moment" or something else.
The extra entry is an event on 17th July. It has no 2nd/sub-title row; it just looks slightly different. I believe these are photos taken within another app I use for collecting receipts. I didn't even realise it was saving these images to my library in this way.
In any case, my expected behaviour for the 2 loops above is:
- ONLY if it's a recognised "moment" should it enter the loop and populate the `moments` array; and
- ONLY within a recognised "moment" should it then grab the associated images into the `images` array
- "non-moments" should be ignored altogether by both arrays
Ultimately, the current app architecture means I need the elements of both arrays to match up index-for-index (whichever of the two is the correct answer!).
Can I ask for assistance with this, please?
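One restructuring I'm considering, in case the two appends simply need to live in the same scope, is sketched below. It assumes (and I'm not sure this is the actual cause) that the odd collection is one without an `approximateLocation`, and reuses my `Moment` type and the `allMoments` fetch result from the code above:

```swift
// Sketch: append to both arrays together so their indices can never drift.
for i in 0..<allMoments.count {
    let collection = allMoments.object(at: i)

    // Skip anything without an approximate location ("non-moments")
    guard let coordinate = collection.approximateLocation?.coordinate else { continue }

    moments.append(Moment(localizedNames: collection.localizedLocationNames,
                          startDate: collection.startDate,
                          longitude: coordinate.longitude,
                          latitude: coordinate.latitude,
                          estimatedCount: collection.estimatedAssetCount))
    // Placeholder at the matching index; thumbnails would be filled in
    // afterwards via PHImageManager
    images.append([])
}
```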