PhotoKit


Work with image and video assets managed by the Photos app, including those from iCloud Photos and Live Photos, using PhotoKit.

Posts under PhotoKit tag

77 Posts
Preselect PhotoPickerItems from Image when reopening PhotosPicker
I'm using the PhotosPicker in my SwiftUI project, as shown in the sample code (https://developer.apple.com/documentation/photokit/bringing-photos-picker-to-your-swiftui-app). Currently it is set up so that the app loads the images and stores them in Core Data on MyEntity as binary data. However, when a user wants to update the photos, there seems to be no way to check whether a stored image is already part of the current MyEntity. In other words, there seems to be no way to preselect images that were selected in a previous session. Trying to use localIdentifier didn't work either, as it is always nil. Does anyone have an idea how to solve this, or is it simply not possible? Any information is appreciated. Thanks!
Replies: 2 · Boosts: 0 · Views: 173 · Activity: 1w
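One pattern worth trying here, sketched under the assumption that the picked identifiers can be persisted alongside the binary data in MyEntity: initialize the picker with an explicit photo library so that itemIdentifier is populated (it stays nil otherwise), store those identifiers, and rebuild the selection from them when the view reappears. The storedIdentifiers property and its persistence are hypothetical stand-ins for the app's Core Data storage.

import SwiftUI
import Photos
import PhotosUI

struct PhotoSelectionView: View {
    // Identifiers persisted from a previous session (hypothetical; in the real
    // app they would be stored alongside the image data in MyEntity).
    @State private var storedIdentifiers: [String] = []
    @State private var selection: [PhotosPickerItem] = []

    var body: some View {
        PhotosPicker(selection: $selection,
                     matching: .images,
                     photoLibrary: .shared()) {   // explicit library => itemIdentifier is non-nil
            Text("Select photos")
        }
        .onAppear {
            // Rebuild the previous selection so the picker opens preselected.
            selection = storedIdentifiers.map { PhotosPickerItem(itemIdentifier: $0) }
        }
        .onChange(of: selection) { _, newItems in   // iOS 17 onChange form
            // Persist the identifiers (not just the binary data) for the next session.
            storedIdentifiers = newItems.compactMap(\.itemIdentifier)
        }
    }
}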
App crashes after requesting PhotoLibrary limited access
My visionOS app requires access to the user's personal photos. The trigger mechanism is: when the user first opens a FooView, a task attached to that view calls let status = PHPhotoLibrary.authorizationStatus(for: .readWrite); if the status is .notDetermined, it calls PHPhotoLibrary.requestAuthorization(for: .readWrite, handler: authCompletionHandler) so that visionOS presents a window requesting photo access.

However, the app crashes every time the user selects Limited Access and the system tries to present the photo library picker. By the way, I have set "Prevent limited photos access alert" to Yes, but I don't think that should affect the behavior here.

The debugger message is:

*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Presentations are not permitted within volumetric window scenes.'

However, the window this view belongs to is a .plain style window (though 3D objects appear in another view of the same WindowGroup).

This is my code snippet, if it helps. checkAndUpdatePhotoAuthorization is just a wrapper around PHPhotoLibrary.authorizationStatus(for: .readWrite):

private func checkAndUpdatePhotoAuthorization() -> PHAuthorizationStatus {
    let currentStatus = PHPhotoLibrary.authorizationStatus(for: .readWrite)
    switch currentStatus {
    case .authorized:
        print("Photo library access authorized.")
        isPhotoGalleryAuthorized = true
        isPhotoGalleryLimited = false
        isPhotoGalleryAccessRestricted = false
        isPhotoGalleryDetermined = true
    case .limited:
        print("Photo library access limited.")
        isPhotoGalleryLimited = true
        isPhotoGalleryAuthorized = false
        isPhotoGalleryAccessRestricted = false
        isPhotoGalleryDetermined = true
    case .notDetermined:
        isPhotoGalleryDetermined = false
        print("Photo library access not determined.")
    case .denied:
        print("Photo library access denied.")
        isPhotoGalleryAuthorized = false
        isPhotoGalleryLimited = false
        isPhotoGalleryAccessRestricted = false
        showSettingsAlert = true
        isPhotoGalleryDetermined = true
    case .restricted:
        print("Photo library access restricted.")
        isPhotoGalleryAuthorized = false
        isPhotoGalleryLimited = false
        isPhotoGalleryAccessRestricted = true
        showPhotoAuthExplainationAlert = true
        isPhotoGalleryDetermined = true
    @unknown default:
        print("Photo library Unknown authorization status.")
        isPhotoGalleryAuthorized = false
        isPhotoGalleryLimited = false
        isPhotoGalleryAccessRestricted = false
        isPhotoGalleryDetermined = true
    }
    return currentStatus
}

FooView then attaches a task to run checkAndUpdatePhotoAuthorization():

var body: some View {
    EmptyView()
        .task {
            try? await Task.sleep(for: .seconds(1.0))
            let status = self.checkAndUpdatePhotoAuthorization()
            if status == .notDetermined {
                DispatchQueue.main.async {
                    PHPhotoLibrary.requestAuthorization(for: .readWrite, handler: authCompletionHandler)
                }
            }
        }
}

Another thing worth mentioning: sometimes it doesn't crash in a debug build, but it does crash in TestFlight. Any other ideas? Big thanks in advance.

Xcode version: 16.2 beta 3
visionOS version: 2.2
Replies: 1 · Boosts: 0 · Views: 208 · Activity: 1w
Is there a way to filter PHPickerViewController by the creation date of the assets?
Our app filters the photo library to a certain date range for ease of picking photos. However, to do this, we have to require full permission to the photo library. We would like to use PHPickerViewController and have it filter the results by the assets' creation date, which would allow us to adopt it. I see other filter options, but not this one. If it isn't there, is this something that is being considered or on a roadmap?
Replies: 1 · Boosts: 0 · Views: 193 · Activity: 1w
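For reference, the full-access approach described in the post is usually built on a PHFetchOptions predicate over creationDate; a minimal sketch (the date parameters and function name are placeholders):

import Photos

// Fetch image assets created inside a date range. This is the path that
// requires photo-library read permission, as the post describes.
func fetchAssets(from start: Date, to end: Date) -> PHFetchResult<PHAsset> {
    let options = PHFetchOptions()
    options.predicate = NSPredicate(format: "creationDate >= %@ AND creationDate <= %@",
                                    start as NSDate, end as NSDate)
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    return PHAsset.fetchAssets(with: .image, options: options)
}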
App cannot fetch any resource after a while
Hi everyone, we ran into an issue where, in some scenarios, our app cannot fetch any resources from the device (Photos and Contacts). One case we caught is putting the app in the background, spending time in other commonly used apps, and then coming back to our app; there is also a small chance of hitting the issue while actively using the application.

In a cell, we fetch the image like this:

imageFetchTask = Task {
    let image = await CompositionRoot.shared.photosManager.image(requestType: .imageCollections, forId: photoAsset.photoId)
    self.photoImageView.image(image)
}

In the inner layers of this code we get the PHAsset and request the image:

PHAsset.firstAsset(for: id)

let manager = PHImageManager.default()
manager.requestImage(for: asset, targetSize: request.targetSize, contentMode: .aspectFill, options: request.options) { (image, info) in
    continuation.resume(returning: image)
}

We figured out that the issue is not limited to Photos; it also affects Contacts and any web request. Any help with this situation is much appreciated. Thanks
Replies: 0 · Boosts: 0 · Views: 139 · Activity: 3w
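An aside for anyone wrapping requestImage in a continuation as the post does: with the default opportunistic delivery the result handler can be called more than once (a degraded image first), and resuming a checked continuation twice traps. A hedged sketch of a safer wrapper follows; the function name and option values are illustrative, not taken from the app's real request type.

import Photos
import UIKit

// Wrap requestImage in a continuation, making sure the continuation is
// resumed exactly once even if a degraded result is delivered first.
func image(for asset: PHAsset, targetSize: CGSize) async -> UIImage? {
    await withCheckedContinuation { continuation in
        let options = PHImageRequestOptions()
        options.deliveryMode = .highQualityFormat   // single callback
        options.isNetworkAccessAllowed = true
        PHImageManager.default().requestImage(for: asset,
                                              targetSize: targetSize,
                                              contentMode: .aspectFill,
                                              options: options) { image, info in
            // Skip the degraded pass if opportunistic delivery is ever used.
            if (info?[PHImageResultIsDegradedKey] as? Bool) == true { return }
            continuation.resume(returning: image)
        }
    }
}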
Keyboard Navigation Not Working for inline PhotosPicker
We are using the inline PhotosPicker introduced in iOS 17.0. Accessibility navigation using touch gestures works fine, but keyboard navigation doesn't work properly: tab/arrow-based keyboard navigation can't move from the native app process to the inline PhotosUI process or vice versa. This has been logged as a high-severity bug by our accessibility team. Please look into it.

Sample code for repro: https://github.com/saalisumer/AccessibilityIssueInlinePhotoPickerIOS
Repro video: https://github.com/saalisumer/AccessibilityIssueInlinePhotoPickerIOS/blob/main/Simulator%20Screen%20Recording%20-%20iPhone%2015%20Pro%20-%202024-12-16%20at%2016.27.48.mp4
Replies: 1 · Boosts: 0 · Views: 214 · Activity: Dec ’24
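For anyone trying to reproduce this, the inline picker in question is typically set up roughly as in the sketch below (iOS 17+); this is a minimal illustration, not code taken from the linked sample project.

import SwiftUI
import PhotosUI

// Minimal inline PhotosPicker of the kind the report refers to.
struct InlinePickerView: View {
    @State private var selection: [PhotosPickerItem] = []

    var body: some View {
        PhotosPicker(selection: $selection, matching: .images) {
            Text("Select photos")
        }
        .photosPickerStyle(.inline)   // embeds the picker UI in the app's window
        .frame(height: 320)
    }
}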
Message from com.apple.photos.backend (PhotoKit) in log
Hello. Our application backs up the user's photos to a back end. When retrieving asset data from the photo library, we set the flag 'accessNetworkAllowed' to true to get assets that might be optimized in iCloud. In the application logs, we see the message below, which appears to come from com.apple.photos.backend (PhotoKit):

Missing prefetched properties for PHAssetAdjustmentProperties on <PHAsset: 0x160b1ec00> BCF5688F-F7A7-4196-AFC7-A84E8BD95F3E/L0/001 mediaType=1/0, sourceType=1, (5601x3734), creationDate=2022-01-24 23:36:05 +0000, location=0, hidden=0, favorite=0, adjusted=0 . Fetching on demand on the main queue, which may degrade performance.

In particular, the message says 'Fetching on demand on the main queue', but I'm not sure whether that means PhotoKit will fetch on the main queue or that our application is requesting the data on the main queue. Could anyone clarify? Thanks
Replies: 1 · Boosts: 0 · Views: 294 · Activity: Dec ’24
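One reading of that log line, offered as an assumption rather than a documented answer, is that the on-demand property fetch happens on the main queue because the request is issued from it. A sketch of issuing the data request from a background queue with network access allowed (the function name is illustrative):

import Photos

// Request full asset data off the main queue, allowing iCloud downloads.
func fetchAssetData(for asset: PHAsset,
                    completion: @escaping (Data?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true   // allow iCloud-optimized originals
    options.version = .current
    DispatchQueue.global(qos: .userInitiated).async {
        PHImageManager.default().requestImageDataAndOrientation(for: asset,
                                                                options: options) { data, _, _, _ in
            completion(data)
        }
    }
}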
How to add CIFilter to AVCaptureDeferredPhotoProxy
Hello, I'm trying to implement deferred photo processing in my photo capture app. After I take a photo, I pass it through a CIFilter. With deferred photo processing, where would I pass the resulting photo through the CIFilter, since there is no way for me to know when the system has finished processing a photo? If I have to do it in my app's foreground every time, how do I prevent a scenario where the user takes a photo, heads straight to the Photos app, and sees the image without the filter?
Replies: 2 · Boosts: 0 · Views: 330 · Activity: Dec ’24
PHPhotoLibrary availability detection/notification
Hi, I'm having difficulty figuring out how to reliably detect whether the macOS system PHPhotoLibrary is available. If I place the system photo library on an external drive and then eject it, other apps, such as Photos, tell me on startup that the library isn't available. How can I replicate this behavior? The only API I can find for this detection on startup is PHPhotoLibrary.shared().unavailabilityReason, but it always returns nil.

Another strange behavior: if I register a PHPhotoLibraryAvailabilityObserver class on startup while the library is available and then eject the drive, I do get a notification via photoLibraryDidBecomeUnavailable, but directly after the call the app is terminated. This prevents the app from performing any kind of graceful termination. Is this the expected behavior? It would make more sense if it were up to the developer to decide what happens when the library becomes unavailable. Thanks
Replies: 0 · Boosts: 0 · Views: 262 · Activity: Nov ’24
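For completeness, the registration pattern the post refers to looks roughly like the sketch below; whether process termination after the callback is expected remains the open question.

import Photos

// Check availability at startup and observe later changes, as described above.
final class LibraryAvailabilityMonitor: NSObject, PHPhotoLibraryAvailabilityObserver {
    override init() {
        super.init()
        if let reason = PHPhotoLibrary.shared().unavailabilityReason {
            print("Library unavailable at startup: \(reason)")
        }
        PHPhotoLibrary.shared().register(self)   // availability observer overload
    }

    func photoLibraryDidBecomeUnavailable(_ photoLibrary: PHPhotoLibrary) {
        // Attempt a graceful teardown here; per the post, the process may be
        // terminated shortly after this is called.
        print("Photo library became unavailable")
    }
}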
PHLivePhotoEditingContext.saveLivePhoto results in AVFoundation error -11800 "The operation could not be completed" reason An unknown error occurred (-12815)
When trying to edit some Live Photos, calling PHLivePhotoEditingContext.saveLivePhoto results in the following error:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12815), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x300d05380 {Error Domain=NSOSStatusErrorDomain Code=-12815 "(null)"}}

I was able to replicate it on my device by taking a new Live Photo. Not sure what's wrong with that one specifically; not all Live Photos replicate the issue. I've submitted FB15880825 with a sysdiagnose and Photos diagnostics as well. Any ideas what's going on here? It's impacting multiple customers. Thanks!
Replies: 0 · Boosts: 0 · Views: 271 · Activity: Nov ’24
How to change PHLivePhoto EXIF metadata
I have an app that allows the user to change a photo's EXIF metadata. To do this, I request a content editing input, get the full-size image, modify its properties, create a content editing output, write the output image to the rendered content URL, then call performChanges on the PHPhotoLibrary, creating an asset change request for that asset and setting its content editing output. This works as expected for regular photos, but Live Photos get converted to regular photos.

To address this, I'm doing something similar by changing the properties of the .photo image in the Live Photo. I detect when the content editing input has a Live Photo, create a Live Photo editing context, set a frame processor that returns the frame's image (after setting its properties to the updated properties) when the frame type is .photo, then create the content editing output and save the Live Photo to that output. It modifies the Live Photo successfully, but the metadata is not updated. If you get the full-size image again, the properties are the original properties. If you look at the EXIF metadata using an app like Metapho, it remains unchanged. What am I doing wrong here? Thanks!

let imageURL = contentEditingInput.fullSizeImageURL!
let inputImage = CIImage(contentsOf: imageURL, options: [.applyOrientationProperty: true])!

var metadata: [AnyHashable: Any] = inputImage.properties
// Edit the metadata as desired...

let editingContext = PHLivePhotoEditingContext(livePhotoEditingInput: contentEditingInput)!
editingContext.frameProcessor = { frame, error -> CIImage? in
    // Edit only the still photo
    if frame.type == .photo {
        return frame.image.settingProperties(metadata)
    }
    return frame.image
}

let contentEditingOutput = try await withCheckedThrowingContinuation { continuation in
    let editingOutput = PHContentEditingOutput(contentEditingInput: contentEditingInput)
    editingOutput.adjustmentData = adjustmentData
    editingContext.saveLivePhoto(to: editingOutput) { success, error in
        if success {
            continuation.resume(returning: editingOutput)
        } else {
            continuation.resume(throwing: error!)
        }
    }
}

try await PHPhotoLibrary.shared().performChanges {
    let request = PHAssetChangeRequest(for: asset)
    request.contentEditingOutput = contentEditingOutput
}
Replies: 0 · Boosts: 0 · Views: 245 · Activity: Nov ’24
LivePhoto not applying on iOS 17+ as live wallpaper
Hi fellow iOS developers! 👋 I've written Swift code that converts a video (from a URL) into a Live Photo after downloading it. The conversion process seems fine, but when I try to set the generated Live Photo as a wallpaper on iOS 17+, it shows the message 'Motion not Available.' Has anyone else experienced this issue, or do you know why it might be happening? Could it be related to changes in iOS 17 Live Photo handling or to the generated file structure? Any help or suggestions would be greatly appreciated! 🙏
Replies: 1 · Boosts: 1 · Views: 248 · Activity: Nov ’24
PhotosPickerItem loadTransferable Progress is always indeterminate
I tried several times to use the PhotosPickerItem loadTransferable function with the goal of receiving a progress value, especially when loading large video media. However, the Progress object returned by the function always has isIndeterminate == true and so doesn't have any progress value to observe. Is there some way to make it work? Some configuration I might have overlooked? Or is it just not working, in which case I might have to revert to the UIKit photo picker.
Replies: 1 · Boosts: 1 · Views: 322 · Activity: Nov ’24
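For reference, the completion-handler variant that returns the Progress looks roughly like the sketch below; as the post describes, the returned Progress currently reports isIndeterminate == true for these items, so there may be nothing useful to observe. The function name is illustrative.

import SwiftUI
import PhotosUI

// Load a picked item via the completion-handler API, which hands back a Progress.
func load(_ item: PhotosPickerItem) {
    let progress = item.loadTransferable(type: Data.self) { result in
        switch result {
        case .success(let data):
            print("Loaded \(data?.count ?? 0) bytes")
        case .failure(let error):
            print("Load failed: \(error)")
        }
    }
    // Per the post, this prints indeterminate progress for photo-library items.
    print("Indeterminate: \(progress.isIndeterminate), fraction: \(progress.fractionCompleted)")
}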
PHAssetCollectionChangeRequest.removeAssets Does Not Work
Hello! I am trying to remove a photo from my photo library using PhotosUI. I run into this error when I attempt to delete it:

"Error returned from daemon: Error Domain=com.apple.accounts Code=7 "(null)""
No photos access scope requirements declared for changes
Failed to log access with error: access= accessor:<> identifier:82068C12-FD10-4DE2-9867-B4406FBFB706 kind:intervalEvent timestampAdjustment:0 visibilityState:0 assetIdentifierCount:0 accessCount:0 tccService:kTCCServicePhotos, error=Error Domain=NSCocoaErrorDomain Code=4097 "connection to service named com.apple.privacyaccountingd" UserInfo={NSDebugDescription=connection to service named com.apple.privacyaccountingd}
Replies: 2 · Boosts: 0 · Views: 322 · Activity: Nov ’24
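For context, the call the title refers to runs inside performChanges, roughly as sketched below. Note, as a general PhotoKit distinction rather than a confirmed fix for this error, that removeAssets(_:) removes assets from a specific album, while deleting from the library itself goes through PHAssetChangeRequest.deleteAssets(_:); both require read/write authorization and a photo library usage description.

import Photos

// Remove an asset from a user album inside a photo library change block.
func remove(_ asset: PHAsset, from album: PHAssetCollection) async throws {
    try await PHPhotoLibrary.shared().performChanges {
        let request = PHAssetCollectionChangeRequest(for: album)
        request?.removeAssets([asset] as NSArray)
    }
}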
Different information values depending on how the metadata of the image is obtained (PHAsset vs PHPickerResult)
While customizing an image picker, we found that metadata is not reflected correctly, so we are reporting it. The situation is as follows.

The time or time zone of an image is changed in the Photos app. For example, the time zone of an image with an actual capture date of 2024:11:08 08:27:44 is changed so that it reads 2024:11:07 17:27:44.

Image data is extracted from the PHAsset using PHImageManager, and the metadata is obtained from this image data. The time zone information exposed in the Exif tag information does not reflect the time or time zone changed in the Photos app.

let asset: PHAsset = ...
....
let options = PHImageRequestOptions()
options.isSynchronous = true
options.version = .current
options.deliveryMode = .highQualityFormat
options.resizeMode = .none
options.normalizedCropRect = .zero
options.isNetworkAccessAllowed = true
options.progressHandler = { progress, error, _, _ in }

PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { imageData, uti, orientation, info in
    let cgImageSource = CGImageSourceCreateWithData(imageData! as CFData, nil)
    let properties = CGImageSourceCopyPropertiesAtIndex(cgImageSource!, 0, nil) as? Dictionary<String, Any>
    let exif = properties!["{Exif}"]
    let dictionary = exif as? Dictionary<String, Any>
}

In this case, the change is reflected in the creationDate of the PHAsset, so it can be somewhat compensated for by forcibly replacing the metadata. However, because PHAsset does not include time zone information, when the time zone is also changed it is impossible to calculate the correct time for that time zone.

This issue is resolved when using the PHPickerResult provided by PHPicker:

extension PhotosPickerViewController: PHPickerViewControllerDelegate {
    public func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        .....
        for result in results {
            let identifier = UTType.image.identifier
            if result.itemProvider.hasItemConformingToTypeIdentifier(identifier) {
                result.itemProvider.loadDataRepresentation(forTypeIdentifier: identifier) { data, error in
                    guard let data = data,
                          let cgImageSource = CGImageSourceCreateWithData(data as CFData, nil),
                          let properties = CGImageSourceCopyPropertiesAtIndex(cgImageSource, 0, nil) as? Dictionary<String, Any>,
                          let exif = properties["{Exif}"],
                          let dictionary = exif as? Dictionary<String, Any> else { return }
                }
            }
        }
    }
}

Question: I wonder why this happens and whether it is normal behavior. If we use a customized picker instead of the system picker that Apple provides, is there any way to compensate for this in that situation?
Replies: 0 · Boosts: 0 · Views: 369 · Activity: Nov ’24
Is it possible to retrieve EXIF metadata from PHAsset without downloading photos (even if offloaded to iCloud Photo Library)?
The iOS Photos app can display some EXIF-related metadata (e.g. camera and lens info, ISO, shutter speed, f-number) even when photos are offloaded to iCloud and the device is not connected to the internet (e.g. in airplane mode). However, with the Photos framework, we need to download photos to retrieve that metadata, which means it will not work in airplane mode. I tried the following methods, but none of them worked when photos were offloaded to iCloud and the device was in airplane mode:

Requesting data with PHImageManager.default().requestImageDataAndOrientation. Result: it does not return Data if the photo is not stored locally on the device, even with options.deliveryMode = .fastFormat.

Converting PHAsset.localIdentifier to an AssetsLibrary.framework URL (assets-library://asset/...). (I am aware that AssetsLibrary.framework is deprecated; this was just a test.) Result: if PHImageManager does not return Data, ALAsset.defaultRepresentation().metadata() returns an empty NSDictionary.
Replies: 0 · Boosts: 1 · Views: 418 · Activity: Nov ’24
How to edit gain map image to preserve HDR in Live Photo
I have an app that allows you to edit your photos. To preserve HDR, I edit both the SDR image and the gain map image, like so:

let sdrImage = CIImage(data: data, options: [.applyOrientationProperty: true])
let gainMapImage = CIImage(data: data, options: [.applyOrientationProperty: true, .auxiliaryHDRGainMap: true])

// edit them...

try CIContext().writeHEIFRepresentation(of: sdrImage, to: url, format: .RGBA8, colorSpace: colorSpace, options: [.hdrGainMapImage: gainMapImage])

I also support editing the still photo in Live Photos. To do this, you create a PHLivePhotoEditingContext, set the frameProcessor block, which gives you a CIImage that I edit when the frame.type is .photo, then you create a PHContentEditingOutput and call saveLivePhoto. I'm not seeing any way to preserve HDR here. Interestingly, the frame processor is called twice with the .photo frame type, but I don't see any difference between those images. How can I edit a gain map image to preserve HDR in the still photo of a Live Photo?
Replies: 1 · Boosts: 0 · Views: 423 · Activity: Nov ’24
LockedCameraCaptureExtension can not get access to photo library
I've requested authorization in my main app:

PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in }

I've added the privacy description in both the main app and the extension. But no matter whether the device is locked or unlocked, when I call

let fetchResult = PHAsset.fetchAssets(with: .image, options: nil)
let count = fetchResult.count

the count is always zero, even after a new photo is saved to the album in the same session.
Replies: 1 · Boosts: 0 · Views: 325 · Activity: Oct ’24
writeImageAtIndex:1012: ⭕️ ERROR: 'App' is trying to save an opaque image (5712x4284) with 'AlphaLast'.
I have an app that edits photos in your library. When I call

try CIContext().writeHEIFRepresentation(of: editedImage, to: fileURL, format: .RGBA8, colorSpace: originalImage.colorSpace!)

the following is logged to the console:

writeImageAtIndex:1012: ⭕️ ERROR: 'App' is trying to save an opaque image (5712x4284) with 'AlphaLast'. This would unnecessarily increase the file size and will double (!!!) the required memory when decoding the image --> ignoring alpha.

What does that mean, and how can I resolve it?

Xcode Version 16.0 (16A242d)
iOS 18.1 (22B82)
Replies: 4 · Boosts: 9 · Views: 1.1k · Activity: Dec ’24
Inquiry Regarding File Size Differences When Migrating from ALAssetsLibrary to PHPhotoLibrary
We are currently migrating our application from ALAssetsLibrary to PHPhotoLibrary to ensure compatibility with the latest versions of iOS. However, we have noticed a discrepancy in the file sizes of images obtained using PHPhotoLibrary compared to those obtained using ALAssetsLibrary. Specifically, we would like to understand the following points:

1. Reason for the file size differences: What are the reasons for the difference in file sizes between images obtained using ALAssetsLibrary and those obtained using PHPhotoLibrary? Could you provide detailed information on the settings and options in PHPhotoLibrary that affect the size and quality of the images?

2. Optimal settings: What are the optimal settings in PHPhotoLibrary to obtain images with the same quality and file size as those obtained using ALAssetsLibrary? If possible, could you provide code examples or recommended option settings?
Replies: 3 · Boosts: 0 · Views: 397 · Activity: Oct ’24
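Not an authoritative answer to the question above, but a common starting point when the goal is to match the original file bytes is to request the unadjusted original data; a sketch (the function name is illustrative):

import Photos

// Request the original, unadjusted bytes of an asset, which is the closest
// analogue to the full-resolution data ALAssetsLibrary used to return.
func originalData(for asset: PHAsset,
                  completion: @escaping (Data?) -> Void) {
    let options = PHImageRequestOptions()
    options.version = .original              // unadjusted original
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true    // asset may be iCloud-optimized
    PHImageManager.default().requestImageDataAndOrientation(for: asset,
                                                            options: options) { data, _, _, _ in
        completion(data)
    }
}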