Is it possible to sort the user library assets by date captured? The Photos app in iOS 18 lets you choose between Date Captured and Recently Added and I want to offer that same choice in my app. This seems to always sort them by creation date (which I believe is the same as recently added):
let assetCollection = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil).firstObject!
let fetchResult = PHAsset.fetchAssets(in: assetCollection, options: PHFetchOptions.imageMediaType())
PhotoKit
RSS for tagWork with image and video assets managed by the Photos app, including those from iCloud Photos and Live Photos, using PhotoKit.
Posts under PhotoKit tag
78 Posts
Sort by:
Post
Replies
Boosts
Views
Activity
The probability of this issue occurring is very high when retrieving a newly taken Live Photo.
This leads to failure in determining the Live Photo type. How can this issue be resolved?
With quite some excitement I read about visionOS 2's new feature to automatically turn regular 2D photos into spatial photos, using machine learning. It's briefly mentioned in this WWDC video:
https://developer.apple.com/wwdc24/10166
My question is: Can developers use this feature via an API, so we can turn any image into a spatial image, even if it is not in the device photo library?
We would like to download an image from our server, convert it on the visionPro on-the-fly and display it as a spatial photo.
Hello,
I have converted UIImage to CVPixelBuffer. I am creating a video writing app. In some cases, the same CVPixelBuffer should last in the video for 2 seconds or more.
However, I need to add 30 CVPixelBuffers per second because the video, to work on social media, must be 30 frames per second.
The problem is that whenever I try to add frames to long videos, like 50-minute videos, it gives an error.
The error is something like "Operation cannot be completed".
Give me an example of a loop to add 30 CVPixelBuffers per second to a currently written video.
Example:
while true {
if videoInput.isReadyForMoreMediaData {
break
}
if videoInput.isReadyForMoreMediaData,
let buffer = videoProvider.getNextFrame() {
adaptor.append(buffer, withPresentationTime: CMTime(value: 1, timescale: 30))
}
}
I await your response.
When I call requestAVAssetForVideo to retrieve a video for upload, the system appends a string of unknown characters to the returned path.
like this:
/var/mobile/Media/DCIM/101APPLE/IMG_1034.MOV#YnBsaXN0MDDRAQJfEBtSZxxxx1vZGUQAAgLKQAAAAAAAAEBAAAAAAAAAAMAAAAAAAAAAAAAAAAAAAAr
ps: I encountered a similar issue before when retrieving spatial videos on systems below iOS 18.
Dear
When I use iOS17 to save videos downloaded from the network to my album, it shows Domain=PHPhotosErrorDomain Code=3302. Through searching official documents, it was found that 3302 means "An error that indicates the asset resource validation failures." However, the specific reason is unknown. Is there any article to explain
I need to know how to change the info for a specific photo in iPhoto. When I scanned the photo into iPhoto on my Mac I clicked on "info" and typed in what I thought at the time was correct info. I now know that the info is wrong but I can't find a way to edit it.
I've found online articles on how to change the metadata, but that's not what I want to change. I can change the title for the photo, but I can't change the info I typed in below it. Can anyone help me?
iOS 17.4.1, iPhone 15 Pro.
I pick photos from the user's photo library using:
...
.photosPicker(isPresented: $addPhotos, selection: $pickedPhotos, matching: .images)
.onChange(of: pickedPhotos) {
import(photoItems: pickedPhotos)
}
The picker UI works ok, but then when I import the photos:
private func import(photoItems: [PhotosPickerItem]) {
for photoItem in photoItems {
Log.debug("picked: \(photoItem)")
Task {
do {
let imageData = try await photoItem.loadTransferable(type: Data.self)
guard let imageData else {
Log.error("failed to load image data")
return
}
guard let image = UIImage(data: imageData) else {
Log.error("failed to create image from data")
return
}
// use image
....
}
catch {
Log.error("failed to load image data: \(error)")
}
}
}
}
Logging the picked photo gives:
PhotosPickerItem(_itemIdentifier: "C7E2F753-43F6-413D-BA42-509C60BE9D77/L0/001", _shouldExposeItemIdentifier: false, _supportedContentTypes: [<_UTCoreType 0x1ebcd1c10> public.jpeg (not dynamic, declared), <_UTCoreType 0x1ebcd1d70> public.heic (not dynamic, declared), <UTType 0x300fe0430> com.apple.private.photos.thumbnail.standard (not dynamic, declared), <UTType 0x300fe03f0> com.apple.private.photos.thumbnail.low (not dynamic, declared)], _itemProvider: <PUPhotosFileProviderItemProvider: 0x303fdff00> {types = (
"public.jpeg",
"public.heic",
"com.apple.private.photos.thumbnail.standard",
"com.apple.private.photos.thumbnail.low"
)})
Looks like there's a valid photo? But then the loadTransferable() call fails with:
5C9D59CB-3606-48C1-9B37-1F18D642B3AD grantAccessClaim reply is an error: Error Domain=NSCocoaErrorDomain Code=4101 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x308244f30 {Error Domain=PFPAssetRequestErrorDomain Code=0 "The operation couldn’t be completed. (PFPAssetRequestErrorDomain error 0.)" UserInfo={NSURL=file:///private/var/mobile/Containers/Shared/AppGroup/36CF50FB-38FC-440E-9662-35C23B5E636C/File%20Provider%20Storage/photospicker/uuid=C7E2F753-43F6-413D-BA42-509C60BE9D77&library=1&type=1&mode=2&loc=true&cap=true.jpeg, NSLocalizedDescription=The operation couldn’t be completed. (PFPAssetRequestErrorDomain error 0.)}}}
Error loading
public.data:
Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.jpeg" UserInfo={NSLocalizedDescription=Cannot load representation of type public.jpeg, NSUnderlyingError=0x3081a2550 {Error Domain=NSCocoaErrorDomain Code=4101 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x308244f30 {Error Domain=PFPAssetRequestErrorDomain Code=0 "The operation couldn’t be completed. (PFPAssetRequestErrorDomain error 0.)" UserInfo={NSURL=file:///private/var/mobile/Containers/Shared/AppGroup/36CF50FB-38FC-440E-9662-35C23B5E636C/File%20Provider%20Storage/photospicker/uuid=C7E2F753-43F6-413D-BA42-509C60BE9D77&library=1&type=1&mode=2&loc=true&cap=true.jpeg, NSLocalizedDescription=The operation couldn’t be completed. (PFPAssetRequestErrorDomain error 0.)}}}}}
2024-04-03 12:16:07.8010 error PhotosView.import: failed to load image data: importNotSupported
[
As usual I rebooted my phone as these things tend to be pretty buggy in iOS, but same error. Note this is not in a simulator which seems to have long standing bugs related to photo picking, this is on a freshly upgraded 17.4.1 device.
I can't find any documentation related to these errors and all googling comes up with a few other cases but no solutions. Should this API actually work or is it better to go back to the old UIKit stuff? I use loadTransferable(type: Data.self) as UIImage.self is not Transferable and this hack has seemed to work ok for some months.
The app crashes when creating a new album. This crash did not occur in our own testing, but after publishing it to the app store, it seems that the probability of occurrence is very high.
I am trying to implement the ability to save a photo to the user’s photo library on macOS. When I call PHPhotoLibrary.requestAuthorization(for: .addOnly) I just get a .denied status. The user is not prompted for access.
I tried adding these entitlements: com.apple.security.personal-information.photos-library, com.apple.security.assets.pictures.read-write. I tried turning off sandboxing entirely.
I tried saving despite getting the authorization denied, but unsurprisingly that gives me this error: Domain=PHPhotosErrorDomain Code=3311
I can almost do what i want with NSSharingService(named: .addToIPhoto), but that has the side effect of launching Photos.
Is there a trick to getting PHPhotoLibrary.requestAuthorization(for: .addOnly) to work?
Thanks.
John
I want to get only spatial video while open the Photo library in my app. How can I achieve?
One more thing, If I am selecting any video using photo library then how to identify selected video is Spatial Video or not?
self.presentPicker(filter: .videos)
/// - Tag: PresentPicker
private func presentPicker(filter: PHPickerFilter?) {
var configuration = PHPickerConfiguration(photoLibrary: .shared())
// Set the filter type according to the user’s selection.
configuration.filter = filter
// Set the mode to avoid transcoding, if possible, if your app supports arbitrary image/video encodings.
configuration.preferredAssetRepresentationMode = .current
// Set the selection behavior to respect the user’s selection order.
configuration.selection = .ordered
// Set the selection limit to enable multiselection.
configuration.selectionLimit = 1
let picker = PHPickerViewController(configuration: configuration)
picker.delegate = self
present(picker, animated: true)
}
`func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
picker.dismiss(animated: true) {
// do something on dismiss
}
guard let provider = results.first?.itemProvider else {return}
provider.loadFileRepresentation(forTypeIdentifier: "public.movie") { url, error in
guard error == nil else{
print(error)
return
}
// receiving the video-local-URL / filepath
guard let url = url else {return}
// create a new filename
let fileName = "\(Int(Date().timeIntervalSince1970)).\(url.pathExtension)"
// create new URL
let newUrl = URL(fileURLWithPath: NSTemporaryDirectory() + fileName)
print(newUrl)
print("===========")
// copy item to APP Storage
//try? FileManager.default.copyItem(at: url, to: newUrl)
// self.parent.videoURL = newUrl.absoluteString
}
}`
Hello) I use PHImageManager.default() for fetch UIImage from iOS gallery (Photo App). I noticed that screenshot images from galley has bitsPerComponent value equal 16. That's supported only in Mac OS.
I could ONLY fix this by redrawing the image using UIGraphicsBeginImageContextWithOptions.
private func redrawImageTo8Bit() -> UIImage? {
// UIGraphicsImageRenderer can not change bitsPerComponent value
UIGraphicsBeginImageContextWithOptions(size, false, UIScreen.main.scale)
draw(in: CGRect(origin: .zero, size: size))
let redrawedImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
return redrawedImage
}
https://developer.apple.com/library/archive/documentation/GraphicsImaging/Conceptual/drawingwithquartz2d/dq_context/dq_context.html
Example of using PHImageManager.default
PHImageManager.default().requestImage(
for: phAsset,
targetSize: size,
contentMode: contentMode,
options: options
) { [weak self] image, info in
if let image = image // Screenshot of an image taken on the device (iPhone 12 Pro Max) {
let bitsPerComponent = image.cgImage?.bitsPerComponent // 16
}
}
Hello, I fetch Live Photo AVAsset using: PHImageManager and PHAssetResourceManager for getting Data. And then I want to wrap it to AVAsset using fileURL, and everything works fine, but also I want to trim this AVAsset using AVMutableComposition. I use insertTimeRange of AVMutableCompositionTrack method, and I don't not why but naturalSize of originalVideoTrack and newVideoTrack are different, and this happening only with Live Photo, default Videos work fine. Seems like this is AVMutableCompositionTrack bug inside AVFoundation. Please give me some info. Thanks)
PHImageManager.default().requestLivePhoto(
for: phAsset,
targetSize: size,
contentMode: .aspectFill,
options: livePhotoOptions
) { [weak self] livePhoto, info in
guard let livePhoto else {
return
}
self?.writeAVAsset(livePhoto: livePhoto, fileURL: fileURL)
}
private func writeAVAsset(livePhoto: PHLivePhoto, fileURL: URL) {
let resources = PHAssetResource.assetResources(for: livePhoto)
guard let videoResource = resources.first(where: { $0.type == .pairedVideo }) else {
return
}
let requestOptions = PHAssetResourceRequestOptions()
var data = Data()
dataRequestID = PHAssetResourceManager.default().requestData(
for: videoResource,
options: requestOptions,
dataReceivedHandler: { chunk in
data.append(chunk)
},
completionHandler: { [weak self] error in
try data.write(to: fileURL)
let avAsset = AVAsset(url: fileURL)
let composition = AVMutableComposition()
if let originalVideoTrack = tracks(withMediaType: .video).first,
let videoTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: 0)
{
// originalVideoTrack has naturalSize (width: 1744, height: 1308)
try? videoTrack.insertTimeRange(timeRange, of: originalVideoTrack, at: .zero)
videoTrack.preferredTransform = originalVideoTrack.preferredTransform
// videoTrack has naturalSize (width: 1920.0, height: 1440.0)
}
)
}
Hi, I am new to Swift and trying to develop an app that needs photolibraryusagedescription and photolibraryaddtionsusagedescription.
On simulation, I have fetched all photos and delete selected photos successfully but can't share selected photos so I've connect real device to test it out. However, after allow full access to photo album, I got error Domain=NSCocoaErrorDomain Code=4097, "connection to service with pid 1353 named com.apple.privacyaccountingd" UserInfo={NSDebugDescription=connection to service with pid 1353 named com.apple.privacyaccountingd}
I've looked up for solutions so far but still have no idea what created that issue and how to solve it, I've been using the latest version of all devices and apps related.
Could someone guide me about this issue? Thank you
Here's the code of my ContentView, and function I use to request access, respectively:
import Photos
struct ContentView: View {
@State var showGuide: Bool = false
@State var showInfo: Bool = false
@StateObject private var photoFetcher = PhotoFetcher()
@State private var photoOffset: CGSize = .zero
@State private var animateOut = false
@State private var refreshTrigger = false
@State private var showingShareSheet = false
@State private var itemsToShare: [Any] = []
func likePhoto() {
guard !self.photoFetcher.destinations.isEmpty else { return }
// Some animation later
DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
if !self.photoFetcher.destinations.isEmpty {
let tempPhoto = self.photoFetcher.destinations.removeLast()
self.photoFetcher.destinations.insert(tempPhoto, at: 0)
}
photoOffset = .zero
animateOut = false
}
}
func dislikePhoto() {
guard !self.photoFetcher.destinations.isEmpty else { return }
let deletePhoto = self.photoFetcher.destinations.last?.asset
guard let deleteAsset = deletePhoto else { return }
PHPhotoLibrary.shared().performChanges({
PHAssetChangeRequest.deleteAssets([deleteAsset] as NSArray)
}) { success, error in
DispatchQueue.main.async {
if success {
self.photoFetcher.destinations.removeLast()
self.refreshTrigger.toggle()
} else {
print("Error deleting photo")
}
}
}
}
func sharePhoto() {
guard let topAsset = photoFetcher.destinations.last?.asset else { return }
let manager = PHImageManager.default()
let options = PHImageRequestOptions()
options.version = .current
options.isSynchronous = false
options.deliveryMode = .highQualityFormat
manager.requestImage(for: topAsset, targetSize: CGSize(width: UIScreen.main.bounds.width, height: UIScreen.main.bounds.height), contentMode: .aspectFit, options: options) { image, _ in
if let image = image {
DispatchQueue.main.async {
self.shareImage(image: image)
}
print("shareImage called")
}
}
}
func shareImage(image: UIImage) {
itemsToShare = [image]
showingShareSheet = true
print("Item to share: \(itemsToShare)")
}
var body: some View {
VStack{
Spacer(minLength: 20)
ZStack {
ForEach(photoFetcher.destinations, id: \.id) { oldPic in
CardView(showGuideView: $showGuide, showInfoView: $showInfo, oldPics: oldPic)
}
}
.padding(.horizontal)
Spacer()
CardNav(onLike: {
likePhoto()
}, onDislike: {
dislikePhoto()
}, onShare: {
sharePhoto()
})
Spacer(minLength: 20)
}
.onAppear(perform: photoFetcher.requestPhotoLibraryAccess)
.alert(isPresented: $photoFetcher.showAlert) {
Alert(
title: Text("Permission required"),
message: Text(photoFetcher.alertMessage),
primaryButton: .default(Text("Settings"), action: {
// Open the app's settings
UIApplication.shared.open(URL(string: UIApplication.openSettingsURLString)!, options: [:], completionHandler: nil)
}),
secondaryButton: .cancel()
)
}
.sheet(isPresented: $showingShareSheet) {
ActivityViewController(activityItems: itemsToShare, applicationActivities: nil)
}
}
}
#Preview {
ContentView()
}
PHPhotoLibrary.requestAuthorization {
status in
DispatchQueue.main.async {
if status == .authorized {
self.fetchPhotos()
} else {
// Do something
self.handleDeniedAccess()
}
}
}
}
Hi!
Recently our app had a strange crash. It's about PHPhotoLibrary. We can't reproduce it. And we can sure the user has authorized the album. It has been troubling us for a month.
Has anyone encountered similar problems?
Exception: Access to the photo library is not allowed until the photo library is obtainable.
Here is the crash stack.
Stack trace:
Fatal Exception: NSInternalInconsistencyException
0 CoreFoundation 0xecb28 __exceptionPreprocess
1 libobjc.A.dylib 0x2af78 objc_exception_throw
2 PhotoLibraryServices 0x60d6c -[PLPhotoLibrary loadDatabaseWithOptions:error:]
3 PhotoLibraryServices 0x64dec +[PLPhotoLibrary newPhotoLibraryWithName:loadedFromBundle:options:error:]
4 Photos 0xf2810 newPhotoLibrary
5 Photos 0xf2328 __50-[PHPhotoLibrary initWithPhotoLibraryBundle:type:]_block_invoke_4
6 PhotoLibraryServicesCore 0x446c __27-[PLLazyObject objectValue]block_invoke
7 PhotoLibraryServicesCore 0x43e0 PLResultWithUnfairLock
8 PhotoLibraryServicesCore 0x4388 -[PLLazyObject objectValue]
9 Photos 0xf092c -[PHPhotoLibrary backgroundQueuePhotoLibrary]
10 Photos 0xf0830 -[PHPhotoLibrary photoLibraryForCurrentQueueQoS]
11 Photos 0xf72bc -[PHPhotoLibrary(PXCPLStatus) managedObjectContextForCurrentQueueQoS]
12 Photos 0x17d800 -[PHQuery createFetchRequestIncludingBasePredicate:]
13 Photos 0x17d37c -[PHQuery fetchRequest]
14 Photos 0x1fb2fc -[PHFetchResult initWithQuery:oids:registerIfNeeded:usingManagedObjectContext:]
15 Photos 0x1fb16c -[PHFetchResult initWithQuery:]
16 Photos 0x17bc8c __23-[PHQuery executeQuery]block_invoke
17 Photos 0x17ee74 PHQueryForAssetCollectionType_Album_block_invoke_97
18 Photos 0x17bb98 -[PHQuery executeQuery]
19 Photos 0x3a744 __67+[PHAssetCollection fetchAssetCollectionsWithType:subtype:options:]block_invoke
20 Photos 0x1704e4 +[PHObject authorizationAwareFetchResultWithOptions:fetchBlock:]
21 Photos 0x3a69c +[PHAssetCollection fetchAssetCollectionsWithType:subtype:options:]
22 INSBusinessServices 0x46b994 PHPhotoLibrary.fetchAlbums(withName:) + 391 (PHPhotoLibrary+Custom.swift:391)
23 Insta360 ONE R 0x5008ac closure #1 in Album.loadMediasAtSystemAlbum(notNeedTypeIsVideo:) + 122 (Album+Assets.swift:122)
24 ReactiveSwift 0x19b70 closure #1 in SignalProducer.init(:) + 112 (SignalProducer.swift:112)
25 ReactiveSwift 0x2da50 partial apply for closure #1 in GeneratorCore.makeInstance() + 433 (SignalProducer.swift:433)
26 ReactiveSwift 0x192d8 SignalProducer.startWithSignal(:) + 227 (SignalProducer.swift:227)
27 ReactiveSwift 0x2eb14 partial apply for closure #1 in closure #1 in SignalProducer.start(on:) + 2045 (SignalProducer.swift:2045)
28 INSFoundationKit 0xe5010 partial apply for closure #1 in GCDGloabalScheduler.schedule(:) + 19 (GCDAide.swift:19)
29 INSFoundationKit 0xe50c4 partial apply for closure #1 in static GCDAide.asyncOnGlobal(after::)
30 INSFoundationKit 0x25338 thunk for @escaping @callee_guaranteed () -> ()
31 libdispatch.dylib 0x213c _dispatch_call_block_and_release
32 libdispatch.dylib 0x3dd4 _dispatch_client_callout
33 libdispatch.dylib 0x72d8 _dispatch_continuation_pop
34 libdispatch.dylib 0x68f4 _dispatch_async_redirect_invoke
35 libdispatch.dylib 0x15894 _dispatch_root_queue_drain
36 libdispatch.dylib 0x1609c _dispatch_worker_thread2
37 libsystem_pthread.dylib 0x1ee4 _pthread_wqthread
38 libsystem_pthread.dylib 0x1fc0 start_wqthread
I'm trying to add a video asset to my app's photo library, via drag/drop from the Photos app.
I managed to get the video's URL from the drag, but when I try to create the PHAsset for it I get an error:
PHExternalAssetResource: Unable to issue sandbox extension for /private/var/mobile/Containers/Data/Application/28E04EDD-56C1-405E-8EE0-7842F9082875/tmp/.com.apple.Foundation.NSItemProvider.fXiVzf/IMG_6974.mov
Here's my code to add the asset:
let url = URL(string: videoPath)!
PHPhotoLibrary.shared().performChanges({
PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
}) { saved, error in
// Error !!
}
Addictionally, this check is true in the debugger:
UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(videoPath) == true
Note that adding still images, much in the same way, works fine. And I naturally have photo library permissions enabled.
Any idea what I'm missing?
I'm seeing the same error on iOS17.2 and iPadOS 17.2, with Xcode 15.2.
Thanks for any tips ☺️
Hello. Does anyone have any ideas on how to work with the new iOS 17 Live Photo? I can save the live photo, but I can't set it as wallpaper. Error: "Motion is not available in iOS 17" There are already applications that allow you to do this - VideoToLive and the like. What should I use to implement this with swift language? Most likely the metadata needs to be changed, but I'm not sure.
Im trying to organize photos from a shared photo library so that there is more organization using albums, or folders, or any other logical structure. If you have any suggestions on how I might do this so that everyone using the shared library also has the same organizational view, that would be appreciated. Thanks very much.
In my app I use PhotosPicker to select images. After selection the images the image data will be saved in a CoreData entity - this works fine.
However, When the user wants to add more images and go back to adding photos with PhotosPicker - how can I reference the already added images and show them as selected in PhotosPicker?
The imageIdentifier is not allowed to use, so how can I do get a reference to the selected images to display them as selected in PhotosPicker?
Thanks for any hint!
Hello Apple Developer Community,
I'm excited to make my first post here and am seeking guidance for a feature I'd like to implement in my app. My objective is to enable users to select an image and crop it. Ideally, there should be a visible indicator, like a rectangle, to show the area that will be cropped. Upon clicking the save button, the image would be saved with the selected cropped area.
I'm aiming for functionality to the image editor in the Photos app. Is there a straightforward method or integration for this that adheres to Apple's native frameworks, without resorting to external GitLab repositories?
Thank you in advance for your assistance.
Best regards,
Nicola