PhotoKit

Work with image and video assets managed by the Photos app, including those from iCloud Photos and Live Photos, using PhotoKit.

Posts under PhotoKit tag

87 Posts

Convert a PhotosPickerItem to an Image in SwiftUI on macOS
I'm trying to provide a mechanism for the user to select a background image from their photo library. I've got this code, which in theory is getting the data, but I can't figure out how to convert it to an Image in SwiftUI. I've found plenty of examples for iOS but none that work on macOS. This is my implementation of the PhotosPicker on iOS:

PhotosPicker(selection: $vm.selectedBGphoto, matching: .images) {
    Label("Select Background Image", systemImage: "photo.on.rectangle")
}
.tint(.purple)
.controlSize(.large)
.onChange(of: vm.selectedBGphoto) { newItem in
    Task {
        if let data = try? await newItem?.loadTransferable(type: Data.self) {
            if let uiImage = UIImage(data: data) {
                vm.selectedBGdata = data
                vm.bgImage = Image(uiImage: uiImage)
            }
        }
    }
}

This does NOT work on macOS because of the UIImage reference. How do I convert what I get from PhotosPicker into an Image on macOS?
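On macOS the usual path is to decode the picked data into an NSImage and wrap it with Image(nsImage:). A minimal sketch of that conversion, assuming a view model shaped like the vm in the post (PickerViewModel and its property names are hypothetical):

import SwiftUI
import PhotosUI
import AppKit

// Hypothetical view model mirroring the post's `vm` properties.
@MainActor
final class PickerViewModel: ObservableObject {
    @Published var selectedBGphoto: PhotosPickerItem?
    @Published var selectedBGdata: Data?
    @Published var bgImage: Image?

    // Load the picked item's data and build a SwiftUI Image via NSImage on macOS.
    func loadBackgroundImage() async {
        guard let item = selectedBGphoto,
              let data = try? await item.loadTransferable(type: Data.self),
              let nsImage = NSImage(data: data) else { return }
        selectedBGdata = data
        bgImage = Image(nsImage: nsImage)
    }
}

The same loadTransferable(type: Data.self) call works on both platforms; only the bitmap type differs (UIImage on iOS, NSImage on macOS).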
Replies: 3 · Boosts: 0 · Views: 1.3k · Oct ’23
How to get the ProRAW image output with 1:1, 16:9
Now I use the AVFoundation framework to get the photo output, but the image aspect ratio is 4:3. According to the Camera app on the iPhone 13 Pro, several image aspect ratios are available when taking a ProRAW image: 4:3, 16:9, and 1:1. So how can I get a 1:1 or 16:9 ProRAW image? After doing some research, I found that no matter which camera you use on the iPhone 11, 12, 13, 14, 15, or the Pro models, the captured image is always 4:3, and the 1:1 and 16:9 results come from cropping the 4:3 frame. If that is true, how can I crop the ProRAW file without any data loss? My development environment: iPhone 13 Pro, iOS 16.7, Xcode 14.3.1. This is the session configuration code for the camera device configuration:

session.beginConfiguration()

/*
 Do not create an AVCaptureMovieFileOutput when setting up the session
 because Live Photo is not supported when AVCaptureMovieFileOutput is
 added to the session.
*/
session.sessionPreset = .photo

// Add video input.
do {
    var defaultVideoDevice: AVCaptureDevice?
    if let backCameraDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
        // If a rear dual camera is not available, default to the rear wide angle camera.
        defaultVideoDevice = backCameraDevice
    } else if let frontCameraDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) {
        // If the rear wide angle camera isn't available, default to the front wide angle camera.
        defaultVideoDevice = frontCameraDevice
    }
    guard let videoDevice = defaultVideoDevice else {
        print("Default video device is unavailable.")
        setupResult = .configurationFailed
        session.commitConfiguration()
        return
    }
    let videoDeviceInput = try AVCaptureDeviceInput(device: videoDevice)
    if session.canAddInput(videoDeviceInput) {
        session.addInput(videoDeviceInput)
        self.videoDeviceInput = videoDeviceInput
    } else {
        print("Couldn't add video device input to the session.")
        setupResult = .configurationFailed
        session.commitConfiguration()
        return
    }
} catch {
    print("Couldn't create video device input: \(error)")
    setupResult = .configurationFailed
    session.commitConfiguration()
    return
}

// Check the lens list.
let camerasOptions = videoDeviceDiscoverySession.devices
var availableCameras: [AVCaptureDevice.DeviceType] = []
if camerasOptions.isEmpty {
    print("no camera devices")
} else {
    for camera in camerasOptions {
        if camera.deviceType == .builtInUltraWideCamera || camera.deviceType == .builtInWideAngleCamera || camera.deviceType == .builtInTelephotoCamera {
            if !availableCameras.contains(camera.deviceType) {
                availableCameras.append(camera.deviceType)
            }
        }
    }
}
DispatchQueue.main.async {
    self.lensList = availableCameras
}

// Add the photo output.
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
    photoOutput.isHighResolutionCaptureEnabled = true
    photoOutput.maxPhotoQualityPrioritization = .quality
    print(photoOutput.isAppleProRAWSupported)
    // Use the Apple ProRAW format when the environment supports it.
    photoOutput.isAppleProRAWEnabled = photoOutput.isAppleProRAWSupported
    DispatchQueue.main.async {
        self.isSupportAppleProRaw = self.photoOutput.isAppleProRAWSupported
    }
} else {
    print("Could not add photo output to the session")
    setupResult = .configurationFailed
    session.commitConfiguration()
    return
}

session.commitConfiguration()
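If, as the post suggests, the capture is always the full 4:3 sensor frame, one option is to crop after delivery. A minimal sketch of cropping the delivered DNG with CIRAWFilter, with the caveat that the crop is applied to the demosaiced output (the result is a derived image, not a smaller RAW file); the function name and the centered square crop are illustrative:

import CoreImage
import UniformTypeIdentifiers

// Render a ProRAW (DNG) capture and crop it to 1:1. The crop happens after
// RAW processing, so the output is a rendered bitmap rather than RAW data.
func croppedSquareImage(fromRAWData dngData: Data) -> CGImage? {
    guard let rawFilter = CIRAWFilter(imageData: dngData, identifierHint: UTType.dng.identifier),
          let fullImage = rawFilter.outputImage else { return nil }

    // Center a square crop inside the 4:3 frame.
    let extent = fullImage.extent
    let side = min(extent.width, extent.height)
    let cropRect = CGRect(x: extent.midX - side / 2,
                          y: extent.midY - side / 2,
                          width: side,
                          height: side)

    let context = CIContext()
    return context.createCGImage(fullImage.cropped(to: cropRect), from: cropRect)
}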
Replies: 1 · Boosts: 0 · Views: 471 · Oct ’23
PHPickerViewController: Cannot Load Photos Taken Within 30 Seconds Using iOS 17.0.3 Camera App
In iOS 17.0.3, photos taken using Apple's native camera app can't be loaded immediately (within approximately 30 seconds) through PHPickerViewController. Specifically, the method itemProvider.canLoadObject(ofClass: UIImage.self) returns false. However, after about 30 seconds post-capture, the photos load without any hindrance. Initially, I considered an issue with my own photo-loading code, but the same problem persists even with Apple's official PHPickerDemo sample code (SelectingPhotosAndVideosInIOS.zip): https://developer.apple.com/documentation/photokit/selecting_photos_and_videos_in_ios

ViewController.swift, line 89 (PHPickerDemo):

func displayNext() {
    guard let assetIdentifier = selectedAssetIdentifierIterator?.next() else { return }
    currentAssetIdentifier = assetIdentifier

    let progress: Progress?
    let itemProvider = selection[assetIdentifier]!.itemProvider
    if itemProvider.canLoadObject(ofClass: PHLivePhoto.self) {
        progress = itemProvider.loadObject(ofClass: PHLivePhoto.self) { [weak self] livePhoto, error in
            DispatchQueue.main.async {
                self?.handleCompletion(assetIdentifier: assetIdentifier, object: livePhoto, error: error)
            }
        }
    } else if itemProvider.canLoadObject(ofClass: UIImage.self) { // <== returns FALSE here
        progress = itemProvider.loadObject(ofClass: UIImage.self) { [weak self] image, error in
            DispatchQueue.main.async {
                self?.handleCompletion(assetIdentifier: assetIdentifier, object: image, error: error)
            }
        }
    }
    // ...omitted...
}

Environment & Settings:
iPhone 12
iOS 17.0.3
Settings -> Camera -> Formats -> High Efficiency (Enabled)

Reproduction Steps:
1. Take a photo in normal photo mode with Apple's native camera app (not in portrait mode).
2. Launch the PHPickerDemo app.
3. Tap the photo icon located in the top right.
4. Observe that the photo fails to load.

Workarounds:
- Wait for over 30 seconds prior to selecting the photo.
- Opt to shoot in portrait mode rather than the standard photo mode.
- Switch on Settings -> Camera -> Formats -> Most Compatible.

I am developing a photo editing app, and I have received many emails from users stating that they cannot select photos since updating to iOS 17. Thanks.
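A workaround worth trying in situations like this is to skip the UIImage object loader and ask the item provider for its underlying file representation instead. A minimal sketch, assuming the same itemProvider as in the sample above (the function name and completion handling are illustrative):

import PhotosUI
import UniformTypeIdentifiers

// Fall back to the raw file representation when the UIImage loader reports
// that it cannot load the freshly captured photo.
func loadImageFallback(from itemProvider: NSItemProvider,
                       completion: @escaping (UIImage?) -> Void) {
    let imageType = UTType.image.identifier
    guard itemProvider.hasItemConformingToTypeIdentifier(imageType) else {
        completion(nil)
        return
    }
    itemProvider.loadFileRepresentation(forTypeIdentifier: imageType) { url, error in
        // The URL is only valid for the duration of this closure, so decode immediately.
        let image = url.flatMap { UIImage(contentsOfFile: $0.path) }
        DispatchQueue.main.async { completion(image) }
    }
}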
Replies: 2 · Boosts: 1 · Views: 844 · Oct ’23
How to execute photoLibraryDidChange(_:) when the app is in the background?
The goal is to get/save a photo captured with the default Camera app immediately in my app, while my app is in the background. When I capture a photo with the default camera, the photoLibraryDidChange(_:) function does not execute at that time. But when I reopen my app, the function executes and delivers the images that were captured in the meantime. How can photoLibraryDidChange(_:) be executed while the app is in the background?
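As far as the public API goes, PhotoKit change observers only run while the app process is active; there is no mechanism that launches an app in the background for a photo library change, so the coalesced callback when the app returns to the foreground is the expected behavior. A minimal observer sketch for picking up the accumulated changes (the class name and logging are illustrative):

import Photos
import UIKit

// Observe library changes; a suspended app receives them the next time it runs.
final class LibraryObserver: NSObject, PHPhotoLibraryChangeObserver {
    private var lastFetch: PHFetchResult<PHAsset>

    override init() {
        let options = PHFetchOptions()
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        lastFetch = PHAsset.fetchAssets(with: .image, options: options)
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        guard let details = changeInstance.changeDetails(for: lastFetch) else { return }
        lastFetch = details.fetchResultAfterChanges
        let newAssets = details.insertedObjects   // photos captured since the last callback
        print("New assets:", newAssets.count)
    }

    deinit {
        PHPhotoLibrary.shared().unregisterChangeObserver(self)
    }
}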
Replies: 1 · Boosts: 0 · Views: 372 · Oct ’23
Accessing "From my mac" in PhotoKit
Is it possible to access "From my mac" photos/PHAssetCollection through PhotoKit on iOS? "From my mac" photos/videos are media synced from a Mac to an iOS device with iCloud Photos turned off, like we did in the ol' days before iCloud Photos. I have set up an iOS device with "From my mac" albums present in Photos.app, but in my own app I don't seem to be able to access those collections/photos through PhotoKit using any of the defined PHAssetCollectionType values. Are these directly synced photos simply not available through PhotoKit, so that you would have to revert to the deprecated ALAssetsLibrary?
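For reference, PhotoKit does define album subtypes for Mac-synced content, so one way to probe the question is to fetch them explicitly. A small sketch (whether these fetches return anything on a device using "From my mac" syncing is exactly what the post is asking):

import Photos

// Probe the synced-album subtypes that PhotoKit defines for Mac-synced media.
func fetchSyncedAlbums() -> [PHAssetCollection] {
    let subtypes: [PHAssetCollectionSubtype] = [.albumSyncedAlbum, .albumSyncedEvent, .albumSyncedFaces, .albumImported]
    var collections: [PHAssetCollection] = []
    for subtype in subtypes {
        let result = PHAssetCollection.fetchAssetCollections(with: .album, subtype: subtype, options: nil)
        result.enumerateObjects { collection, _, _ in collections.append(collection) }
    }
    return collections
}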
Replies: 4 · Boosts: 0 · Views: 604 · Nov ’23
UIImageView preferredImageDynamicRange not working
I am trying to display HDR images (ProRAW) within a UIImageView using preferredImageDynamicRange. This was shown in a 2023 WWDC video:

let imageView = UIImageView()
if #available(iOS 17.0, *) {
    self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
}
self.imageView.clipsToBounds = true
self.imageView.isMultipleTouchEnabled = true
self.imageView.contentMode = .scaleAspectFit
self.photoScrollView.addSubview(self.imageView)

I pull the image from PHImageManager:

let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true
PHImageManager.default().requestImage(for: asset, targetSize: self.targetSize(), contentMode: .aspectFit, options: options, resultHandler: { image, info in
    guard let image = image else { return }
    DispatchQueue.main.async {
        self.imageView.image = image
        if #available(iOS 17.0, *) {
            self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
        }
    }
})

Issue: The image shows successfully, yet not in HDR mode (no bright specular highlights, as seen when the same ProRAW image is viewed in the native Camera app). What am I missing here?
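One alternative decode path worth comparing is to request the original image data and decode it with UIImageReader while explicitly asking for high dynamic range. A sketch of that approach (whether it lights up HDR for a given ProRAW asset still depends on the display and on the data PhotoKit hands back; the function name is illustrative):

import Photos
import UIKit

// Request the asset's image data and decode it with an HDR-preferring reader.
@available(iOS 17.0, *)
func requestHDRImage(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true

    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        guard let data else {
            completion(nil)
            return
        }
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true
        let image = UIImageReader(configuration: config).image(data: data)
        DispatchQueue.main.async { completion(image) }
    }
}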
Replies: 1 · Boosts: 0 · Views: 684 · Sep ’23
iPhone system album video cannot play
We have received a lot of user feedback saying that our app caused videos in the user's system photo album to stop playing. We did reproduce this behavior after operating some modules of our app many times. After monitoring the device log, tapping a video in the system album produces roughly the following abnormal output:

VideoContentProvider received result:<AVPlayerItem: 0x281004850, asset = <AVURLAsset: 0x28128fce0, URL = file:///var/mobile/Media/DCIM/100APPLE/IMG_0085.MP4>>, info:{
    PHImageResultRequestIDKey = 316;
}, priority:oneup automatic, strategy:<PXDisplayAssetVideoContentDeliveryStrategy: 0x2836c3000>quality: medium+(med-high), segment:{ nan - nans }, streaming:YES, network:YES, audio:YES, targetSize:{1280, 1280}, displayAsset:8E30C461-B089-4142-82D9-3A8CFF3B5DE9

<PUBrowsingVideoPlayer: 0xc46a59770>
    Asset : <PHAsset: 0xc48f5fc50> 8E30C461-B089-4142-82D9-3A8CFF3B5DE9/L0/001 mediaType=2/524288, sourceType=1, (828x1792), creationDate=2023-07-19 7:36:41 AM +0000, location=0, hidden=0, favorite=0, adjusted=0
    VideoSession : <PXVideoSession 0xc48a1ec50> {
        Content Provider: <PXPhotoKitVideoContentProvider: 0x282d441e0>, Asset <PHAsset: 0xc48f5fc50> 8E30C461-B089-4142-82D9-3A8CFF3B5DE9/L0/001 mediaType=2/524288, sourceType=1, (828x1792), creationDate=2023-07-19 7:36:41 AM +0000, location=0, hidden=0, favorite=0, adjusted=0, Media Provider: <PUPhotoKitMediaProvider: 0x28104da70>
        Desired Play State: Paused
        Play State: Paused
        Stalled: 0
        At Beginning: 1 End: 0
        Playback: ‖ Paus √ b0 a0 s0 l1 f0 e0 r0.0 0.000/60.128
        VideoOutput: (null)
        Got First Pixel Buffer: NO
        Pixel Buffer Frame Drops: 0
        Buffering: 0
    }: Starting disabling of video loading for reason: OutOfFocus

<PUBrowsingVideoPlayer: 0xc46de66e0>
    Asset : <PHAsset: 0xc48f5f1d0> 11ECA95E-0B79-4C7C-97C6-5958EE139BAB/L0/001 mediaType=2/0, sourceType=1, (1080x1920), creationDate=2023-09-21 7:54:46 AM +0000, location=1, hidden=0, favorite=0, adjusted=0
    VideoSession : (null): Starting disabling of video loading for reason: OutOfFocus

I think this message is the important one:

VideoSession : (null): Starting disabling of video loading for reason: OutOfFocus

Restarting the iPhone resolves the anomaly. Do you know the reason, or how to avoid this bug? Similar reports:
https://discussionschinese.apple.com/thread/254766045
https://discussionschinese.apple.com/thread/254787836
Replies: 1 · Boosts: 0 · Views: 377 · Sep ’23
Photos regression in iOS 17 likely: -[PHPhotoLibrary presentLimitedLibraryPickerFromViewController:completionHandler:]: unrecognized selector
When selecting more photos with previous limited authorization, I get this crash on iOS 17.0:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[PHPhotoLibrary presentLimitedLibraryPickerFromViewController:completionHandler:]: unrecognized selector sent to instance 0x105ea2a60'

when using the synchronous and asynchronous methods:

PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: viewController) { newlySelectedPhotoIDs in
    ...
}

OR

let newlySelectedPhotoIDs = await PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: viewController)

Debugger output is unexpected... after all, these methods are in the header...

(lldb) po PHPhotoLibrary.shared().responds(to: #selector(PHPhotoLibrary.presentLimitedLibraryPicker(from:)))
false
(lldb) po PHPhotoLibrary.shared().responds(to: #selector(PHPhotoLibrary.presentLimitedLibraryPicker(from:completionHandler:)))
false
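Until the underlying cause is identified, one defensive pattern is to probe responds(to:) at runtime and skip (or replace) the picker call when the selector is missing, so the app degrades instead of crashing. A minimal sketch (the fallback behavior is illustrative):

import Photos
import UIKit

// Guard against the unrecognized-selector crash by probing the selector first.
func presentLimitedPickerIfAvailable(from viewController: UIViewController) {
    let library = PHPhotoLibrary.shared()
    let selector = #selector(PHPhotoLibrary.presentLimitedLibraryPicker(from:completionHandler:))

    if library.responds(to: selector) {
        library.presentLimitedLibraryPicker(from: viewController) { identifiers in
            print("Newly selected assets:", identifiers)
        }
    } else {
        // Fallback: the selector is unavailable on this device/build, so avoid the call.
        print("presentLimitedLibraryPicker(from:completionHandler:) is unavailable")
    }
}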
Replies: 4 · Boosts: 2 · Views: 1.6k · Oct ’23
Cannot retrieve UIImage from PHPickerViewController, it returns nil.
Hello, after updating a physical device to iOS 17, there seems to be an issue with the image picker's functionality. In our app, even though NSItemProvider's canLoadObject(ofClass: UIImage.self) returns true, loadObject(ofClass: UIImage.self) { (object, error) in ... } consistently returns nil. The same phenomenon may also be occurring with the standard Notes app's image picker, preventing images from being passed to the app.
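One way to narrow this down is to log the error that loadObject reports and fall back to the item provider's data representation, which in some cases succeeds where the object loader does not. A minimal sketch (the fallback and logging are illustrative, not a known fix):

import PhotosUI
import UniformTypeIdentifiers

// If loadObject(ofClass: UIImage.self) hands back nil, try the raw data
// representation and decode it manually, logging whatever error was reported.
func loadImage(from result: PHPickerResult, completion: @escaping (UIImage?) -> Void) {
    let provider = result.itemProvider
    provider.loadObject(ofClass: UIImage.self) { object, error in
        if let image = object as? UIImage {
            DispatchQueue.main.async { completion(image) }
            return
        }
        print("loadObject failed:", error?.localizedDescription ?? "no error")
        provider.loadDataRepresentation(forTypeIdentifier: UTType.image.identifier) { data, dataError in
            let image = data.flatMap { UIImage(data: $0) }
            if image == nil {
                print("loadDataRepresentation failed:", dataError?.localizedDescription ?? "no error")
            }
            DispatchQueue.main.async { completion(image) }
        }
    }
}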
Replies: 3 · Boosts: 1 · Views: 559 · Oct ’23
[PHImageManager requestImage] crash only on iOS 17.0
Unhandled error (NSCocoaErrorDomain, 134093) occurred during faulting and was thrown: Error Domain=NSCocoaErrorDomain Code=134093 "(null)"
Fatal Exception: NSInternalInconsistencyException
0  CoreFoundation           0xed5e0   __exceptionPreprocess
1  libobjc.A.dylib          0x2bc00   objc_exception_throw
2  CoreData                 0x129c8   _PFFaultHandlerLookupRow
3  CoreData                 0x11d60   _PF_FulfillDeferredFault
4  CoreData                 0x11c58   _pvfk_header
5  CoreData                 0x98e64   _sharedIMPL_pvfk_core_c
6  PhotoLibraryServices     0x6d8b0   -[PLInternalResource orientation]
7  PhotoLibraryServices     0x6d7bc   -[PLInternalResource orientedWidth]
8  Photos                   0x147e74  ___presentFullResourceAtIndex_block_invoke
9  PhotoLibraryServices     0x174ee4  __53-[PLManagedObjectContext _directPerformBlockAndWait:]_block_invoke
10 CoreData                 0x208ec   developerSubmittedBlockToNSManagedObjectContextPerform
11 libdispatch.dylib        0x4300    _dispatch_client_callout
12 libdispatch.dylib        0x136b4   _dispatch_lane_barrier_sync_invoke_and_complete
13 CoreData                 0x207f8   -[NSManagedObjectContext performBlockAndWait:]
14 PhotoLibraryServices     0x174e98  -[PLManagedObjectContext _directPerformBlockAndWait:]
15 PhotoLibraryServices     0x1738c8  -[PLManagedObjectContext performBlockAndWait:]
16 Photos                   0x147d30  _presentFullResourceAtIndex
17 Photos                   0x1476bc  PHChooserListContinueEnumerating
18 Photos                   0x1445e0  -[PHImageResourceChooser presentNextQualifyingResource]
19 Photos                   0x2ea74   -[PHImageRequest startRequest]
20 Photos                   0x3f2c0   -[PHMediaRequestContext _registerAndStartRequests:]
21 Photos                   0x3e484   -[PHMediaRequestContext start]
22 Photos                   0x1f0710  -[PHImageManager runRequestWithContext:]
23 Photos                   0x1efdb0  -[PHImageManager requestImageDataAndOrientationForAsset:options:resultHandler:]
24 TeraBox                  0x2497f0c closure #1 in LocalPhotoLibManager.getDataFrom(_:_:) + 549 (LocalPhotoLibManager.swift:549)
25 TeraBox                  0x1835fc4 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
26 TeraBox                  0x1cb1288 +[DuboxOCException tryOC:catchException:] + 18 (DuboxOCException.m:18)
27 TeraBox                  0x249b4d4 specialized LocalPhotoLibManager.convert(with:_:) + 548 (LocalPhotoLibManager.swift:548)
28 TeraBox                  0x2493b24 closure #1 in closure #1 in closure #1 in LocalPhotoLibManager.scanAlbumUpdateLocalphotoTable(_:) + 173 (LocalPhotoLibManager.swift:173)
29 TeraBox                  0x1835fc4 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
30 libdispatch.dylib        0x26a8    _dispatch_call_block_and_release
31 libdispatch.dylib        0x4300    _dispatch_client_callout
32 libdispatch.dylib        0x744c    _dispatch_queue_override_invoke
33 libdispatch.dylib        0x15be4   _dispatch_root_queue_drain
34 libdispatch.dylib        0x163ec   _dispatch_worker_thread2
35 libsystem_pthread.dylib  0x1928    _pthread_wqthread
36 libsystem_pthread.dylib  0x1a04    start_wqthread
Replies: 6 · Boosts: 5 · Views: 1.5k · Nov ’23
Crash in photolibraryd Process During Import in Photos.app & PHAsset.fetchAssets() Call
Problem: While calling PHAsset.fetchAssets() and iterating over its results, if the Photos.app is simultaneously running an import operation (File | Import), the photolibraryd process crashes. I have already flagged this issue to Apple (FB13178379) but wanted to check if anyone else has encountered this behavior.

Steps to Reproduce:
1. Initiate an import in the Photos.app.
2. Run the following code snippet in the Playground:

import Photos

PHPhotoLibrary.requestAuthorization(for: .readWrite) { authorizationStatus in
    guard authorizationStatus == .authorized else { return }

    let fetchResult = PHAsset.fetchAssets(with: nil)
    print(fetchResult)
    for i in 0..<fetchResult.count {
        print(fetchResult[i])
    }
}

Upon doing this, I consistently receive the error: Connection to assetsd was interrupted - photolibraryd exited, died, or closed the photo library in the Console, causing my code to hang.

Environment:
macOS Version: 13.5.2 (22G91)
Xcode Version: Version 14.3.1 (14E300c)

Additional Info: After the crash, a report pops up in the Console, and typically, the Photos.app import operation freezes too. I've noticed that after terminating all processes and/or rebooting, the Photos.app displays "Restoring from iCloud…" and the recovery process lasts overnight.

Seeking Suggestions: I'm exploring potential workarounds for this issue. I've attempted to use fetchLimit to obtain assets in batches, but the problem persists. Is there a method to detect that the Photos.app is executing an import, allowing my process to wait until completion? Alternatively, can I catch the photolibraryd crash and delay until it's restored? I'm operating in a batch processing mode for photos, so pausing and retrying later in the event of a Photos.app import isn't an issue. Any guidance or shared experiences would be greatly appreciated! Cheers and happy coding!
Replies: 1 · Boosts: 0 · Views: 511 · Sep ’23
Creation of live photos from existing images and video files
The new Live Photos format works in iOS 17 with captures from the camera only. We need an API or method to generate Live Photos from existing images and videos without capturing new ones. In the past this was possible by matching a .jpg and a .mov with the same identifier, but now the PhotoData format is not documented, nor is the exact way we can combine an image (or series of images) with a video file and convert them to a Live Photo outside an AVCaptureSession. Is it an intentional restriction of Live Photos and the new PhotoKit that users can only use camera-captured sessions, or will there be a method that lets users take their existing movies and generate Live Photos on iOS 17?
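For context, the "matching identifier" approach the post refers to ends with a single library save: a still and a paired video that share a content identifier in their metadata can be added together through PHAssetCreationRequest. A minimal sketch of that save step, assuming the JPEG and MOV have already been tagged with the same identifier (which is the part the post says is now undocumented):

import Photos

// Save a still + paired video as one Live Photo asset. This assumes both files
// already carry a matching content identifier in their metadata; without that,
// the library treats them as separate assets.
func saveLivePhoto(imageURL: URL, pairedVideoURL: URL,
                   completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCreationRequest.forAsset()
        let options = PHAssetResourceCreationOptions()
        request.addResource(with: .photo, fileURL: imageURL, options: options)
        request.addResource(with: .pairedVideo, fileURL: pairedVideoURL, options: options)
    }, completionHandler: completion)
}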
Replies: 0 · Boosts: 1 · Views: 664 · Sep ’23
Photos App: Re-indexing of edited photos
I would like to use a third-party app to edit the metadata of a photo to change its Caption and then be able to search in the Photos app to find that image with the edited caption. I have managed to do this by duplicating the photo with the edited metadata. The Photos app recognizes it as a new photo and indexes it with the new caption, making it searchable. However, when editing the photo in-place, the Photos app will not re-index the photo, therefore it will not be searchable. Is there a way to edit photos in-place and have them searchable with the new metadata?
Replies: 1 · Boosts: 0 · Views: 641 · Sep ’23
Understanding PHPickerConfiguration.AssetRepresentationMode.current
The documentation for this API mentions: The system uses the current representation and avoids transcoding, if possible. What are the scenarios in which transcoding takes place? The reason for asking is that we've had a user reaching out saying they selected a video file from their Photos app, which resulted in a decrease in size from ~110MB to 35MB. We find it unlikely it's transcoding-related, but we want to gain more insights into the possible scenarios.
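For reference, the mode in question is set on the picker configuration. A minimal sketch of a picker that asks for the asset's current representation, so the system avoids transcoding when it can (the function name and video filter are illustrative):

import PhotosUI

// Configure a PHPicker that requests the asset's current representation.
func makeCurrentRepresentationPicker(delegate: PHPickerViewControllerDelegate) -> PHPickerViewController {
    var configuration = PHPickerConfiguration(photoLibrary: .shared())
    configuration.filter = .videos
    configuration.preferredAssetRepresentationMode = .current
    let picker = PHPickerViewController(configuration: configuration)
    picker.delegate = delegate
    return picker
}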
Replies: 3 · Boosts: 1 · Views: 636 · Sep ’23
How to detect screenshots on iOS 17 beta?
I am able to detect a screenshot only after it has been taken, via UIApplication.userDidTakeScreenshotNotification. But my requirement is to detect it at the moment the user is taking the screenshot, so that I can hide my confidential data. Please comment if there are any better solutions or any callback functions available.
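As far as the public API goes, userDidTakeScreenshotNotification fires only after the fact, and UIScreen.capturedDidChangeNotification together with isCaptured covers screen recording and mirroring rather than still screenshots. A minimal sketch of observing both signals (it does not provide a pre-screenshot callback; the class name is illustrative):

import UIKit

// Observe the two public capture signals: screenshots (after the fact) and
// screen recording/mirroring state changes.
final class CaptureMonitor {
    private var observers: [NSObjectProtocol] = []

    init(onScreenshot: @escaping () -> Void, onRecordingChange: @escaping (Bool) -> Void) {
        let center = NotificationCenter.default
        observers.append(center.addObserver(forName: UIApplication.userDidTakeScreenshotNotification,
                                            object: nil, queue: .main) { _ in
            onScreenshot()   // fires after the screenshot has been captured
        })
        observers.append(center.addObserver(forName: UIScreen.capturedDidChangeNotification,
                                            object: nil, queue: .main) { _ in
            onRecordingChange(UIScreen.main.isCaptured)   // recording / mirroring, not stills
        })
    }

    deinit {
        observers.forEach { NotificationCenter.default.removeObserver($0) }
    }
}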
Replies: 1 · Boosts: 0 · Views: 1.1k · Aug ’23