The goal is to get/save a photo captured with the default Camera app immediately from my app while my app is in the background.
When I capture a photo with the default camera, photoLibraryDidChange(_:) does not execute at that moment. Only when I reopen my app does the function execute and deliver the images that were captured in the meantime.
How can I get photoLibraryDidChange(_:) to execute while the app is in the background?
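For reference, this is a minimal sketch of how I register the observer (class and property names here are placeholders of my own, not from a specific sample):

```swift
import Photos

final class LibraryObserver: NSObject, PHPhotoLibraryChangeObserver {
    private var allPhotos: PHFetchResult<PHAsset>

    override init() {
        allPhotos = PHAsset.fetchAssets(with: .image, options: nil)
        super.init()
        // Change notifications are delivered only while the process is running;
        // nothing fires while the app is suspended in the background.
        PHPhotoLibrary.shared().register(self)
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        guard let details = changeInstance.changeDetails(for: allPhotos) else { return }
        allPhotos = details.fetchResultAfterChanges
        for asset in details.insertedObjects {
            print("New asset:", asset.localIdentifier)
        }
    }

    deinit {
        PHPhotoLibrary.shared().unregisterChangeObserver(self)
    }
}
```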
Photos and Imaging
Integrate still images and other forms of photography into your apps.
Hi developers. I'm currently developing an iOS camera app, and I need to add a feature like Photographic Styles (as on iPhone 13): I need only that paged selection view, not the filters themselves. I used a UIPageViewController with a swipe gesture, but while the page view is shown, the main camera view's functions also keep running. I also tried a button that, when pressed, shows a Photographic Styles-like view, but I can't get it working. If anyone can help, please reply. Thanks in advance.
I am trying to locate information or documentation on how to pull in photos from the iCloud Shared Albums, but have not been able to find anything yet. Dakboard is currently doing it so I know it is possible, but I cannot find an API or any documentation covering how to access the photos in a Shared Album for incorporation into web applications. Can anyone help?
With AVFoundation, how do you set up the new 24MP capture on the new iPhone 15 models?
I expected to find it in the videoDevice.activeFormat.supportedMaxPhotoDimensions array, but it is not there.
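For context, this is roughly how I am querying and applying photo dimensions (a sketch; the device and photo output setup are elided, and picking the largest supported dimension is my own assumption about how 24MP should be enabled):

```swift
import AVFoundation

func configureMaxPhotoDimensions(device: AVCaptureDevice,
                                 photoOutput: AVCapturePhotoOutput) {
    // List every photo dimension the active format claims to support.
    let supported = device.activeFormat.supportedMaxPhotoDimensions
    for dims in supported {
        print("Supported: \(dims.width)x\(dims.height)")
    }

    // Apply the largest supported dimension to the photo output.
    if let largest = supported.max(by: { $0.width * $0.height < $1.width * $1.height }) {
        photoOutput.maxPhotoDimensions = largest
    }
}
```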
The latest iPhone 15 Pro models support additional focal lengths on the main 24mm (1x) lens: 28mm ("1.2x") and 35mm ("1.5x"). These are supposed to use data from the full sensor to achieve optical quality images (i.e. no upscaling), so I would expect these new focal lengths to appear in the secondaryNativeResolutionZoomFactors array, just like the 2x option does. However, the activeFormat.secondaryNativeResolutionZoomFactors property still only reports [2.0] when using the main 1x lens. Is this an oversight, or is there something special (other than setting the zoom factor) we need to do to access the high-quality 28mm and 35mm modes? I'm wary of simply setting 1.2 or 1.5 as the zoom factor, as that isn't truly the ratio between the base 24mm and the virtual focal lengths.
I am trying to display HDR images (ProRAW) in a UIImageView using preferredImageDynamicRange, as shown in a 2023 WWDC video:
let imageView = UIImageView()
if #available(iOS 17.0, *) {
    self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
}
self.imageView.clipsToBounds = true
self.imageView.isMultipleTouchEnabled = true
self.imageView.contentMode = .scaleAspectFit
self.photoScrollView.addSubview(self.imageView)
I pull the image from PHImageManager:
let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true
PHImageManager.default().requestImage(for: asset, targetSize: self.targetSize(), contentMode: .aspectFit, options: options, resultHandler: { image, info in
    guard let image = image else {
        return
    }
    DispatchQueue.main.async {
        self.imageView.image = image
        if #available(iOS 17.0, *) {
            self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
        }
    }
})
Issue
The image shows successfully, yet not in HDR mode (no bright specular highlights, as seen when the same ProRAW image is viewed in the native Camera app).
What am I missing here?
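One thing I am considering trying (a sketch, not confirmed as the intended approach: it assumes requesting the raw image data instead of a decoded UIImage, then decoding with iOS 17's UIImageReader, whose configuration can opt into high dynamic range):

```swift
import Photos
import UIKit

func requestHDRImage(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true

    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        guard let data = data else { completion(nil); return }
        if #available(iOS 17.0, *) {
            // Decode with a reader configured to preserve high dynamic range.
            var config = UIImageReader.Configuration()
            config.prefersHighDynamicRange = true
            let reader = UIImageReader(configuration: config)
            completion(reader.image(data: data))
        } else {
            completion(UIImage(data: data))
        }
    }
}
```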
Unhandled error (NSCocoaErrorDomain, 134093) occurred during faulting and was thrown: Error Domain=NSCocoaErrorDomain Code=134093 "(null)"
Fatal Exception: NSInternalInconsistencyException
0 CoreFoundation 0xed5e0 __exceptionPreprocess
1 libobjc.A.dylib 0x2bc00 objc_exception_throw
2 CoreData 0x129c8 _PFFaultHandlerLookupRow
3 CoreData 0x11d60 _PF_FulfillDeferredFault
4 CoreData 0x11c58 _pvfk_header
5 CoreData 0x98e64 _sharedIMPL_pvfk_core_c
6 PhotoLibraryServices 0x6d8b0 -[PLInternalResource orientation]
7 PhotoLibraryServices 0x6d7bc -[PLInternalResource orientedWidth]
8 Photos 0x147e74 ___presentFullResourceAtIndex_block_invoke
9 PhotoLibraryServices 0x174ee4 __53-[PLManagedObjectContext _directPerformBlockAndWait:]_block_invoke
10 CoreData 0x208ec developerSubmittedBlockToNSManagedObjectContextPerform
11 libdispatch.dylib 0x4300 _dispatch_client_callout
12 libdispatch.dylib 0x136b4 _dispatch_lane_barrier_sync_invoke_and_complete
13 CoreData 0x207f8 -[NSManagedObjectContext performBlockAndWait:]
14 PhotoLibraryServices 0x174e98 -[PLManagedObjectContext _directPerformBlockAndWait:]
15 PhotoLibraryServices 0x1738c8 -[PLManagedObjectContext performBlockAndWait:]
16 Photos 0x147d30 _presentFullResourceAtIndex
17 Photos 0x1476bc PHChooserListContinueEnumerating
18 Photos 0x1445e0 -[PHImageResourceChooser presentNextQualifyingResource]
19 Photos 0x2ea74 -[PHImageRequest startRequest]
20 Photos 0x3f2c0 -[PHMediaRequestContext _registerAndStartRequests:]
21 Photos 0x3e484 -[PHMediaRequestContext start]
22 Photos 0x1f0710 -[PHImageManager runRequestWithContext:]
23 Photos 0x1efdb0 -[PHImageManager requestImageDataAndOrientationForAsset:options:resultHandler:]
24 TeraBox 0x2497f0c closure #1 in LocalPhotoLibManager.getDataFrom(_:_:) + 549 (LocalPhotoLibManager.swift:549)
25 TeraBox 0x1835fc4 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
26 TeraBox 0x1cb1288 +[DuboxOCException tryOC:catchException:] + 18 (DuboxOCException.m:18)
27 TeraBox 0x249b4d4 specialized LocalPhotoLibManager.convert(with:_:) + 548 (LocalPhotoLibManager.swift:548)
28 TeraBox 0x2493b24 closure #1 in closure #1 in closure #1 in LocalPhotoLibManager.scanAlbumUpdateLocalphotoTable(_:) + 173 (LocalPhotoLibManager.swift:173)
29 TeraBox 0x1835fc4 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
30 libdispatch.dylib 0x26a8 _dispatch_call_block_and_release
31 libdispatch.dylib 0x4300 _dispatch_client_callout
32 libdispatch.dylib 0x744c _dispatch_queue_override_invoke
33 libdispatch.dylib 0x15be4 _dispatch_root_queue_drain
34 libdispatch.dylib 0x163ec _dispatch_worker_thread2
35 libsystem_pthread.dylib 0x1928 _pthread_wqthread
36 libsystem_pthread.dylib 0x1a04 start_wqthread
In my app I get a UIImage for a PHAsset via PHImageManager.requestImage(for:targetSize:contentMode:options:resultHandler:). I directly display that image in a UIImageView that has preferredImageDynamicRange set to .high. The problem is I do not see the high dynamic range.
I see the HDRDemo23 sample code uses PhotosPicker to get a UIImage from Data through UIImageReader whose config enables prefersHighDynamicRange.
Is there a way to support HDR when using the Photos APIs to request display images?
And is there support for PHLivePhoto displayed in PHLivePhotoView retrieved via PHImageManager.requestLivePhoto?
Environment: iOS 16 beta 2, beta 3. iPhone 11 Pro, 12 mini
Steps to reproduce:
Subscribe to Photo Library changes via PHPhotoLibraryChangeObserver and add some logs to track inserted/deleted objects:
func photoLibraryDidChange(_ changeInstance: PHChange) {
    if let changeDetails = changeInstance.changeDetails(for: allPhotosFetchResult) {
        for insertion in changeDetails.insertedObjects {
            print("🥶 INSERTED: ", insertion.localIdentifier)
        }
        for deletion in changeDetails.removedObjects {
            print("🥶 DELETED: ", deletion.localIdentifier)
        }
    }
}
Save a photo to camera roll with PHAssetCreationRequest
Go to the Photo Library, delete the newly saved photo
Come back to the app and watch the logs:
🥶 INSERTED: 903933C3-7B83-4212-8DF1-37C2AD3A923D/L0/001
🥶 DELETED: 39F673E7-C5AC-422C-8BAA-1BF865120BBF/L0/001
Expected result: the localIdentifier in the INSERTED log matches the one in the DELETED log, since it is the same asset.
In fact: the two identifiers differ.
So it appears that either an asset's localIdentifier changes after a successful save, or this is a bug in the Photos framework on iOS 16. I've checked: on iOS 15 it works fine (the IDs in the logs match).
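For reference, this is roughly how I capture the identifier at save time (a sketch; `imageData` is a placeholder of mine): the placeholder's localIdentifier is the string I then expect to see again in the change logs.

```swift
import Photos

func saveImage(_ imageData: Data, completion: @escaping (String?) -> Void) {
    var placeholderID: String?
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCreationRequest.forAsset()
        request.addResource(with: .photo, data: imageData, options: nil)
        // Capture the identifier the asset should have once the save succeeds.
        placeholderID = request.placeholderForCreatedAsset?.localIdentifier
    }) { success, _ in
        completion(success ? placeholderID : nil)
    }
}
```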
Hello!
I am playing around with PHPickerViewController. So far I have been able to load the selected images into UIImage instances, but I don't know how to get a selected video.
Below is the relevant implementation of the method: func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]):
let provider = result.itemProvider
guard provider.hasItemConformingToTypeIdentifier(AVFileType.mov.rawValue) else { return }
provider.loadItem(forTypeIdentifier: AVFileType.mov.rawValue, options: nil) { (fileURL, error) in
    if let error = error {
        print(error)
        return
    }
    guard let videoURL = fileURL as? URL else { return }
    DispatchQueue.main.async {
        let fm = FileManager.default
        let destination = fm.temporaryDirectory.appendingPathComponent("video123.mov")
        try! fm.copyItem(at: videoURL, to: destination)
        let playerVC = AVPlayerViewController()
        playerVC.player = AVPlayer(url: destination)
        self.present(playerVC, animated: true, completion: nil)
    }
}
I get a crash when trying to copy the item. It says the source file does not exist, but the path looks valid to me:
"The file “3C2BCCBC-4474-491B-90C2-93DF848AADF5.mov” couldn’t be opened because there is no such file." I tried it without copying first, just passing the URL to AVPlayer, but nothing would play.
I am testing this on a simulator.
Thanks for help!
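A sketch of an alternative I would try (assuming the provided URL is only valid for the lifetime of the completion handler, so the copy must happen synchronously inside it rather than after dispatching to the main queue): loadFileRepresentation hands you a temporary file that you copy before the handler returns.

```swift
import UniformTypeIdentifiers
import AVKit

provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
    guard let url = url, error == nil else { return }
    // The temporary file is deleted when this handler returns,
    // so copy it synchronously here, not inside an async block.
    let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent("video123.mov")
    try? FileManager.default.removeItem(at: destination)
    do {
        try FileManager.default.copyItem(at: url, to: destination)
    } catch {
        print(error)
        return
    }
    DispatchQueue.main.async {
        let playerVC = AVPlayerViewController()
        playerVC.player = AVPlayer(url: destination)
        self.present(playerVC, animated: true)
    }
}
```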
Running on:
iMac 27" 5K (Late 2015), 64 GB RAM, with a 16 TB Pegasus Promise2 R4 RAID 5 via Thunderbolt.
After trying Big Sur, I found issues with the Luminar photo app and decided to return to Catalina on the iMac.
I reformatted my internal drive, reinstalled Catalina 10.15.5, and reformatted the RAID. But I keep getting the following message upon restarting:
"Incompatible Disk. This disk uses features that are not supported on this version of MacOS" and my Pegasus2 R4 portion no longer appears on the desktop or in Disk Utility...
I looked into this and discovered it may be an issue of Mac OS Extended vs. APFS.
The iMac was formatted as APFS prior to installing OS 11, so I reformatted to APFS when returning to Catalina. The issues persisted, so I re-reformatted from a bootable USB, this time to Mac OS Extended (Journaled), and the issue seems to be resolved.
The iMac runs slower on Mac OS Extended, but it is running and the RAID is recognised...
I'd love to go back to APFS but am afraid it will "break" things.
Any thoughts on this would be welcome.
Thanks
Nick