Post not yet marked as solved
Running on:
iMac 27" 5k late 2015 - 64gb ram and a 16tb Pegasus Promise2 R4 raid5 via Thunderbolt.
After trying Big Sur, I found issues with the Luminar photo app and decided to return to Catalina on the iMac.
I reformatted my internal drive, reinstalled Catalina 10.15.5, and reformatted the RAID. But I keep getting the following message upon restarting:
"Incompatible Disk. This disk uses features that are not supported on this version of MacOS" and my Pegasus2 R4 portion no longer appears on the desktop or in Disk Utility...
I looked into this and discovered that it may be an issue of Mac OS Extended vs. APFS.
The iMac had been formatted as APFS prior to installing macOS 11, so I reformatted to APFS when returning to Catalina. The issues persisted, so I re-reformatted from a bootable USB, this time to Mac OS Extended (Journaled), and the issues seem to be resolved.
The iMac runs slower on Mac OS Extended, but it is running and the RAID is recognised...
I'd love to go back to APFS but am afraid it will "break" things.
Any thoughts on this would be welcome.
Thanks
Nick
Hello!
I am playing around with PHPickerViewController, and so far I have been able to get the selected images by loading them into UIImage instances, but I don't know how to get the selected video.
Below is the relevant implementation of the method: func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]):
let provider = result.itemProvider
guard provider.hasItemConformingToTypeIdentifier(AVFileType.mov.rawValue) else { return }
provider.loadItem(forTypeIdentifier: AVFileType.mov.rawValue, options: nil) { (fileURL, error) in
    if let error = error {
        print(error)
        return
    }
    guard let videoURL = fileURL as? URL else { return }
    DispatchQueue.main.async {
        let fm = FileManager.default
        let destination = fm.temporaryDirectory.appendingPathComponent("video123.mov")
        try! fm.copyItem(at: videoURL, to: destination)
        let playerVC = AVPlayerViewController()
        playerVC.player = AVPlayer(url: destination)
        self.present(playerVC, animated: true, completion: nil)
    }
}
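As an aside, the URL that loadItem hands back typically points at a temporary file that is only guaranteed to exist while the completion handler runs, so deferring the copy to the main queue may be why the source file is gone by the time the copy happens. A minimal sketch of copying synchronously inside the handler (the helper name `makeStableCopy` is illustrative, not an API):

```swift
import Foundation

// Hypothetical helper: copy a provider-supplied temporary file to a stable
// location synchronously, before the completion handler returns and the
// source file may be cleaned up.
func makeStableCopy(of sourceURL: URL, named name: String) throws -> URL {
    let fm = FileManager.default
    let destination = fm.temporaryDirectory.appendingPathComponent(name)
    // Remove any stale copy left over from a previous run.
    if fm.fileExists(atPath: destination.path) {
        try fm.removeItem(at: destination)
    }
    try fm.copyItem(at: sourceURL, to: destination)
    return destination
}
```

Inside the loadItem closure this would be called first, and only the returned destination URL passed to DispatchQueue.main.async for the AVPlayer setup.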
I get a crash trying to copy the item. It says the source file does not exist, but the path looks real to me.
"The file “3C2BCCBC-4474-491B-90C2-93DF848AADF5.mov” couldn’t be opened because there is no such file." I tried it without copying first, just passing the URL to AVPlayer, but nothing would play.
I am testing this on a simulator.
Thanks for the help!
I can't save GIFs; it just saves a single frame as a photo.
Hi, I'm wondering if it is possible to manipulate "other" Photos libraries with PhotoKit, not only the System one.
The PhotoKit documentation only mentions the PHPhotoLibrary class, which is "a shared object that manages access and changes to the user’s shared photo library". But the Photos app on macOS allows creating and switching between multiple photo libraries.
Thanks.
I use the following code to parse Photo metadata and this works well. However, I am unable to pull the new iOS 14 "caption" from this metadata (it worked in early iOS 14 betas, but has since stopped working in the GM.) Does anyone know how I can get the caption data from a PHAsset? Thanks!
Stephen
let options = PHContentEditingInputRequestOptions()
options.isNetworkAccessAllowed = true
asset.requestContentEditingInput(with: options, completionHandler: { (contentEditingInput, _) -> Void in
    if let url = contentEditingInput?.fullSizeImageURL {
        let fullImage = CIImage(contentsOf: url)
        // Get all the metadata.
        self.allPhotoMetadata = fullImage?.properties ?? [:]
        // {TIFF}
        if let tiffDict = self.allPhotoMetadata["{TIFF}"] as? [String: Any] {
            if tiffDict["Make"] != nil {
                self.cameraData[cameraKeys.make] = tiffDict["Make"]
            }
            if tiffDict["Model"] != nil {
                self.cameraData[cameraKeys.model] = tiffDict["Model"]
            }
            if tiffDict["ImageDescription"] != nil {
                self.imageData[imageKeys.caption] = tiffDict["ImageDescription"]
            }
        }
        // {IPTC}
        if let iptcDict = self.allPhotoMetadata["{IPTC}"] as? [String: Any] {
            // If we didn't find a caption in the TIFF dict, try to get it from the IPTC data:
            // first try Caption/Abstract, then ArtworkContentDescription.
            if self.imageData[imageKeys.caption] == nil {
                if iptcDict["Caption/Abstract"] != nil {
                    self.imageData[imageKeys.caption] = iptcDict["Caption/Abstract"]
                } else if iptcDict["ArtworkContentDescription"] != nil {
                    self.imageData[imageKeys.caption] = iptcDict["ArtworkContentDescription"]
                }
            }
        }
    }
})
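For what it's worth, the TIFF-then-IPTC fallback above can be factored into a small priority-order lookup. This is a sketch only; `firstCaption` is an illustrative name, not part of PhotoKit:

```swift
import Foundation

// Try each (dictionary, key) pair in priority order and return the first
// string value found, mirroring the TIFF-then-IPTC fallback.
func firstCaption(in candidates: [([String: Any], String)]) -> String? {
    for (dict, key) in candidates {
        if let value = dict[key] as? String {
            return value
        }
    }
    return nil
}
```

With the dictionaries from above, this would be called as `firstCaption(in: [(tiffDict, "ImageDescription"), (iptcDict, "Caption/Abstract"), (iptcDict, "ArtworkContentDescription")])`.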
I have been unable to capture Live Photos using UIImagePickerController. I can capture still photos and even video (which is not my scenario but I checked just to make sure), but the camera does not capture live photos. The documentation suggests it should (source):
To obtain the motion and sound content of a live photo for display (using the PHLivePhotoView class), include the kUTTypeImage and kUTTypeLivePhoto identifiers in the allowed media types when configuring an image picker controller. When the user picks or captures a Live Photo, the editingInfo dictionary contains the livePhoto key, with a PHLivePhoto representation of the photo as the corresponding value.
I've set up my controller:
let camera = UIImagePickerController()
camera.sourceType = .camera
camera.mediaTypes = [UTType.image.identifier, UTType.livePhoto.identifier]
camera.delegate = context.coordinator
In the delegate I check for the Live Photo:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    if let live = info[.livePhoto] as? PHLivePhoto {
        // Handle the Live Photo.
    } else if let takenImage = info[.originalImage] as? UIImage, let metadata = info[.mediaMetadata] as? [AnyHashable: Any] {
        // Handle the still photo.
    }
}
But I never get the Live Photo.
I've tried adding NSMicrophoneUsageDescription to the Info.plist, thinking it needs microphone permission, but that did not help. Of course, I've added NSCameraUsageDescription to grant camera permission.
Has anyone successfully captured Live Photos using UIImagePickerController?
Environment: iOS 16 beta 2, beta 3. iPhone 11 Pro, 12 mini
Steps to reproduce:
Subscribe to Photo Library changes via PHPhotoLibraryChangeObserver, put some logs to track inserted/deleted objects:
func photoLibraryDidChange(_ changeInstance: PHChange) {
    if let changeDetails = changeInstance.changeDetails(for: allPhotosFetchResult) {
        for insertion in changeDetails.insertedObjects {
            print("🥶 INSERTED: ", insertion.localIdentifier)
        }
        for deletion in changeDetails.removedObjects {
            print("🥶 DELETED: ", deletion.localIdentifier)
        }
    }
}
Save a photo to camera roll with PHAssetCreationRequest
Go to the Photo Library, delete the newly saved photo
Come back to the app and watch the logs:
🥶 INSERTED: 903933C3-7B83-4212-8DF1-37C2AD3A923D/L0/001
🥶 DELETED: 39F673E7-C5AC-422C-8BAA-1BF865120BBF/L0/001
Expected result: localIdentifier of the saved and deleted asset is the same string in both logs.
In fact: It's different.
So it appears that either the localIdentifier of an asset gets changed after a successful save, or it's a bug in the Photos framework in iOS 16. I've checked: in iOS 15 it works fine (the IDs in the logs match).
Since iOS 16 introduced shared albums, is there an API for developers to access the assets inside a shared album?
Is this accessible from Swift directly?
Visual Look Up
Lift subject from background
Lift the subject from an image or isolate the subject by removing the background. This works in Photos, Screenshot, Quick Look, Safari, and more.
Source: macOS Ventura Preview - New Features - Apple
I see that Shortcuts now has a native Remove Background command that wasn't there in iOS 15 or macOS 12. Is there any way to call that from Swift besides x-callback URL schemes?
We observed that the PHPicker is unable to load RAW images captured on an iPhone in some scenarios. And it is also somehow related to iCloud.
Here is the setup:
The PHPickerViewController is configured with preferredAssetRepresentationMode = .current to avoid transcoding.
The image is loaded from the item provider like this:
if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) {
    itemProvider.loadFileRepresentation(forTypeIdentifier: kUTTypeImage as String) { url, error in
        // work
    }
}
This usually works, also for RAW images. However, when trying to load a RAW image that has just been captured with the iPhone, the loading fails with the following errors on the console:
[claims] 43A5D3B2-84CD-488D-B9E4-19F9ED5F39EB grantAccessClaim reply is an error: Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x2804a8e70 {Error Domain=NSCocoaErrorDomain Code=4097 "connection from pid 19420 on anonymousListener or serviceListener" UserInfo={NSDebugDescription=connection from pid 19420 on anonymousListener or serviceListener}}}
Error copying file type public.image. Error: Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.image" UserInfo={NSLocalizedDescription=Cannot load representation of type public.image, NSUnderlyingError=0x280480540 {Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x2804a8e70 {Error Domain=NSCocoaErrorDomain Code=4097 "connection from pid 19420 on anonymousListener or serviceListener" UserInfo={NSDebugDescription=connection from pid 19420 on anonymousListener or serviceListener}}}}}
We observed that on some devices, loading the image will actually work after a short time (~30 sec), but on others it will always fail.
We think it is related to iCloud Photos: On the device that has iCloud Photos sync enabled, the picker is able to load the image right after it was synced to the cloud. On devices that don't sync the image, loading always fails. It seems that the sync process is doing some processing (?) of the image that will later enable the picker to load it successfully, but that's just guessing.
Additional observations:
This seems to only occur for images that were taken with the stock Camera app. When using Halide to capture RAW (either ProRAW or RAW), the Picker is able to load the image.
When trying to load the image as kUTTypeRawImage instead of kUTTypeImage, it also fails.
The picker also can't load RAW images that were AirDropped from another device, unless they synced to iCloud first.
This is reproducible using the Selecting Photos and Videos in iOS sample code project.
We observed this happening in other apps that use the PHPicker, not just ours.
Is this a bug, or is there something that we are missing?
When I used
BOOL compatible = UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([pathUrl path]);
if (compatible) {
    UISaveVideoAtPathToSavedPhotosAlbum([pathUrl path], self, @selector(savedPhotoImage:didFinishSavingWithError:contextInfo:), nil);
}
to save a video, but I get an error.
My phone has 33.72GB available and the video is only 4.2GB in size.
What should I do?
PS: I tried to use [PHPhotoLibrary sharedPhotoLibrary] performChanges to save, but I got the same error, error code -3305.
When capturing RAW (not ProRAW) photos using AVCapturePhotoOutput, the resulting images are subject to a strange overexposed effect when viewed in Apple software. I have been able to recreate this in multiple iOS apps which allow RAW capture.
Some users report previously normal images transforming over the span of a few minutes. I have actually watched this happen in real time: if you observe the camera roll after taking a few RAW photos, the highlights in some will randomly blow out of proportion after whatever is causing this issue kicks in.
The issue can also be triggered by zooming in to one of these images from the stock Photos app. Once the overexposure happens on a given image, there doesn't appear to be a way to get it to display normally again. However, if you AirDrop an image to a different device and then back, you can see it display normally at first and then break once more.
Interestingly, the photo displays completely fine when viewed in Affinity photo or random photo viewers on Ubuntu.
Sometimes the issue is not that bad, but it is often egregious, resulting in completely white areas of a previously balanced photo (see https://discussions.apple.com/thread/254424489).
This definitely seems like a bug, but is there any way to prevent it? Could there be an issue with color profiles?
This is not the same issue in which users think RAW photos are broken because they are viewing the associated JPG – this happens even with photos that have no embedded JPG or HEIC preview.
A very similar (supposedly fixed) issue on macOS:
https://www.macworld.com/article/1444389/overexposed-raw-image-export-macos-monterey-photos-fixed.html
Numerous similar complaints:
https://discussions.apple.com/thread/254424489
https://discussions.apple.com/thread/253179308
https://discussions.apple.com/thread/253773180
https://discussions.apple.com/thread/253954972
https://discussions.apple.com/thread/252354516
Bracketed photo capture with iPhone 14 Pro produces photos with oddly clipped highlights. Rather than pixel values clipping normally to white when too bright, there is a sudden and harsh jump from gray to white. See image below that shows a comparison of a normal photo capture (using qualityPriorization .speed) and a bracketed capture. Note the massive difference in clipping behavior.
Here's how I am configuring the bracketed capture:
let bracketedStillImageSettings = AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: 0.0)
let bracketSettings = AVCapturePhotoBracketSettings(rawPixelFormatType: 0 as OSType, processedFormat: [AVVideoCodecKey: AVVideoCodecType.jpeg], bracketedSettings: [bracketedStillImageSettings])
photoOutput.capturePhoto(with: bracketSettings, delegate: photoCaptureProcessor)
Things I've tried that don't make a difference:
Changing AVCapturePhotoBracketSettings.photoQualityPrioritization to values of .speed, .balanced, or .quality
Changing device type .builtInTripleCamera, .builtInUltraWideCamera, .builtInWideCamera, .builtInDualCamera, etc
Changing capture format from jpeg to hevc
Any ideas?
I use PHPicker for users to import photos, but UIImage does not support .AVIF pictures, so I want to get the original data of the .AVIF picture. This is my code:
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true)
    for image in results {
        loadImagePickerResult(image: image)
    }
}

func loadImagePickerResult(image: PHPickerResult) {
    if image.itemProvider.canLoadObject(ofClass: UIImage.self) {
        image.itemProvider.loadObject(ofClass: UIImage.self) { [weak self] newImage, error in
            guard let self = self else { return }
            if let _ = error {
            } else if let needAddImage = newImage as? UIImage {
                let imageItem = ContentImageItem()
                imageItem.image = needAddImage
                self.viewModel.selectedImageList.append(imageItem)
                DispatchQueue.main.async {
                    self.scrollImageView.reloadData()
                    self.checkConfirmState()
                }
            }
        }
    } else if image.itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) {
        image.itemProvider.loadItem(forTypeIdentifier: kUTTypeImage as String, options: nil) { [weak self] item, error in
            guard let self = self else { return }
            guard let url = item as? URL else { return }
            var imageData: Data?
            do {
                imageData = try Data(contentsOf: url, options: [.mappedIfSafe, .alwaysMapped])
            } catch {
            }
            guard let selectedImageData = imageData else { return }
            /// selectedImageData is empty data
        }
    } else {
    }
}
When I choose an .AVIF picture, the itemProvider can load the image with the "kUTTypeImage" type identifier, and I successfully get the local path of the picture. But when I use Data(contentsOf:) to read the original data, I only get empty data. So, is there any problem with this code? Does anyone have experience handling this?
"FileManager.default.contents(atPath: url.path)" and "NSData(contentsOf:)" also return empty Data.
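One thing worth trying (an assumption on my part, not a confirmed fix): read the bytes eagerly instead of with the memory-mapping options, since a mapped Data is backed by the file on disk and the provider's temporary file may not outlive the completion handler. NSItemProvider also offers loadDataRepresentation(forTypeIdentifier:), which hands you the Data directly. A minimal sketch of the eager read:

```swift
import Foundation

// Read the file's bytes up front, without .mappedIfSafe/.alwaysMapped, so the
// resulting Data does not depend on the backing file staying alive.
func readBytesEagerly(from url: URL) -> Data? {
    guard let data = try? Data(contentsOf: url), !data.isEmpty else { return nil }
    return data
}
```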
item.loadTransferable(type: Data.self) { result in
    switch result {
    case .success(let data):
        guard let data = data, let image = UIImage(data: data) else { break }
        imageView.image = image
    case .failure(let error):
        ...
    }
}
I load a RAW-format photo through the above code, but the displayed image is very blurry and messed up.
But not all RAW-format photos are like this. What is the reason?
I have been using photogrammetry for some time with one request, but I need to scale to multiple. For a session you must give it a folder path where the images live, and then you can add multiple requests on the same set of images.
My question is: if I have multiple folders with different images, can I batch them all together and run them at once? Right now I only see a way to point the session at Folder1's path.
Folder1 - Images
Folder2- Images
Folder3 - Images
etc
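As far as I can tell there is no built-in multi-folder batch API, so a common workaround is to enumerate the folders and run one session per folder sequentially (an assumption worth verifying). A Foundation-only sketch of the enumeration step (`imageFolders` is an illustrative helper, not a RealityKit API):

```swift
import Foundation

// Collect the subfolders (Folder1, Folder2, ...) under a root directory so
// that one PhotogrammetrySession per folder can be created and run in turn.
func imageFolders(under root: URL) throws -> [URL] {
    let fm = FileManager.default
    return try fm.contentsOfDirectory(atPath: root.path)
        .map { root.appendingPathComponent($0) }
        .filter { url in
            var isDir: ObjCBool = false
            return fm.fileExists(atPath: url.path, isDirectory: &isDir) && isDir.boolValue
        }
        .sorted { $0.lastPathComponent < $1.lastPathComponent }
}
```

Each returned URL could then serve as the input of its own PhotogrammetrySession, awaiting one session's outputs before starting the next.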
Thanks!
I have my callback registered, and it's called repeatedly without anything changing in the photo album. I'm getting the assets via
PHAssetCollection.fetchAssetCollections
In the callback, I update my PHFetchResult and PHCollection to the ones passed in, and I process the changeDetails,
yet the callback keeps getting called.
Any ideas?
My app is getting rejected from TestFlight because of the photo library usage description: per the rejection mail, I need to define the purpose for this permission, even though I am not using it. BTW, I am a Flutter developer and my app is built in Flutter.
I have removed every package that might be using this permission.
I have done flutter clean, pod update, and everything.
This is my Podfile.
platform :ios, '12.0'

# CocoaPods analytics sends network stats synchronously affecting flutter build latency.
ENV['COCOAPODS_DISABLE_STATS'] = 'true'

project 'Runner', {
  'Debug' => :debug,
  'Profile' => :release,
  'Release' => :release,
}

install! 'cocoapods', :disable_input_output_paths => true

def flutter_root
  generated_xcode_build_settings_path = File.expand_path(File.join('..', 'Flutter', 'Generated.xcconfig'), __FILE__)
  unless File.exist?(generated_xcode_build_settings_path)
    raise "#{generated_xcode_build_settings_path} must exist. If you're running pod install manually, make sure flutter pub get is executed first"
  end

  File.foreach(generated_xcode_build_settings_path) do |line|
    matches = line.match(/FLUTTER_ROOT\=(.*)/)
    return matches[1].strip if matches
  end
  raise "FLUTTER_ROOT not found in #{generated_xcode_build_settings_path}. Try deleting Generated.xcconfig, then run flutter pub get"
end

require File.expand_path(File.join('packages', 'flutter_tools', 'bin', 'podhelper'), flutter_root)

flutter_ios_podfile_setup

# target 'Runner' do
# '
#
# # ...
# end

target 'Runner' do
  use_frameworks!
  use_modular_headers!

  pod 'Firebase'
  pod 'GoogleUtilities'
  # pod 'cloud_firestore', :git=> 'https://github.com/invertase/firestore-ios-sdk-frameworks.git', :tag => '10.3.0'
  pod 'Firebase/Core', '>= 0'
  pod 'FirebaseCrashlytics', '>= 0'

  flutter_install_all_ios_pods File.dirname(File.realpath(__FILE__))
end

post_install do |installer|
  installer.generated_projects.each do |project|
    project.targets.each do |target|
      target.build_configurations.each do |config|
        config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = '13.0'
      end
    end
  end
  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)
  end
end

target 'ImageNotification' do
  use_frameworks!
  pod 'GoogleUtilities'
  pod 'Firebase/Messaging', '>=10.0.0'
end
This is my Info.plist file
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>BGTaskSchedulerPermittedIdentifiers</key>
    <array>
        <string>dev.flutter.background.refresh</string>
    </array>
    <key>CADisableMinimumFrameDurationOnPhone</key>
    <true/>
    <key>CFBundleDevelopmentRegion</key>
    <string>$(DEVELOPMENT_LANGUAGE)</string>
    <key>CFBundleDisplayName</key>
    <string>THE One</string>
    <key>CFBundleExecutable</key>
    <string>$(EXECUTABLE_NAME)</string>
    <key>CFBundleIdentifier</key>
    <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
    <key>CFBundleInfoDictionaryVersion</key>
    <string>6.0</string>
    <key>CFBundleName</key>
    <string>ios</string>
    <key>CFBundlePackageType</key>
    <string>APPL</string>
    <key>CFBundleShortVersionString</key>
    <string>$(MARKETING_VERSION)</string>
    <key>CFBundleSignature</key>
    <string>????</string>
    <key>CFBundleURLTypes</key>
    <array>
        <dict>
            <key>CFBundleURLSchemes</key>
            <array>
                <string>fb485115409812800</string>
            </array>
        </dict>
    </array>
    <key>CFBundleVersion</key>
    <string>$(CURRENT_PROJECT_VERSION)</string>
    <key>FacebookAdvertiserIDCollectionEnabled</key>
    <true/>
    <key>FacebookAppID</key>
    <string>485115409812800</string>
    <key>FacebookAutoLogAppEventsEnabled</key>
    <true/>
    <key>FacebookClientToken</key>
    <string>66dae4870d542eed4c6f948fc39ce9d9</string>
    <key>FacebookDisplayName</key>
    <string>THE One Mobile App Matrid</string>
    <key>FirebaseAppDelegateProxyEnable</key>
    <false/>
    <key>LSApplicationCategoryType</key>
    <string></string>
    <key>LSApplicationQueriesSchemes</key>
    <array>
        <string>https</string>
        <string>http</string>
    </array>
    <key>LSRequiresIPhoneOS</key>
    <true/>
    <key>NSAppTransportSecurity</key>
    <dict>
        <key>NSAllowsArbitraryLoads</key>
        <true/>
        <key>NSAllowsArbitraryLoadsInWebContent</key>
        <true/>
    </dict>
    <key>NSLocationAlwaysUsageDescription</key>
    <string>This app needs access to location for selecting the products from the nearest THE One store. This will help us to deliver personalized and best products and services to you. Your data will be safe and not shared with a third party. You can disable it from Settings.</string>
    <key>NSLocationWhenInUseUsageDescription</key>
    <string>This app needs access to location for selecting the nearest THE One store. This will help us to deliver personalized and best products and services to you. Your data will be safe and not shared with a third party. You can disable it from Settings</string>
    <key>NSMicrophoneUsageDescription</key>
    <string>This app does not required Microphone permission</string>
    <key>NSUserTrackingUsageDescription</key>
    <string>This will help us to deliver personalized and best products and services to you. Your data will be safe and not shared with a third party. You can disable it from Settings</string>
    <key>NSLocationAlwaysAndWhenInUseUsageDescription</key>
    <string>This app needs access to location for selecting the products from the nearest THE One store but this permission is optional, you can change it from app setting. This will help us to deliver personalized and best products and services to you. Your data will be safe and not shared with a third party. You can disable it from Settings.</string>
    <key>NSPhotoLibraryAddUsageDescription</key>
    <string>This app does not required Photo Library Permission</string>
    <key>UIViewControllerBasedStatusBarAppearance</key>
    <false/>
    <key>io.flutter.embedded_views_preview</key>
    <true/>
    <key>UIStatusBarHidden</key>
    <false/>
</dict>
</plist>
Some strings have been removed from this post due to the word limitation.
Any help appreciated.
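To track down which dependency injects the key, one generic approach (a sketch; run from the Flutter project root, where ios/ is the usual project layout) is to grep the iOS side of the project, including the installed Pods, for the permission key:

```shell
# Search the iOS side of the project (including installed Pods) for any
# reference to the photo-library permission key; each matching file is a
# candidate source of the injected usage description.
grep -rl "NSPhotoLibraryUsageDescription" ios/ || echo "no reference found"
```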
I understand that Apple fully implemented the ITU-R BT.2100 PQ and HLG transfer types for various image formats. I’m working on a project where I have a DNG file as input and I need to save the modified file as HEIF. Which transfer type should I choose? Does it matter?
I was trying to take an image of a car that includes the license plate. But when I checked later, the numbers on the license plate were not clear and readable. When I zoomed in, there were slanting lines in their place. Are there any settings available to avoid this?