PHLivePhotoEditingContext.saveLivePhoto results in AVFoundation error -11800 "The operation could not be completed" reason An unknown error occurred (-12815)
When trying to edit some Live Photos, calling PHLivePhotoEditingContext.saveLivePhoto results in the following error:

```
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12815), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x300d05380 {Error Domain=NSOSStatusErrorDomain Code=-12815 "(null)"}}
```

I was able to replicate it on my device by taking a new Live Photo. I'm not sure what's wrong with that one specifically; not all Live Photos reproduce the issue. I've submitted FB15880825 with a sysdiagnose and a Photos Diagnostics as well. Any ideas what's going on here? It's impacting multiple customers. Thanks!
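For context, the failing call looks roughly like this. A minimal sketch: contentEditingInput comes from requesting a content editing input for the asset, and adjustmentData stands in for my real adjustment data.

```swift
import Photos
import CoreImage

// Minimal sketch of the failing save. `contentEditingInput` and
// `adjustmentData` are stand-ins for the real values in my app.
let context = PHLivePhotoEditingContext(livePhotoEditingInput: contentEditingInput)!
context.frameProcessor = { frame, _ -> CIImage? in
    frame.image // pass every frame through unchanged
}
let output = PHContentEditingOutput(contentEditingInput: contentEditingInput)
output.adjustmentData = adjustmentData
context.saveLivePhoto(to: output) { success, error in
    // For the affected assets this completes with the -11800 / -12815 error above.
    print(success, error ?? "none")
}
```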
0 replies · 0 boosts · 22 views · 2h
How to change PHLivePhoto EXIF metadata
I have an app that allows the user to change a photo's EXIF metadata. To do this, I request a content editing input, get the full size image, modify its properties, create a content editing output, write the output image to the rendered content URL, then call performChanges on the PHPhotoLibrary, creating an asset change request for that asset and setting its content editing output. This works as expected for regular photos, but Live Photos get converted to regular photos.

To address this, I'm doing something similar by changing the properties of the .photo image in the Live Photo. I detect when the content editing input has a Live Photo, create a Live Photo editing context, set a frame processor that returns the frame's image after setting its properties to the updated properties when the frame type is photo, then create the content editing output and save the Live Photo to that output. It modifies the Live Photo successfully, but the metadata is not updated: if you get the full size image again, the properties are the original properties, and if you look at the EXIF metadata using an app like Metapho it remains unchanged. What am I doing wrong here? Thanks!

```swift
let imageURL = contentEditingInput.fullSizeImageURL!
let inputImage = CIImage(contentsOf: imageURL, options: [.applyOrientationProperty: true])!

var metadata: [AnyHashable: Any] = inputImage.properties
// Edit the metadata as desired...

let editingContext = PHLivePhotoEditingContext(livePhotoEditingInput: contentEditingInput)!
editingContext.frameProcessor = { frame, error -> CIImage? in
    // Edit only the still photo
    if frame.type == .photo {
        return frame.image.settingProperties(metadata)
    }
    return frame.image
}

let contentEditingOutput = try await withCheckedThrowingContinuation { continuation in
    let editingOutput = PHContentEditingOutput(contentEditingInput: contentEditingInput)
    editingOutput.adjustmentData = adjustmentData
    editingContext.saveLivePhoto(to: editingOutput) { success, error in
        if success {
            continuation.resume(returning: editingOutput)
        } else {
            continuation.resume(throwing: error!)
        }
    }
}

try await PHPhotoLibrary.shared().performChanges {
    let request = PHAssetChangeRequest(for: asset)
    request.contentEditingOutput = contentEditingOutput
}
```
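For what it's worth, this is how I'm verifying the result afterwards. A sketch, assuming the edited asset is in asset: re-requesting the content editing input still shows the original properties.

```swift
import Photos
import CoreImage

// Sketch of the verification step: re-request the editing input for the
// edited asset and inspect the still image's properties.
_ = asset.requestContentEditingInput(with: nil) { input, _ in
    guard let url = input?.fullSizeImageURL,
          let image = CIImage(contentsOf: url, options: [.applyOrientationProperty: true]) else { return }
    print(image.properties) // still prints the original, unedited metadata
}
```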
0 replies · 0 boosts · 69 views · 1d
How to edit gain map image to preserve HDR in Live Photo
I have an app that allows you to edit your photos. To preserve HDR, I edit both the SDR image and the gain map image, like so:

```swift
let sdrImage = CIImage(data: data, options: [.applyOrientationProperty: true])
let gainMapImage = CIImage(data: data, options: [.applyOrientationProperty: true, .auxiliaryHDRGainMap: true])

// edit them...

try CIContext().writeHEIFRepresentation(of: sdrImage, to: url, format: .RGBA8, colorSpace: colorSpace, options: [.hdrGainMapImage: gainMapImage])
```

I also support editing the still photo in Live Photos. To do this you create a PHLivePhotoEditingContext, set the frameProcessor block which gives you a CIImage that I edit when the frame.type is .photo, then you create a PHContentEditingOutput and call saveLivePhoto. I'm not seeing any way to preserve HDR here. Interestingly, the frame processor is called twice with the .photo frame type, but I don't see any difference between those images. How can I edit a gain map image to preserve HDR in the still photo of a Live Photo?
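For concreteness, the Live Photo path described above looks roughly like this. A sketch: contentEditingInput is the asset's editing input, and edit(_:) is a hypothetical stand-in for my SDR edit. As far as I can tell, the frame only exposes a single CIImage, with no gain map counterpart.

```swift
import Photos
import CoreImage

// Sketch of editing the still photo inside a Live Photo. `edit(_:)` is a
// hypothetical helper representing the same edit applied in the HEIF path.
let context = PHLivePhotoEditingContext(livePhotoEditingInput: contentEditingInput)!
context.frameProcessor = { frame, _ -> CIImage? in
    guard frame.type == .photo else { return frame.image }
    // Only one CIImage is available here; there's no obvious place to
    // supply an edited gain map image like .hdrGainMapImage above.
    return edit(frame.image)
}
```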
1 reply · 0 boosts · 204 views · 3w
writeImageAtIndex:1012: ⭕️ ERROR: 'App' is trying to save an opaque image (5712x4284) with 'AlphaLast'.
I have an app that edits photos in your library. When I call

```swift
try CIContext().writeHEIFRepresentation(of: editedImage, to: fileURL, format: .RGBA8, colorSpace: originalImage.colorSpace!)
```

the following is logged to the console:

```
writeImageAtIndex:1012: ⭕️ ERROR: 'App' is trying to save an opaque image (5712x4284) with 'AlphaLast'. This would unnecessarily increase the file size and will double (!!!) the required memory when decoding the image --> ignoring alpha.
```

What does that mean and how can I resolve it?

Xcode Version 16.0 (16A242d)
iOS 18.1 (22B82)
3 replies · 6 boosts · 423 views · 3w
Get View Full HDR state from Settings > Photos to properly set preferredImageDynamicRange in editing extension
I'm updating my Photo Editing Extension to support HDR. To do this I set imageView.preferredImageDynamicRange = .high. But you can turn off the option to view HDR photos in their complete dynamic range in Settings > Photos (View Full HDR). When you do that, open a photo, and tap the edit button, the photo does not appear in the full range, as expected; but when you select my app from More > Extensions, it unexpectedly does appear in the complete dynamic range. I need to set imageView.preferredImageDynamicRange = .standard when View Full HDR is off, but I don't see any way to read that setting's state in my PHContentEditingController.
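To illustrate, this is where the missing value would be used. A sketch: systemViewFullHDREnabled() is a hypothetical stand-in for the Settings > Photos state I can't find an API for, and imageView is the extension's image view.

```swift
import UIKit
import PhotosUI

// Sketch from my PHContentEditingController. `systemViewFullHDREnabled()` is
// hypothetical; it represents the Settings > Photos "View Full HDR" state.
func startContentEditing(with contentEditingInput: PHContentEditingInput, placeholderImage: UIImage) {
    imageView.image = placeholderImage
    imageView.preferredImageDynamicRange = systemViewFullHDREnabled() ? .high : .standard
}
```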
1 reply · 0 boosts · 325 views · Oct ’24
What format for writeHEIFRepresentation preserves HDR?
In the WWDC 24 session "Use HDR for dynamic image experiences in your app" it's noted this is how you save edits for Adaptive HDR:

```swift
// SDR + HDR:
writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrImage: hdrImage])

// SDR + Gain:
writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrGainMapImage: gainImage])
```

This won't compile because the format argument is missing. What format should be used? In the WWDC 23 session "Support HDR images in your app" RGBAf, RGBAh, RGBA16, and RGB10 were mentioned, but I'm not sure which one to use. If relevant, I'm editing photos from the user's photo library, so the image was probably taken on iPhone, but perhaps not. Thanks!
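For comparison, the gain map call compiles once a format is passed, as in my other posts on this page. A sketch only: whether .RGBA8 is the right choice for the SDR base image is exactly my question.

```swift
import CoreImage

// Compiles once a format is supplied; .RGBA8 mirrors my gain map post above
// and is not a confirmed recommendation.
try CIContext().writeHEIFRepresentation(of: sdrImage,
                                        to: url,
                                        format: .RGBA8,
                                        colorSpace: p3Space,
                                        options: [.hdrGainMapImage: gainImage])
```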
1 reply · 0 boosts · 267 views · Oct ’24
Test Swift Package that vends XCFramework and has dependencies in example app before distribution
I've created a closed source iOS SDK from a local Swift package, which has dependencies on other Swift packages, and successfully created a binary XCFramework following the solution from my previous post. I would now like to create a Package.swift to vend this XCFramework and test it in an example app to verify it works as expected before I upload it to a public repo for distribution. I understand that binaryTarget does not support dependencies, so we need to use a wrapper. I created a directory containing the following:

```
Package.swift
MyFramework.xcframework/
MyFrameworkWrapper/
├─ dummy.swift
```

Package.swift contains:

```swift
// swift-tools-version: 5.10
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "MyFramework",
    platforms: [
        .iOS(.v14)
    ],
    products: [
        .library(
            name: "MyFramework",
            targets: ["MyFramework", "MyFrameworkWrapper"]
        )
    ],
    dependencies: [
        .package(url: "https://github.com/gordontucker/FittedSheets.git", from: "2.6.1")
    ],
    targets: [
        .target(
            name: "MyFrameworkWrapper",
            dependencies: [
                "FittedSheets"
            ],
            path: "MyFrameworkWrapper"
        ),
        .binaryTarget(
            name: "MyFramework",
            path: "MyFramework.xcframework"
        )
    ]
)
```

I created a new iOS app, selected the project, Package Dependencies > + > Add Local, and added the directory containing this Package.swift. Xcode resolves the dependencies and lists them in the sidebar. I added code to import and use the framework. It builds successfully, but the app crashes when run:

```
dyld[63959]: Library not loaded: @rpath/FittedSheets.framework/FittedSheets
Referenced from: <7DE247FC-DAFF-3946-AD21-E80F5AF841C9> /Users/Jordan/Library/Developer/Xcode/DerivedData/MyFramework-Example-gaeeymnqzenzrbbmhuebpodqctsz/Build/Products/Debug-iphonesimulator/MyFramework.framework/MyFramework
```

How do I get this working? I'm wondering: is my package set up properly to vend the framework and specify its dependencies, and is my XCFramework created correctly? The Package.swift for the framework's source code contains:

```swift
// swift-tools-version: 5.10
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "MyFramework",
    platforms: [
        .iOS(.v14)
    ],
    products: [
        .library(
            name: "MyFramework",
            type: .dynamic,
            targets: ["MyFramework"]
        )
    ],
    dependencies: [
        .package(url: "https://github.com/gordontucker/FittedSheets.git", from: "2.6.1")
    ],
    targets: [
        .target(
            name: "MyFramework",
            dependencies: [
                "FittedSheets"
            ],
            path: "Sources"
        )
    ]
)
```

And I created the XCFramework following the steps in that previous thread:

1. Create an archive from the package via:
```
xcodebuild archive -workspace "$PACKAGE_PATH" -scheme "$FRAMEWORK_NAME" -destination 'generic/platform=iOS' -archivePath "$ARCHIVE_PATH/iOS" SKIP_INSTALL=NO BUILD_LIBRARY_FOR_DISTRIBUTION=YES ENABLE_USER_SCRIPT_SANDBOXING=NO ENABLE_MODULE_VERIFIER=NO OTHER_SWIFT_FLAGS=-no-verify-emitted-module-interface
```
2. Create the Modules directory in the framework via:
```
mkdir -p "$ARCHIVE_PATH/iOS.xcarchive/Products/usr/local/lib/$FRAMEWORK_NAME.framework/Modules"
```
3. Copy the Swift interface files into the framework from the build in DerivedData via:
```
cp -a "$BUILD_PRODUCTS_PATH/Build/Intermediates.noindex/ArchiveIntermediates/$FRAMEWORK_NAME/BuildProductsPath/Release-iphoneos/$FRAMEWORK_NAME.swiftmodule" "$ARCHIVE_PATH/iOS.xcarchive/Products/usr/local/lib/$FRAMEWORK_NAME.framework/Modules"
```
4. Repeat 1–3 for iOS Simulator.
5. Create an XCFramework via:
```
xcodebuild -create-xcframework -framework "$ARCHIVE_PATH/iOS.xcarchive/Products/usr/local/lib/$FRAMEWORK_NAME.framework" -framework "$ARCHIVE_PATH/iOS_Simulator.xcarchive/Products/usr/local/lib/$FRAMEWORK_NAME.framework" -output "$ARCHIVE_PATH/$FRAMEWORK_NAME.xcframework"
```
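In case it's relevant, the wrapper target needs at least one source file to build, so dummy.swift is essentially empty. Something like this:

```swift
// dummy.swift
// Placeholder so the MyFrameworkWrapper target has a source file; its only
// purpose is to make SPM build the target and link FittedSheets.
```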
0 replies · 1 boost · 480 views · Jun ’24
Distribute XCFramework that has dependencies on Swift Packages with Example project
I've created a closed source iOS SDK from a local Swift package, which has dependencies on other Swift packages, and successfully created a binary XCFramework following the solution from my previous post. Now I'm proceeding with the process to distribute this SDK. I believe I want to upload the XCFramework to a public repo alongside a Package.swift file and an Example app project that uses the XCFramework. So each time I go to create a new release I'll create a new XCFramework replacing the current one, verify it's working properly in the example app, then commit, tag, and push to the repo.

My question is: how do I set this up as a Swift package that includes an example app that uses the local XCFramework (not a remote URL to a zip of the framework) and properly resolves dependencies? So far I created a directory containing MyFramework.xcframework and Package.swift containing:

```swift
// swift-tools-version: 5.10
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "MyFramework",
    platforms: [
        .iOS(.v14)
    ],
    products: [
        .library(
            name: "MyFramework",
            targets: ["MyFramework"]
        )
    ],
    dependencies: [
        .package(url: "https://github.com/example/example.git", from: "1.0.0")
    ],
    targets: [
        .binaryTarget(
            name: "MyFramework",
            path: "MyFramework.xcframework"
        )
    ]
)
```

I then created an Example iOS app project in that directory and now need to integrate the local XCFramework. I wondered if I could do that via File > Add Package Dependencies > Add Local, but when I navigate to that Package.swift and click Add Package it says "The selected package cannot be a direct ancestor of the project."

Do I need a different Package.swift for the Example app, and if so, how do I get that set up? I created a separate Package.swift (contents below) alongside the xcodeproj, but when I try to add that in Xcode I get the same error, despite the fact this package is a sibling of the project, not an ancestor.

```swift
// swift-tools-version: 5.10
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "MyFramework-Example",
    platforms: [
        .iOS(.v14)
    ],
    dependencies: [
        .package(name: "MyFramework", path: "../")
    ],
    targets: [
        .target(
            name: "MyFramework-Example",
            dependencies: ["MyFramework"]
        )
    ]
)
```
1 reply · 0 boosts · 654 views · Jun ’24
Build XCFramework from source that has dependencies on Swift Packages
I'm looking into building a closed source XCFramework from a local Swift package that has dependencies on other packages, which can later be distributed via Swift Package Manager. In initial discussions, we thought xcodebuild does not support linking the dependencies externally; it always includes them statically in the built framework. It's my understanding this is because we're asking xcodebuild to build a framework from a local Swift package. Is there another way this can be achieved?

To explain in more detail: I have built a closed source SDK for other developers to integrate in their apps, currently distributed as an XCFramework. The interesting thing about the SDK is it has dependencies on other libraries, which need to be resolved when adding this SDK as a dependency to an app. The SDK's dependencies should not be baked into our XCFramework. CocoaPods has worked well for that, but we want to instead use SPM.

The current project setup is an iOS framework Xcode project and an app Xcode workspace. The framework project is included in the app workspace and is in the same repo as the app, which allows me to modify the framework source code then run the app to test it. The framework project can also be opened independently and built to verify it doesn't have any errors, but to verify it's working I run it with the app. To distribute a new release I use xcodebuild to create an XCFramework and then deploy that. For this to work with CocoaPods I had to add a Podfile to the app directory as well as the framework directory so both have the dependencies available. This means I have an xcworkspace for the framework and not just an xcodeproj. I specify the framework workspace file in the xcodebuild command.

To switch to a setup that utilizes Swift Package Manager, I created a Package.swift in the iOS framework project's directory that specifies its dependencies, removed CocoaPods integration including deleting the workspace file, removed the framework project from the app's workspace, added the package as a local package to the app project, and added the framework directory via + > Add Files to "App", which adds the package to the top of the sidebar, making its source code available to edit within the app workspace. Everything is working when I run the app. Xcode properly resolves the dependencies for the local package and I can run the app to develop it.

Now, to create an XCFramework I run the following command in the framework directory (which contains the Package.swift):

```
xcodebuild archive -workspace . -scheme FrameworkName -configuration Release -destination 'generic/platform=iOS' -archivePath './build/FrameworkName.framework-iphoneos.xcarchive' SKIP_INSTALL=NO BUILD_LIBRARIES_FOR_DISTRIBUTION=YES ENABLE_USER_SCRIPT_SANDBOXING=NO
```

This succeeds, however the dependencies have been linked statically and are thus included in our SDK. We need to include only the code from our framework and link to external dependencies, like our current CocoaPods setup does. I'm wondering what options there are to achieve this, even if I need to change the project setup locally, for example to continue using a framework project/workspace instead of a local Swift package. It seems I just need xcodebuild to be able to create an XCFramework which can then be distributed with its own Package.swift file that specifies its dependencies. If it's not possible to link the dependencies externally, could you help me understand the implications of including them statically? I don't know what problems could arise as a result of that or what other concerns this would bring. Thanks!
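For reference, the local package's manifest is set up roughly like this. A sketch with placeholder names and a placeholder dependency URL:

```swift
// swift-tools-version: 5.10
// Sketch of the framework's manifest; names and URLs are placeholders.
import PackageDescription

let package = Package(
    name: "FrameworkName",
    platforms: [.iOS(.v14)],
    products: [
        .library(name: "FrameworkName", targets: ["FrameworkName"])
    ],
    dependencies: [
        // The dependencies that should be linked externally, not baked in.
        .package(url: "https://github.com/example/dependency.git", from: "1.0.0")
    ],
    targets: [
        .target(name: "FrameworkName", dependencies: ["Dependency"])
    ]
)
```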
1 reply · 1 boost · 1.3k views · Jun ’24
Is it required to update photo metadata to set 1 for Orientation?
I've had an app that edits photos in your library since the PhotoKit API was released in iOS 8. I know it used to be required that, if you preserved photo metadata, you had to change the value of Orientation to 1 (up), otherwise PhotoKit would fail to perform the asset change request. When I remove this code, I'm seeing Orientation get changed to 1 automatically, both at the root and in the TIFF dictionary (tested with iOS 18). I wanted to confirm this is expected behavior: does the system do this for us now? If so, can I remove this code for iOS 15+, or was it a recent iOS version where this started happening? Thanks!
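For reference, this is the sort of code I'm asking about removing. A sketch of the normalization using the ImageIO keys; metadata is the properties dictionary from the input image.

```swift
import ImageIO

// Sketch: force Orientation to 1 (up) at the root and in the TIFF dictionary
// before writing the edited image and its metadata.
var properties = metadata // [String: Any], e.g. from CIImage.properties
properties[kCGImagePropertyOrientation as String] = 1
var tiff = properties[kCGImagePropertyTIFFDictionary as String] as? [String: Any] ?? [:]
tiff[kCGImagePropertyTIFFOrientation as String] = 1
properties[kCGImagePropertyTIFFDictionary as String] = tiff
```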
0 replies · 0 boosts · 407 views · Jun ’24
Sort user library assets by date captured instead of recently added
Is it possible to sort the user library assets by date captured? The Photos app in iOS 18 lets you choose between Date Captured and Recently Added, and I want to offer that same choice in my app. This seems to always sort them by creation date (which I believe is the same as Recently Added):

```swift
let assetCollection = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil).firstObject!
let fetchResult = PHAsset.fetchAssets(in: assetCollection, options: PHFetchOptions.imageMediaType())
```
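For comparison, sorting explicitly on creationDate gives the same ordering as the fetch above. A sketch: I haven't found a public fetch key corresponding to the Photos app's Date Captured option.

```swift
import Photos

// Sketch: an explicit sort on creationDate, which appears to match Recently Added.
let options = PHFetchOptions()
options.predicate = NSPredicate(format: "mediaType == %d", PHAssetMediaType.image.rawValue)
options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
let sorted = PHAsset.fetchAssets(in: assetCollection, options: options)
```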
0 replies · 1 boost · 421 views · Jun ’24
Get the creation date of an IntentFile
Is it possible to get the original date created for an IntentFile? The following code always gets the date for right now, surely because the file is copied into a temporary directory, so that's when it was created at that location.

```swift
if let fileURL = file.fileURL, fileURL.startAccessingSecurityScopedResource() {
    if let attributes = try? FileManager.default.attributesOfItem(atPath: fileURL.path),
       let date = attributes[.creationDate] as? Date {
        print(date)
    }
    fileURL.stopAccessingSecurityScopedResource()
}
```
0 replies · 0 boosts · 476 views · Feb ’24
How to configure visionOS app icon in existing iOS project
I have an iOS app and I added Vision Pro as a supported destination. I'm ready to add an app icon. When I select my existing AppIcon, there's no option to add visionOS assets to it, so I went ahead and created a new visionOS App Icon titled VisionAppIcon. Now how do I configure the project to use VisionAppIcon for visionOS while continuing to use AppIcon for iOS? When I select the target and go to Build Settings, there's Primary App Icon Set Name, currently set to AppIcon. When I run the visionOS app, no app icon appears. If I change that setting to VisionAppIcon then the icon appears, of course, but I don't see a way to add variants for it other than Debug and Release.
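One approach I considered sketching out: build settings can be conditioned per SDK, so an xcconfig along these lines might work. The xros SDK identifier is my assumption, and I'm assuming ASSETCATALOG_COMPILER_APPICON_NAME is the setting Xcode displays as Primary App Icon Set Name.

```
// Sketch of an xcconfig; assumes xros is the visionOS SDK identifier and that
// ASSETCATALOG_COMPILER_APPICON_NAME is the Primary App Icon Set Name setting.
ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon
ASSETCATALOG_COMPILER_APPICON_NAME[sdk=xros*] = VisionAppIcon
```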
2 replies · 0 boosts · 1.7k views · Jan ’24