Posts

Post not yet marked as solved
1 Reply
1.3k Views
Is it OK to call requestContentEditingInput for a lot of PHAssets just to get URLs for their full-size images? It seems odd because I would not be using the content editing input to actually modify these images. Is that OK, or are there implications to be aware of?

Use case: I want to allow the user to share multiple PHAssets via UIActivityViewController. I can download and share an array of UIImage, which works, but I found that if you tap Copy the app freezes for about 1 second per photo (10 seconds if you shared 10 photos). Profiling the app, it looks like iOS spends that time creating a PNG for each image. It's also probably not a good idea to keep that many huge images in memory. So I figured I'd try sharing an array of URLs to the images instead. Seemingly the only way to get a URL for a photo is to request a content editing input for the asset and access its fullSizeImageURL property. Is this a good idea, and is this the right approach to sharing PHAssets?
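For context, the shape of what I'd be doing is roughly the sketch below; the helper name and the serial queue are just mine for illustration, not anything from the Photos API:

import Photos

// Sketch: gather full-size image URLs for a batch of assets.
func fetchFullSizeImageURLs(for assets: [PHAsset], completion: @escaping ([URL]) -> Void) {
    let options = PHContentEditingInputRequestOptions()
    options.isNetworkAccessAllowed = true // allow downloading originals from iCloud

    var urls: [URL] = []
    let group = DispatchGroup()
    let urlQueue = DispatchQueue(label: "full-size-urls") // serialize access to `urls`

    for asset in assets {
        group.enter()
        asset.requestContentEditingInput(with: options) { input, _ in
            urlQueue.async {
                if let url = input?.fullSizeImageURL {
                    urls.append(url)
                }
                group.leave()
            }
        }
    }

    group.notify(queue: urlQueue) {
        DispatchQueue.main.async { completion(urls) }
    }
}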
Posted by Jordan. Last updated.
Post not yet marked as solved
3 Replies
2.6k Views
I am trying to figure out how to programmatically install a per-user launchd agent. I have an executable Swift script I wrote, and I need macOS to ensure it is always running. I found the SMJobBless sample code, which I could play with to see how this works, but it hasn't been updated since it was last built with Xcode 4.6. As you can imagine, it doesn't compile in Xcode 10. I was able to get it to build by upgrading to the recommended project settings, increasing the deployment target, and selecting my team for the two targets.

Following the ReadMe, I need to run ./SMJobBlessUtil.py setreq to configure the Info.plists appropriately. Those instructions are out of date, but eskimo was kind enough to provide updated instructions here for finding the .app URL. But when I do that and run the command I receive the following output:

MacBook:SMJobBless Jordan$ ./SMJobBlessUtil.py setreq /Users/Jordan/Library/Developer/Xcode/DerivedData/SMJobBless-dffakkidazmiowcishyrborysygm/Build/Products/Debug/SMJobBlessApp.app SMJobBlessApp/SMJobBlessApp-Info.plist SMJobBlessHelper/SMJobBlessHelper-Info.plist
Traceback (most recent call last):
  File "./SMJobBlessUtil.py", line 424, in <module>
    main()
  File "./SMJobBlessUtil.py", line 418, in main
    setreq(appArgs[1], appArgs[2], appArgs[3:])
  File "./SMJobBlessUtil.py", line 360, in setreq
    appToolDict[bundleID] = toolNameToReqMap[bundleID]
KeyError: '$(PRODUCT_BUNDLE_IDENTIFIER)'

It would seem this Python script isn't able to work with the newer project structure, not surprisingly. I wasn't able to find any other information on how to accomplish this task these days, so could you please explain how to go about it? 🙂

I have an executable .swift file and a .plist that works when loaded from ~/Library/LaunchAgents/, ready to be added to an existing Xcode project. Thanks!
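For reference, the only do-it-yourself approach I've come up with is something like the sketch below - copy the plist into ~/Library/LaunchAgents and load it with launchctl (the label and file names are placeholders I made up). I assume there's a more sanctioned route, hence the question:

import Foundation

// Sketch: install and load a per-user launchd agent by hand.
// "com.example.myagent" is a placeholder label/file name.
func installLaunchAgent() throws {
    let fileManager = FileManager.default
    let agentsDirectory = fileManager.homeDirectoryForCurrentUser
        .appendingPathComponent("Library/LaunchAgents", isDirectory: true)
    try fileManager.createDirectory(at: agentsDirectory, withIntermediateDirectories: true)

    // Copy the agent's property list out of the app bundle.
    guard let bundledPlist = Bundle.main.url(forResource: "com.example.myagent", withExtension: "plist") else {
        throw CocoaError(.fileNoSuchFile)
    }
    let installedPlist = agentsDirectory.appendingPathComponent("com.example.myagent.plist")
    if fileManager.fileExists(atPath: installedPlist.path) {
        try fileManager.removeItem(at: installedPlist)
    }
    try fileManager.copyItem(at: bundledPlist, to: installedPlist)

    // Ask launchd to load the agent for the current user.
    let launchctl = Process()
    launchctl.executableURL = URL(fileURLWithPath: "/bin/launchctl")
    launchctl.arguments = ["load", installedPlist.path]
    try launchctl.run()
    launchctl.waitUntilExit()
}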
Posted by Jordan. Last updated.
Post not yet marked as solved
2 Replies
2.1k Views
It was discussed in the What's New in Photos APIs session at WWDC that we should avoid using NSPredicate for custom fetch options if at all possible, for performance reasons. In my app, I'm using NSPredicate to fetch only images from the user's library. I'm not seeing an API that would let me get assets from a specific collection filtered to just images without NSPredicate, though. Is there a more efficient way to perform this query that I'm missing?

let photoLibraryFetchResult = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)
let assetCollection = photoLibraryFetchResult.firstObject!
let fetchOptions = PHFetchOptions()
fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
let fetchResults = PHAsset.fetchAssets(in: assetCollection, options: fetchOptions)
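For comparison, the only predicate-free variant I can find fetches by media type across the whole library, not within a specific collection:

import Photos

// Fetches every image asset in the library without a predicate,
// but (as far as I can tell) can't be scoped to a single collection.
let allImages = PHAsset.fetchAssets(with: .image, options: nil)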
Posted by Jordan. Last updated.
Post not yet marked as solved
2 Replies
1.4k Views
Is it possible for an app using NSPersistentCloudKitContainer to sync in the background, and if so, how?

1. Install the app on your iPhone and iPad and create some data; it automatically syncs to both and life is good.
2. Close the iPad app.
3. Modify the data on the iPhone.

Desired behavior: the backgrounded iPad app should sync (even if it takes a while) and be informed that its local database has finished syncing, or similarly that changes were made.

The use case is that I want to reload my widget when data changes so it's up to date, so I need the app to sync in the background and then notify me when it's complete so I can trigger the widget reload. I'm concerned it will be a poor widget experience if it always shows stale data until the user manually opens the app to initiate a sync - that kind of defeats the purpose of widgets. ha

According to this post, they found sync is never run in the background. Is this not the case, or has it changed in iOS 15? Thanks!
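For context, here's how I plan to detect a finished import and reload the widget while the app is running - the open question is whether anything like this can fire while the app is backgrounded. A sketch, assuming iOS 14+, WidgetKit, and that `container` is my NSPersistentCloudKitContainer:

import CoreData
import WidgetKit

// Reload widget timelines whenever a CloudKit import event finishes.
// Keep a reference to `observer` for as long as the updates are needed.
let observer = NotificationCenter.default.addObserver(
    forName: NSPersistentCloudKitContainer.eventChangedNotification,
    object: container,
    queue: .main
) { notification in
    guard let event = notification.userInfo?[NSPersistentCloudKitContainer.eventNotificationUserInfoKey]
            as? NSPersistentCloudKitContainer.Event else { return }

    // endDate is non-nil once the event has completed.
    if event.type == .import, event.endDate != nil {
        WidgetCenter.shared.reloadAllTimelines()
    }
}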
Posted by Jordan. Last updated.
Post marked as solved
1 Reply
2.1k Views
I have a UIViewController that initially does not display any scrollable content, but later on I add a child view controller that does scroll - a UIHostingController whose rootView is a GeometryReader containing a ScrollView. The problem is that when you scroll, the UINavigationBar remains transparent - I'm sure because it couldn't find a scroll view in the view hierarchy. There is an API to specify which scroll view to use, but it takes a UIScrollView. How can I tell it about my SwiftUI scroll view?

viewController.setContentScrollView(scrollView, for: .bottom)
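The only workaround I can think of is digging the underlying UIScrollView out of the hosting controller's view hierarchy and handing that over - a sketch (inside the parent view controller; `hostingController` is the child I added), and obviously fragile since it pokes at SwiftUI's private view structure:

import UIKit

// Recursively search a view hierarchy for the first UIScrollView.
func firstScrollView(in view: UIView) -> UIScrollView? {
    if let scrollView = view as? UIScrollView { return scrollView }
    for subview in view.subviews {
        if let found = firstScrollView(in: subview) { return found }
    }
    return nil
}

// After adding the hosting controller as a child:
if let scrollView = firstScrollView(in: hostingController.view) {
    setContentScrollView(scrollView, for: .bottom)
}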
Posted by Jordan. Last updated.
Post marked as solved
1 Reply
1k Views
I have shipped an app that uses Core Data with CloudKit via NSPersistentCloudKitContainer. I now want to add a widget that can query the current data to display. It's my understanding that you need to migrate the store to a new location available to a shared App Group. How do you do this?

container = NSPersistentCloudKitContainer(name: "AppName")
container.loadPersistentStores { description, error in
    // handle error
}
container.viewContext.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
container.viewContext.automaticallyMergesChangesFromParent = true
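For reference, my rough understanding of the destination setup is below - pointing the store description at the App Group container. The group and CloudKit identifiers are placeholders, and moving the existing store file into that location is the part I'm unsure about:

import CoreData

// Sketch: place the store inside the shared App Group container.
let container = NSPersistentCloudKitContainer(name: "AppName")

let storeURL = FileManager.default
    .containerURL(forSecurityApplicationGroupIdentifier: "group.com.example.appname")!
    .appendingPathComponent("AppName.sqlite")

let description = NSPersistentStoreDescription(url: storeURL)
description.cloudKitContainerOptions = NSPersistentCloudKitContainerOptions(
    containerIdentifier: "iCloud.com.example.appname"
)
container.persistentStoreDescriptions = [description]

container.loadPersistentStores { _, error in
    // The existing store would presumably need to be migrated from its
    // old location before switching to this URL.
    if let error = error { fatalError("Store failed to load: \(error)") }
}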
Posted by Jordan. Last updated.
Post not yet marked as solved
2 Replies
1.1k Views
The Music macOS app shows various info about a song in the Get Info window. Most of this metadata is available in the iOS SDK via MPMediaItem. I want to access the information displayed in the File tab, but I'm not seeing several pieces of data in the API. Is this possible?

□ Kind - Apple Music AAC audio file - ?
☑︎ Duration - 3:00 - playbackDuration
□ Size - 10 MB - ?
□ Bit rate - 256 kbps - ?
□ Sample rate - 44.100 kHz - ?
□ Date modified - 1/1/2001 - ?
☑︎ Date added - 1/1/2001 - dateAdded
□ Cloud status - Apple Music - ?
☑︎ Location - Cloud - isCloudItem
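For reference, the checked rows above map to MPMediaItem properties like this; it's the unchecked rows I can't find any API for (sketch):

import MediaPlayer

// The File-tab fields I can read today from an MPMediaItem.
func logKnownFileInfo(for item: MPMediaItem) {
    print("Duration:", item.playbackDuration)   // seconds
    print("Date added:", item.dateAdded)
    print("Location is cloud:", item.isCloudItem)
    // Kind, size, bit rate, sample rate, date modified, cloud status: no obvious API
}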
Posted by Jordan. Last updated.
Post not yet marked as solved
3 Replies
1.5k Views
Given an Apple Music trackId, is it possible to query the user’s media library to see whether they’ve added that song to their library? Something like:

let predicate = MPMediaPropertyPredicate(value: "1440818675", forProperty: MPMediaItemPropertyPersistentID)
let query = MPMediaQuery(filterPredicates: Set([predicate]))
let songs = query.items ?? []
let isInLibrary = !songs.isEmpty
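The closest thing I've found myself is fetching songs and comparing playbackStoreID, which seems heavy-handed, and I'm not even sure that property can be used in a filter predicate - a sketch:

import MediaPlayer

// Brute-force check whether any library song matches an Apple Music track ID.
func libraryContainsTrack(withStoreID storeID: String) -> Bool {
    let songs = MPMediaQuery.songs().items ?? []
    return songs.contains { $0.playbackStoreID == storeID }
}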
Posted by Jordan. Last updated.
Post not yet marked as solved
0 Replies
623 Views
I'd like to get a song's bit rate - for example, 256 kbps - from an MPMediaItem retrieved via MPMediaPickerController. Is this possible? I tried to get it via:

AVAsset(url: mediaItem.assetURL!).tracks.first?.estimatedDataRate

but this is 0 for most songs I've tried, and it's 127999 for a song that's really 64 kbps. I can get the sample rate of 44100 via:

let trackDescription = AVAsset(url: url).tracks.first?.formatDescriptions.first
let basicDescription = CMAudioFormatDescriptionGetStreamBasicDescription(trackDescription as! CMAudioFormatDescription)?.pointee
let sampleRate = basicDescription?.mSampleRate

Supposedly one can calculate the bit rate given the sample rate, bit depth, and channel count, but I'm seeing mBitsPerChannel is always 0 in my testing.
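For what it's worth, the calculation I was referring to only really applies to uncompressed (PCM) audio, which may be why mBitsPerChannel comes back as 0 for these tracks:

// Bit rate for uncompressed (PCM) audio:
// bit rate = sample rate × bits per channel × channel count
let sampleRate = 44_100.0    // Hz
let bitsPerChannel = 16.0
let channels = 2.0
let bitRate = sampleRate * bitsPerChannel * channels   // 1,411,200 bps ≈ 1,411 kbps
// For compressed formats like AAC, mBitsPerChannel is typically 0,
// so this formula can't recover the encoded bit rate (e.g. 256 kbps).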
Posted by Jordan. Last updated.
Post not yet marked as solved
4 Replies
5.6k Views
I'm developing an app that allows users to select a font and type text on top of images, similar to Skitch and others. To let users change the font, I have an array of font names which I display in table view cells. I provide a font preview simply by using UIFont(name:size:) with the name, for example "Helvetica Neue".

Now with iOS 9 I want to add support for San Francisco. The problem is, initializing a UIFont with "San Francisco" returns nil. In fact, it doesn't look like San Francisco is even included when printing out all available fonts.

I could check whether the app is running on iOS 9 and, if so, include "San Francisco" in the array and use UIFont.systemFontOfSize when San Francisco is selected, but that's a really poor and dirty solution. How can I do this more appropriately?
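The least-dirty structure I've come up with so far is to model the system font as a nil entry in the list and fall back to the system font API, which still feels like a workaround - a sketch (the font names here are just examples):

import UIKit

// nil stands in for the system font (San Francisco on iOS 9 and later).
let fontNames: [String?] = [nil, "HelveticaNeue", "Georgia", "AvenirNext-Regular"]

func font(for name: String?, size: CGFloat) -> UIFont {
    if let name = name, let font = UIFont(name: name, size: size) {
        return font
    }
    return UIFont.systemFont(ofSize: size)
}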
Posted by Jordan. Last updated.
Post not yet marked as solved
0 Replies
707 Views
Hello! The docs for paymentQueue(_:didRevokeEntitlementsForProductIdentifiers:) state:

The system calls this delegate method whenever App Store revokes in-app purchases for a family member based on changes in Family Sharing ... StoreKit also calls this method when a purchaser receives a refund for a non-consumable or an auto-renewable subscription. Only the family members' devices receive this message.

That last sentence is a bit confusing to me. If I'm reading it correctly, this means the method will not be called on the purchaser's devices, but it will be called on the devices of the other family members. Is that true? I don’t understand why this would be called for family members but not the person who purchased it.

Similarly, if Family Sharing is not enabled and someone purchases a non-consumable IAP, then requests a refund and Apple grants it, it sounds like this method is not called. So it seems it is not possible to detect when a customer has been refunded and revoke their access to the product (unless you have a server listening for refund notifications). Is that correct?

In the wwdc20-10659 session, the presenter refunds a non-consumable purchase in the transaction manager and says "the transaction also remains in the app's receipt but is updated to contain a cancelation date for when the refund occurred." I believe this is only the case for subscriptions, not non-consumables, right? It should be removed from the receipt for non-consumables. He continues: "Again my app is informed about the refund and is able to respond immediately." This seems to contradict my understanding from the docs noted above. And in my testing, paymentQueue(_:didRevokeEntitlementsForProductIdentifiers:) is not called when I refund the purchase in the transaction manager. Is that expected behavior?

It seems to me this method should be called in every refund scenario so you can revoke access, so if you can clarify this I'd really appreciate it. Thank you!
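For reference, here is roughly where I'd expect to handle it in my transaction observer (a sketch; revokeAccess(to:) is a hypothetical helper of mine):

import StoreKit

class StoreObserver: NSObject, SKPaymentTransactionObserver {
    func paymentQueue(_ queue: SKPaymentQueue, updatedTransactions transactions: [SKPaymentTransaction]) {
        // Handle purchases, restores, and failures...
    }

    func paymentQueue(_ queue: SKPaymentQueue,
                      didRevokeEntitlementsForProductIdentifiers productIdentifiers: [String]) {
        // Where I'd expect to revoke access after any refund, but in my testing
        // this isn't called when I refund my own purchase in the transaction manager.
        for identifier in productIdentifiers {
            revokeAccess(to: identifier) // hypothetical helper
        }
    }

    private func revokeAccess(to productIdentifier: String) {
        // Update local entitlement state here.
    }
}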
Posted by Jordan. Last updated.
Post not yet marked as solved
0 Replies
881 Views
The documentation doesn’t explain how to add an app icon for an Action extension. The Human Interface Guidelines state it should be a template image centered in about a 70pt asset. People on Stack Overflow suggest adding an AppIcon set to the extension's asset catalog, filling the 60pt iPhone App slots, and setting Asset Catalog App Icon Set Name to AppIcon in Build Settings. That works… even on iPad, but it's not 70x70 and you get warnings about not having 76, 83.5, and 1024pt app icons. :/

I filed FB7850720: Include Action Extension app icon asset and/or document how to add one.
Posted by Jordan. Last updated.
Post marked as solved
2 Replies
685 Views
I would like to build a feature in my app that requires knowing the name and/or address of the locations where photos in the user's library were taken. It appears there's no Photos API to get details about a photo's location beyond PHAsset.location, which gives you latitude and longitude. I can reverse geocode that to get the name, address, etc. The problem is that geocoding requests are rate limited. For my use case, users select the photos they want (possibly hundreds), then tap a button to process those photos in a short amount of time.

My questions:

1. What is the limit, and is it per app or per user? If one person processes x number of photos and hits the limit, can no one else in the world process photos, or just that one person? If it's per person, I can limit how many photos they can process at once to stay within the usage limits.
2. Is there any solution you see to implement this and avoid hitting the limit?
3. Is there any possibility of getting more location info from the PHAsset so I don’t need to reverse geocode at all? The Photos app shows a location name for each photo in the nav bar, but there's no API to get that.

Thanks!
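For context, my current plan is to reverse geocode the selected photos one at a time, roughly like the sketch below, which is exactly where the rate-limit worry comes from:

import Photos
import CoreLocation

let geocoder = CLGeocoder()

// Reverse geocode assets serially; CLGeocoder handles one request at a time.
func reverseGeocode(_ assets: [PHAsset], results: [CLPlacemark?] = [],
                    completion: @escaping ([CLPlacemark?]) -> Void) {
    guard let asset = assets.first else {
        completion(results)
        return
    }
    let remaining = Array(assets.dropFirst())

    guard let location = asset.location else {
        reverseGeocode(remaining, results: results + [nil], completion: completion)
        return
    }

    geocoder.reverseGeocodeLocation(location) { placemarks, _ in
        reverseGeocode(remaining, results: results + [placemarks?.first], completion: completion)
    }
}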
Posted by Jordan. Last updated.
Post not yet marked as solved
0 Replies
497 Views
I'm looking into using a cloud Linux machine to create and train my machine learning drawing classifier model with Turi Create. If I use my MacBook Pro it'll take at least 55 hours, so I want to offload that work and speed it up dramatically, allowing me to iterate on the model to get the best result. There are a lot of options for RAM, CPU, and GPU capabilities. What will Turi Create take advantage of? For example, if I run it on a machine with 96 CPU cores, will it use them all and possibly bring the time down to minutes rather than days, or would a better GPU be preferable? How much RAM would be good? I'll be training it to recognize 6,500 classes with 20 images of each. Thanks!
Posted by Jordan. Last updated.