
Using San Francisco in an app
I'm developing an app that allows users to select a font and type text on top of images, similar to Skitch and others. To allow users to change the font, I have an array of font names which I display in table view cells. I provide a font preview by simply using UIFont(name:size:) and providing the name, for example "Helvetica Neue."

Now with iOS 9 I want to add support for San Francisco. The problem is, initializing a UIFont with "San Francisco" returns nil. In fact, San Francisco doesn't even appear when printing out all available fonts.

I could check whether the app is running on iOS 9 and, if so, include "San Francisco" in the array and use UIFont.systemFontOfSize when San Francisco is selected, but that's a really poor and dirty solution. How can I do this more appropriately?
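For reference, a minimal sketch of the fallback described above (the sentinel entry and helper are hypothetical; since, as noted, initializing a UIFont named "San Francisco" returns nil, the system font is substituted when that entry is selected):

import UIKit

// Hypothetical sentinel used in the font-name array.
let systemFontEntry = "System (San Francisco)"

func previewFont(named name: String, size: CGFloat) -> UIFont {
    if name == systemFontEntry {
        return UIFont.systemFont(ofSize: size) // San Francisco on iOS 9 and later
    }
    return UIFont(name: name, size: size) ?? UIFont.systemFont(ofSize: size)
}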
Replies: 4 · Boosts: 0 · Views: 5.8k · Jun ’15
Most performant method to fetch just images from a collection
In the WWDC session "What's New in Photos APIs" it was recommended that we avoid using NSPredicate in custom fetch options whenever possible, for performance reasons. In my app, I'm using NSPredicate to get only images from the user's library. I'm not seeing an API that would let me get assets from a specific collection filtered to just images without using NSPredicate, though. Is there a more efficient way to perform this query that I'm not seeing?

let photoLibraryFetchResult = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)
let assetCollection = photoLibraryFetchResult.firstObject!
let fetchOptions = PHFetchOptions()
fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
let fetchResults = PHAsset.fetchAssets(in: assetCollection, options: fetchOptions)
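For comparison, a predicate-free fetch by media type does exist, but only across the whole library rather than a specific collection, which is exactly the gap in question (a sketch):

import Photos

// Fetches every image asset in the library without an NSPredicate,
// but cannot be scoped to a particular PHAssetCollection.
let allImages = PHAsset.fetchAssets(with: .image, options: nil)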
Replies: 3 · Boosts: 0 · Views: 2.3k · Jun ’17
Using SMJobBless in modern Xcode
I am trying to figure out how to programmatically install a per-user launchd agent. I have an executable Swift script I wrote, and I need macOS to ensure it is always running. I found the SMJobBless sample code, which I could play with to see how this works, but it hasn't been updated since it was last built with Xcode 4.6. As you can imagine, it doesn't compile in Xcode 10. I was able to get it to build by upgrading to the recommended project settings, increasing the deployment target, and selecting my team for the two targets.

Following the ReadMe, I need to run ./SMJobBlessUtil.py setreq to configure the Info.plists appropriately. These instructions are out of date, but eskimo was kind enough to provide updated instructions here for finding the .app URL. But when I do this and run the command I receive the following output:

MacBook:SMJobBless Jordan$ ./SMJobBlessUtil.py setreq /Users/Jordan/Library/Developer/Xcode/DerivedData/SMJobBless-dffakkidazmiowcishyrborysygm/Build/Products/Debug/SMJobBlessApp.app SMJobBlessApp/SMJobBlessApp-Info.plist SMJobBlessHelper/SMJobBlessHelper-Info.plist
Traceback (most recent call last):
  File "./SMJobBlessUtil.py", line 424, in <module>
    main()
  File "./SMJobBlessUtil.py", line 418, in main
    setreq(appArgs[1], appArgs[2], appArgs[3:])
  File "./SMJobBlessUtil.py", line 360, in setreq
    appToolDict[bundleID] = toolNameToReqMap[bundleID]
KeyError: '$(PRODUCT_BUNDLE_IDENTIFIER)'

It would seem this Python script isn't able to work with the newer project structure, not surprisingly. I wasn't able to find any other information on how to accomplish this task in modern Xcode. So could you please explain how to go about this? 🙂

I have an executable .swift file and a .plist that works when loaded from ~/Library/LaunchAgents/, ready to be added to an existing Xcode project. Thanks!
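For context, a minimal sketch of installing a per-user agent without SMJobBless at all, by copying the plist into ~/Library/LaunchAgents and loading it with launchctl (the label and bundled resource name are hypothetical):

import Foundation

func installAgent() throws {
    // Hypothetical agent label; the matching plist is assumed to be bundled with the app.
    let label = "com.example.myagent"
    let agentsDir = FileManager.default.homeDirectoryForCurrentUser
        .appendingPathComponent("Library/LaunchAgents", isDirectory: true)
    let installedPlist = agentsDir.appendingPathComponent("\(label).plist")

    // Copy the bundled plist into ~/Library/LaunchAgents...
    try FileManager.default.createDirectory(at: agentsDir, withIntermediateDirectories: true)
    if let bundled = Bundle.main.url(forResource: label, withExtension: "plist") {
        try? FileManager.default.removeItem(at: installedPlist) // replace any previous copy
        try FileManager.default.copyItem(at: bundled, to: installedPlist)
    }

    // ...and ask launchd to load it.
    let launchctl = Process()
    launchctl.executableURL = URL(fileURLWithPath: "/bin/launchctl")
    launchctl.arguments = ["load", installedPlist.path]
    try launchctl.run()
    launchctl.waitUntilExit()
}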
Replies: 3 · Boosts: 0 · Views: 2.9k · Jan ’19
pixelFormat (11) is not a valid MTLPixelFormat
I'm receiving this fatal exception in iOS 13 beta 1 when using my Core ML image classification model:

MTLDebugValidateMTLPixelFormat, line 1388: error 'pixelFormat (11) is not a valid MTLPixelFormat.'
MTLDebugValidateMTLPixelFormat:1388: failed assertion `pixelFormat (11) is not a valid MTLPixelFormat.'

guard let drawingImage = canvasView.currentDrawing.rasterized()?.cgImage,
      let model = try? VNCoreMLModel(for: Symbols().model) else { return }
let classificationRequest = VNCoreMLRequest(model: model) { [weak self] (request, error) in
    self?.processClassifications(for: request, error: error)
}
classificationRequest.imageCropAndScaleOption = .centerCrop
let handler = VNImageRequestHandler(cgImage: drawingImage)
try? handler.perform([classificationRequest]) // crashes here
Replies: 2 · Boosts: 0 · Views: 2.4k · Jun ’19
PHAsset.location and CLLocation.reverseGeocode usage limitations
I would like to build a feature in my app that requires knowing the name and/or address of the locations where photos in the user's library were taken. It appears there's no Photos API to get details about a location for each photo beyond PHAsset.location, which gives you latitude and longitude. I can reverse geocode that to get the name, address, etc. The problem is geocoding requests are rate limited. For my use case, users select the photos they want (possibly hundreds), then tap a button to process those photos in a short amount of time. My questions:

What is the limit, and is it per app or per user? If one person processes x number of photos and hits the limit, can no one else in the world process photos, or just that one person? If it's per person, I can limit how many photos they can process at once to stay within the usage limits.

Is there any solution you see to implement this and avoid hitting the limit?

Is there any possibility of getting more location info from the PHAsset so I don't need to reverse geocode at all? The Photos app shows a location name for each photo in the nav bar, but there's no API to get that.

Thanks!
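For context, the reverse-geocoding path being described looks roughly like this (a sketch; the open question above is how many such requests can be made before hitting the limit):

import Photos
import CoreLocation

let geocoder = CLGeocoder()

// Reverse geocode a single asset's location into a human-readable place name.
func placeName(for asset: PHAsset, completion: @escaping (String?) -> Void) {
    guard let location = asset.location else { return completion(nil) }
    geocoder.reverseGeocodeLocation(location) { placemarks, _ in
        completion(placemarks?.first?.name)
    }
}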
Replies: 2 · Boosts: 0 · Views: 850 · Jun ’20
Turi Create resource utilization
I'm looking into using a cloud Linux machine to create and train my machine learning drawing classifier model with Turi Create. If I use my MacBook Pro it'll take at least 55 hours, so I want to offload that work and speed it up dramatically, allowing me to iterate on the model to get the best result. There are a lot of options for RAM, CPU, and GPU capabilities. What will Turi Create take advantage of? For example, if I run it on a machine with 96 CPU cores, will it utilize them all and possibly speed training up to minutes rather than days, or would a better GPU be preferred? How much RAM would be good? I'll be training it to recognize 6500 classes with 20 images of each. Thanks!
Replies: 0 · Boosts: 0 · Views: 569 · Jun ’20
How to add an app icon for your action extension?
The documentation doesn't explain how. The Human Interface Guidelines state it should be a template image centered in about a 70pt asset. People on Stack Overflow suggest adding an AppIcon set to the asset catalog, putting the icons in the 60pt iPhone App slots, and setting Asset Catalog App Icon Set Name to AppIcon in Build Settings. That works, even on iPad, but it's not 70x70 and you get warnings about missing 76, 83.5, and 1024pt app icons. :/ I filed FB7850720, "Include Action Extension app icon asset and/or document how to add one."
Replies: 0 · Boosts: 0 · Views: 980 · Jul ’20
Change volume of YouTube video playing in WKWebView
We are creating a watch party app that allows you to video chat with your friends and play a YouTube video at the same time. The video is played using Google's youtube-ios-player-helper library, which uses a WKWebView with their iframe API, as that's the only way to play it without violating the Terms of Service. We need the ability to change the volume of the YouTube video separately from the video chat, so you can hear your friends over the video, for example.

Unfortunately it's not possible to change the volume directly, because iOS does not support changing the volume via JavaScript (https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW10), unlike macOS. Setting volume doesn't do anything and getting it always returns 1. Users can change the volume with the hardware buttons, but that applies to all audio, including the video chat, not just the YouTube video.

Someone found a workaround (https://stackoverflow.com/a/37315071/1795356) to get the underlying AVPlayer and change its volume natively. This worked with UIWebView but does not work now that the library uses WKWebView.

What can be done to change the volume of the YouTube video?
Replies: 1 · Boosts: 0 · Views: 1.8k · Sep ’20
Clarification on when paymentQueue(_:didRevokeEntitlementsForProductIdentifiers:) is called
Hello! The docs for paymentQueue(_:didRevokeEntitlementsForProductIdentifiers:) state:

The system calls this delegate method whenever App Store revokes in-app purchases for a family member based on changes in Family Sharing ... StoreKit also calls this method when a purchaser receives a refund for a non-consumable or an auto-renewable subscription. Only the family members' devices receive this message.

That last sentence is a bit confusing to me. If I'm reading it correctly, this means the method will not be called on the purchaser's devices, but it will be called on the devices of the other family members. Is that true? I don't understand why this would be called for family members but not the person who purchased it.

Similarly, if Family Sharing is not enabled and someone purchases a non-consumable IAP, then requests a refund and Apple grants it, it sounds like this method is not called. So it seems it is not possible to detect when a customer has been refunded and revoke their access to the product (unless you have a server listening for refund notifications). Is that correct?

In the wwdc20-10659 session, the presenter refunds a non-consumable purchase in the transaction manager and says "the transaction also remains in the app's receipt but is updated to contain a cancelation date for when the refund occurred." I believe this is only the case for subscriptions, not non-consumables, right? It should be removed from the receipt for non-consumables. He continues: "Again my app is informed about the refund and is able to respond immediately." This seems to contradict my understanding from the docs noted above. And in my testing, paymentQueue(_:didRevokeEntitlementsForProductIdentifiers:) is not called when I refund the purchase in the transaction manager. Is that expected behavior?

It seems to me this method should be called in every refund scenario so you can revoke access, so if you can clarify this I'd really appreciate it. Thank you!
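For reference, the method under discussion is an optional method of SKPaymentTransactionObserver; a minimal sketch of handling it (the revokeAccess(to:) helper is hypothetical app logic):

import StoreKit

final class StoreObserver: NSObject, SKPaymentTransactionObserver {
    // Required by the protocol; purchase and restore handling would go here.
    func paymentQueue(_ queue: SKPaymentQueue, updatedTransactions transactions: [SKPaymentTransaction]) {
    }

    // Called when the App Store revokes entitlements, per the scenarios discussed above.
    func paymentQueue(_ queue: SKPaymentQueue, didRevokeEntitlementsForProductIdentifiers productIdentifiers: [String]) {
        productIdentifiers.forEach { revokeAccess(to: $0) }
    }

    // Hypothetical: lock the content associated with this product in the app.
    private func revokeAccess(to productIdentifier: String) {
    }
}

// Usage: keep a strong reference to an observer instance and register it early,
// e.g. SKPaymentQueue.default().add(observer)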
Replies: 0 · Boosts: 0 · Views: 764 · Dec ’20
Is it possible to get a song's bit rate from MPMediaItem?
I'd like to get a song's bit rate, for example 256 kbps, from an MPMediaItem retrieved via MPMediaPickerController. Is this possible? I tried to get it via:

AVAsset(url: mediaItem.assetURL).tracks.first?.estimatedDataRate

but this is 0 for most songs I've tried, and it's 127999 for a song that's really 64 kbps. I can get the sample rate of 44100 via:

let trackDescription = AVAsset(url: url).tracks.first?.formatDescriptions.first
let basicDescription = CMAudioFormatDescriptionGetStreamBasicDescription(trackDescription as! CMAudioFormatDescription)?.pointee
let sampleRate = basicDescription?.mSampleRate

Supposedly one can calculate the bit rate given the sample rate, bit depth, and channel count, but I'm seeing mBitsPerChannel is always 0 in my testing.
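For what it's worth, the calculation alluded to above would look something like this (a sketch; as noted, it breaks down when mBitsPerChannel is 0, which is typical for compressed formats such as AAC that have no fixed bit depth):

import AVFoundation

// bit rate = sample rate × bits per sample × channel count,
// which is only meaningful for uncompressed (PCM) audio.
func computedBitRate(for asset: AVAsset) -> Double? {
    guard let track = asset.tracks(withMediaType: .audio).first,
          let formatDescription = track.formatDescriptions.first,
          let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription as! CMAudioFormatDescription)?.pointee,
          asbd.mBitsPerChannel > 0 else { return nil }
    return asbd.mSampleRate * Double(asbd.mBitsPerChannel) * Double(asbd.mChannelsPerFrame)
}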
Replies: 0 · Boosts: 0 · Views: 714 · Jun ’21
How to get "file" information from MPMediaItem
The macOS Music app shows various info about a song in the Get Info window. Most of this metadata is available in the iOS SDK via MPMediaItem. I want to access the information displayed in the File tab, but I'm not seeing several pieces of data in the API. Is this possible?

□ Kind - Apple Music AAC audio file - ?
☑︎ Duration - 3:00 - playbackDuration
□ Size - 10 MB - ?
□ Bit rate - 256 kbps - ?
□ Sample rate - 44.100 kHz - ?
□ Date modified - 1/1/2001 - ?
☑︎ Date added - 1/1/2001 - dateAdded
□ Cloud status - Apple Music - ?
☑︎ Location - Cloud - isCloudItem
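For reference, the three fields above that do map onto MPMediaItem can be read like this (a sketch):

import MediaPlayer

// The only File-tab fields with direct MPMediaItem equivalents.
func knownFileInfo(for item: MPMediaItem) -> (duration: TimeInterval, dateAdded: Date, isCloudItem: Bool) {
    (item.playbackDuration, item.dateAdded, item.isCloudItem)
}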
Replies: 2 · Boosts: 0 · Views: 1.2k · Jun ’21
How to migrate NSPersistentCloudKitContainer to App Group to access in Widget
I have shipped an app that uses Core Data with CloudKit via NSPersistentCloudKitContainer. I now want to add a widget that can query the current data to display. It's my understanding you need to migrate the store to a new location available to a shared App Group. How do you do this?

container = NSPersistentCloudKitContainer(name: "AppName")
container.loadPersistentStores { description, error in
    // handle error
}
container.viewContext.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
container.viewContext.automaticallyMergesChangesFromParent = true
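A rough sketch of what pointing the container at an App Group location might look like (the group and CloudKit container identifiers are hypothetical, and any existing on-device store would still need a one-time migration to the new URL before switching over):

import CoreData

let container = NSPersistentCloudKitContainer(name: "AppName")

// Store the database inside the shared App Group so the widget extension can open it too.
let storeURL = FileManager.default
    .containerURL(forSecurityApplicationGroupIdentifier: "group.com.example.AppName")!
    .appendingPathComponent("AppName.sqlite")

let description = NSPersistentStoreDescription(url: storeURL)
description.cloudKitContainerOptions = NSPersistentCloudKitContainerOptions(containerIdentifier: "iCloud.com.example.AppName")
container.persistentStoreDescriptions = [description]

container.loadPersistentStores { _, error in
    // handle error
}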
Replies: 1 · Boosts: 1 · Views: 1.2k · Jun ’21