Posts

Post not yet marked as solved
0 Replies
214 Views
Hi. I am measuring the performance of methods in unit tests in Xcode. Measuring performance on a release build is important because it tells you whether a performance issue will be resolved by the compiler or not. However, when you build for release, you cannot use @testable. Please tell me how to handle this. Thanks.
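For illustration, a minimal sketch of measuring performance without @testable, assuming a hypothetical MyModule with a public sort(_:) function; only the module's public API is exercised, so the test also works against a release build:

import XCTest
import MyModule  // plain import: only the public API is visible, no @testable needed

final class SortPerformanceTests: XCTestCase {
    func testSortPerformance() {
        let input = (0..<100_000).map { _ in Int.random(in: 0...1_000_000) }
        // measure {} runs the block several times and reports the average duration.
        measure {
            _ = MyModule.sort(input)  // hypothetical public function under test
        }
    }
}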
Posted by y_ich. Last updated.
Post not yet marked as solved
4 Replies
282 Views
Hi. I want to implement the code below using vDSP.

for i in a.indices {
    a[i] = n[i] == 0.0 ? 0.0 : b[i] / n[i]
}

This code is slow. Is there a good implementation using the Accelerate framework?
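For reference, a minimal sketch of one possible Accelerate approach (a guess, not a confirmed answer): divide with vDSP, then zero out the elements where the divisor was zero. The example arrays are placeholders.

import Accelerate

let b: [Double] = [1.0, 2.0, 3.0, 4.0]
let n: [Double] = [2.0, 0.0, 4.0, 0.0]
var a = [Double](repeating: 0.0, count: b.count)

// vDSP_vdivD computes C = A / B element-wise; the divisor B is the *first* argument,
// so this gives a[i] = b[i] / n[i].
vDSP_vdivD(n, 1, b, 1, &a, 1, vDSP_Length(a.count))

// Division by zero yields inf/nan; replace those with 0 to match the original code.
// (This pass could also be vectorized, e.g. by building a mask from n.)
for i in a.indices where !a[i].isFinite {
    a[i] = 0.0
}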
Posted by y_ich. Last updated.
Post not yet marked as solved
0 Replies
701 Views
Hi. The A17 Pro Neural Engine has 35 TOPS of computational power, but many third-party benchmarks and articles suggest that it is only a little faster than the A16 Bionic. Some references: Geekbench ML; Core ML performance benchmark, 2023 edition. How do we use the full power of the A17 Pro Neural Engine? For example, I guess the ANE on the A17 Pro may expose two logical devices rather than one, so we may need to instantiate two Core ML models simultaneously to use it fully. Please let me know any technical hints.
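A minimal sketch of that "two simultaneous models" guess, purely as an experiment and not a confirmed technique; compiledModelURL and makeInputFeatures() are placeholders:

import Foundation
import CoreML

let config = MLModelConfiguration()
config.computeUnits = .all  // allow Core ML to schedule work on the Neural Engine

// Two independent model instances, loaded from the same compiled model (placeholder URL).
let modelA = try MLModel(contentsOf: compiledModelURL, configuration: config)
let modelB = try MLModel(contentsOf: compiledModelURL, configuration: config)

// Run predictions concurrently to probe whether ANE throughput scales with two models.
DispatchQueue.concurrentPerform(iterations: 2) { i in
    let model = (i == 0) ? modelA : modelB
    _ = try? model.prediction(from: makeInputFeatures())  // hypothetical input provider
}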
Posted by y_ich. Last updated.
Post not yet marked as solved
1 Reply
273 Views
Hi. I want to implement a template selector like the one in Pages and Numbers. Currently I am using a DocumentGroup scene in SwiftUI. How can I implement it?

init(
    newDocument: @autoclosure @escaping () -> Document,
    @ViewBuilder editor: @escaping (FileDocumentConfiguration<Document>) -> Content
)

The initializer of DocumentGroup may suggest that the newDocument argument should open the template selector and return a document when the selector is closed, but I think that would become a complicated implementation. What is the right way to implement the template selector?
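One possible approach, sketched below and not necessarily the "right" one: create the new document from a blank template, then present a template picker as a sheet when an empty document first appears. MyDocument, DocumentEditor, TemplatePicker, isEmpty, and apply(_:) are hypothetical.

import SwiftUI

@main
struct TemplateApp: App {
    var body: some Scene {
        DocumentGroup(newDocument: MyDocument()) { file in
            EditorView(document: file.$document)
        }
    }
}

struct EditorView: View {
    @Binding var document: MyDocument
    @State private var showingTemplatePicker = false

    var body: some View {
        DocumentEditor(document: $document)  // hypothetical editor view
            // Show the picker only when a freshly created, empty document appears.
            .onAppear { showingTemplatePicker = document.isEmpty }
            .sheet(isPresented: $showingTemplatePicker) {
                TemplatePicker { template in
                    document.apply(template)  // hypothetical method that fills in the template
                    showingTemplatePicker = false
                }
            }
    }
}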
Posted by y_ich. Last updated.
Post not yet marked as solved
3 Replies
2.0k Views
Hi. I implemented a broadcast upload extension that posts local notifications. The local notification works normally from broadcastStarted(withSetupInfo:), but the banner does not appear when posting from processSampleBuffer(_:with:), although the notification still shows up in Notification Center. What am I missing? Here are my code snippets.

Container app:

import UIKit
import UserNotifications

class AppDelegate: NSObject, UIApplicationDelegate, UNUserNotificationCenterDelegate {
    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Override point for customization after application launch.
        requestAuthorization()
        ...
    }

    private func requestAuthorization() {
        let center = UNUserNotificationCenter.current()
        center.requestAuthorization(options: [.alert]) { granted, error in
            if let error = error {
                // Handle the error here.
                print(error)
            }
            if granted == true {
                center.delegate = self
                center.getNotificationSettings(completionHandler: { setting in
                    print(setting)
                })
            } else {
                print("not permitted")
            }
        }
    }
}

Upload extension:

import ReplayKit
import UserNotifications

class SampleHandler: RPBroadcastSampleHandler {
    override func broadcastStarted(withSetupInfo setupInfo: [String : NSObject]?) {
        super.broadcastStarted(withSetupInfo: setupInfo)
        notification(title: "Upload Extension", body: "broadcastStarted")
        ...
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        super.processSampleBuffer(sampleBuffer, with: sampleBufferType)
        ...
        if some condition {
            notification(title: "Upload Extension", body: "processSampleBuffer")
        }
    }

    private func notification(title: String, body: String) {
        let content = UNMutableNotificationContent()
        content.title = title
        content.body = body
        let request = UNNotificationRequest(identifier: UUID().uuidString, content: content, trigger: nil)
        let notificationCenter = UNUserNotificationCenter.current()
        notificationCenter.add(request) { error in
            if let error = error {
                print(error)
            }
        }
    }
}
Posted by y_ich. Last updated.
Post not yet marked as solved
2 Replies
390 Views
Hi. I know that UIApplication#keyWindow has been deprecated since iOS/iPadOS 13.0. But now (iOS/iPadOS 16.5), UIWindow#isKeyWindow and didBecomeKeyNotification seem to be deprecated, too: isKeyWindow is always true, and didBecomeKeyNotification is never posted. Is my understanding right? And if so, as of which version of iOS/iPadOS were they deprecated?
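For reference, a minimal sketch of the scene-based replacement for UIApplication.keyWindow, assuming iOS 15 or later for UIWindowScene.keyWindow:

import UIKit

// Find the key window via the connected window scenes instead of UIApplication.keyWindow.
func currentKeyWindow() -> UIWindow? {
    UIApplication.shared.connectedScenes
        .compactMap { $0 as? UIWindowScene }
        .first { $0.activationState == .foregroundActive }?
        .keyWindow  // UIWindowScene.keyWindow, available since iOS 15
}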
Posted by y_ich. Last updated.
Post not yet marked as solved
2 Replies
1.4k Views
Hi. Excuse me for not having reproduction code. My app on macOS Monterey calls MPSGraph#run repeatedly. After a few minutes, the Xcode console shows "Context leak detected, CoreAnalytics returned false" repeatedly and the system slows down. Do I need to release some resource on each call to the run method? Thanks.
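One common mitigation for memory growth in tight MPS loops, sketched below as an assumption rather than a confirmed fix for this warning: wrap each run in an autoreleasepool so transient Metal/MPS objects created per iteration are released promptly. iterationCount, graph, feeds, and targets are placeholders.

import MetalPerformanceShadersGraph

for _ in 0..<iterationCount {
    autoreleasepool {
        // Each run produces transient MPSGraphTensorData and Metal objects;
        // the pool lets them be released at the end of every iteration.
        let results = graph.run(feeds: feeds,
                                targetTensors: targets,
                                targetOperations: nil)
        _ = results
    }
}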
Posted by y_ich. Last updated.
Post marked as solved
1 Reply
664 Views
Hi. I am implementing a neural network model with MPSGraph on a Mac with a Radeon GPU. I want to accelerate it with float16, since Radeon GPUs can execute float16 kernels twice as fast as float32. Is this possible? I mean, does MPSGraph support native float16 on Radeon GPUs? If so, how do I do it? By setting all data types to float16? Thanks.
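A minimal sketch of building a graph with float16 tensors; whether the Radeon GPU actually runs these kernels in native half precision is up to MPSGraph. Shapes and names are placeholders.

import MetalPerformanceShadersGraph

let graph = MPSGraph()
let x = graph.placeholder(shape: [1, 1024], dataType: .float16, name: "x")
let w = graph.placeholder(shape: [1024, 1024], dataType: .float16, name: "w")
let y = graph.matrixMultiplication(primary: x, secondary: w, name: "matmul")
// The MPSGraphTensorData fed at run time would likewise need to use .float16.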
Posted by y_ich. Last updated.
Post not yet marked as solved
0 Replies
617 Views
Hi. I want my iPad app to run on Intel and M1 Macs. I tried Mac Catalyst and it basically seems to work. But I want the app to allow closing a window even while it is presenting a modal. NSWindow has a "preventsApplicationTerminationWhenModal" property. Are there any alternatives for Mac Catalyst or for an iPad app running on an M1 Mac? Thanks.
Posted by y_ich. Last updated.
Post not yet marked as solved
3 Replies
1.1k Views
Hi. I tried to use Live Listen with my Beats headset but failed. It seems that the Live Listen feature is enabled only for AirPods (Pro) or Made for iPhone hearing aids. I also tried to switch the input device to the built-in mic with setPreferredInput in Swift Playgrounds. That succeeded, but it also switched the output to the built-in speaker. Since I want to keep the output on the headset, I failed to make my own Live Listen. Is the Live Listen setting a private API? Or is it possible to make my own Live Listen app if I have AirPods? Thanks.
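For reference, a sketch of the routing experiment described above (an experiment, not a working Live Listen): select the built-in microphone as input while a Bluetooth headset is connected. The category options are an assumption; in practice, as noted above, this can also move the output to the built-in speaker.

import AVFoundation

let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord, options: [.allowBluetoothA2DP])
try session.setActive(true)

// Prefer the built-in microphone as the input route.
if let builtInMic = session.availableInputs?.first(where: { $0.portType == .builtInMic }) {
    try session.setPreferredInput(builtInMic)
}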
Posted by y_ich. Last updated.
Post not yet marked as solved
0 Replies
462 Views
Hi. In the sample code "Offering, Completing, and Restoring In-App Purchases" (https://developer.apple.com/documentation/storekit/in-app_purchase/offering_completing_and_restoring_in-app_purchases), there is cleanup code in the applicationWillTerminate method:

func applicationWillTerminate(_ application: UIApplication) {
    // Remove the observer.
    SKPaymentQueue.default().remove(StoreObserver.shared)
}

But the applicationWillTerminate method is not called when the app is suspended, according to the documentation (https://developer.apple.com/documentation/uikit/uiapplicationdelegate/1623111-applicationwillterminate). What would happen to StoreObserver.shared if the suspended app is terminated? Thanks.
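For context, a sketch of the add-observer side of that pattern, as the StoreKit documentation recommends doing at app launch. If the system terminates a suspended app, applicationWillTerminate is not called, but the whole process (including the in-memory observer) goes away with it; unfinished transactions stay in the payment queue and are delivered again once an observer is added on the next launch.

import UIKit
import StoreKit

class AppDelegate: NSObject, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Attach the observer as early as possible so pending transactions are delivered.
        SKPaymentQueue.default().add(StoreObserver.shared)
        return true
    }
}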
Posted by y_ich. Last updated.
Post not yet marked as solved
0 Replies
475 Views
Hi. When Catalyst was released, I understood that it consisted of two parts: compiling for Intel CPUs and running UIKit on macOS. So you need to add the "Mac" target and Mac.entitlements in Xcode to run your app on Catalyst. Now M1 Macs have appeared, and they run iPhone/iPad apps without the above recompilation when the apps are distributed.
1. In this case, do iPhone/iPad apps still run on Catalyst? (I mean: do macOS-specific features such as UIHoverGestureRecognizer work on an M1 Mac if you implement them?)
2. Why don't you need Mac.entitlements?
These two questions are just what I came up with; a detailed summary is welcome. Thanks.
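For reference, a minimal sketch for distinguishing the two cases at runtime: ProcessInfo.isiOSAppOnMac is true when the unmodified iOS binary runs on an Apple silicon Mac, while isMacCatalystApp is true for a Mac Catalyst build (and also reports true for iOS apps running on a Mac, so check isiOSAppOnMac first).

import Foundation

let info = ProcessInfo.processInfo
if info.isiOSAppOnMac {
    print("iOS/iPadOS app running unmodified on an Apple silicon Mac")
} else if info.isMacCatalystApp {
    print("Built with Mac Catalyst")
} else {
    print("Running on an iPhone or iPad")
}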
Posted by y_ich. Last updated.