I would like to get an idea of whether anyone is actually using Multipeer Connectivity in a recent production iOS app.
I’ve burned several days’ worth of my client’s money getting Multipeer set up, but the invitation system just isn’t anywhere near reliable enough to ship. The use case is trivial: it’s just a chat room.
I assumed I was doing something wrong, so I went looking for examples on GitHub, and my code looked much the same as everyone else’s. Eventually I found the MultipeerGroupChat sample app (https://developer.apple.com/library/archive/samplecode/MultipeerGroupChat/Introduction/Intro.html#//apple_ref/doc/uid/DTS40013691), and after trying it on 3 devices I confirmed that the invite flow is broken for it too.
The sample app uses Apple’s UI helpers (MCBrowserViewController and MCAdvertiserAssistant), whereas my app has a custom UI and uses the lower-level MCNearbyServiceAdvertiser and MCNearbyServiceBrowser, but I can see the underlying issues are the same.
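For reference, my custom setup is roughly the following (simplified; the class name and service type are illustrative):

import MultipeerConnectivity
import UIKit

final class ChatRoomSession: NSObject, MCSessionDelegate, MCNearbyServiceAdvertiserDelegate, MCNearbyServiceBrowserDelegate {
    private let myPeerID = MCPeerID(displayName: UIDevice.current.name)
    private lazy var session = MCSession(peer: myPeerID, securityIdentity: nil, encryptionPreference: .required)
    private lazy var advertiser = MCNearbyServiceAdvertiser(peer: myPeerID, discoveryInfo: nil, serviceType: "chat-room")
    private lazy var browser = MCNearbyServiceBrowser(peer: myPeerID, serviceType: "chat-room")

    func start() {
        session.delegate = self
        advertiser.delegate = self
        browser.delegate = self
        advertiser.startAdvertisingPeer()
        browser.startBrowsingForPeers()
    }

    // Browser side: invite every peer we discover.
    func browser(_ browser: MCNearbyServiceBrowser, foundPeer peerID: MCPeerID, withDiscoveryInfo info: [String: String]?) {
        browser.invitePeer(peerID, to: session, withContext: nil, timeout: 30)
    }

    func browser(_ browser: MCNearbyServiceBrowser, lostPeer peerID: MCPeerID) {}

    // Advertiser side: accept every invitation into the shared session.
    func advertiser(_ advertiser: MCNearbyServiceAdvertiser, didReceiveInvitationFromPeer peerID: MCPeerID, withContext context: Data?, invitationHandler: @escaping (Bool, MCSession?) -> Void) {
        invitationHandler(true, session)
    }

    // MCSessionDelegate — the .connecting state is where peers get stuck.
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, withError error: Error?) {}
}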
Here’s one example which fails reliably for me:
Boot the sample on 3 devices
Open the browser on one device and invite the others
All 3 are connected and can send messages successfully
Send one device to the home screen or lock it, and wait until the others show it as disconnected (or just force quit it)
Bring the app back up on that device
Now try to reconnect it to the other two
No matter which device(s) you send the invites from (either the one that returned, or one of the others), you’ll get into a mess
The invite is (usually) delivered, but then one or both devices involved will get stuck in the 'connecting' state
There’s no way out, except to disconnect the session and start again
I genuinely want to know whether this framework is in use in real apps and whether there are workarounds for the reliability issues, or whether it’s just something with a bunch of tutorials online that nobody really uses.
My app is using RemoteIO to record audio. It doesn’t do any playback. RemoteIO seems to be broadly compatible with the new Sound Recognition feature in iOS 14, but I’m seeing a glitch when Sound Recognition is first enabled.
If my app is started and I initialise RemoteIO, and then turn on Sound Recognition (say via Control Centre), the RemoteIO input callback is never called again until I tear down the audio unit and set it back up. So something like the following:
Launch app
RemoteIO is initialised and working, can record
Turn on Sound Recognition via Settings or control centre widget
Start recording with already-set up RemoteIO
The recording callback is never called again
Although no input callbacks are seen, kAudioOutputUnitProperty_IsRunning is reported as true, so the audio unit thinks it is active
Tear down audio unit
Set up audio unit again
Recording works
The buffer size has changed, suggesting that Sound Recognition has had some effect on the audio session
I also noticed that when Sound Recognition is enabled, I see several (usually three) AVAudioSession.routeChangeNotification notifications in quick succession. Disabling Sound Recognition while RemoteIO is set up doesn’t cause this problem. I’m allocating my own buffers, so it isn’t a problem with their size.
What could be going on here? Am I not handling a route change properly? There doesn’t seem to be a reliable sequence of events I can catch to know when to reset the audio unit.
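For reference, I’m watching the route changes mentioned above with something roughly like this (simplified):

import AVFoundation

// Log route changes so I can see what Sound Recognition does to the session.
let routeObserver = NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: nil,
    queue: .main
) { notification in
    let reasonValue = notification.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt
    let reason = reasonValue.flatMap(AVAudioSession.RouteChangeReason.init(rawValue:))
    print("Route change, reason: \(String(describing: reason))")
}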
The only fix I’ve found here is to hack in a timer that checks for callback activity shortly after starting recording, and resets the audio unit if no callback activity is seen. Better than nothing, but not super reliable.
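A sketch of that workaround (simplified; resetAudioUnit stands in for the tear-down and re-setup described above):

import Foundation

final class CallbackWatchdog {
    /// Updated from the RemoteIO input callback every time it fires.
    /// (In the real code this is accessed atomically, since the callback runs on the audio thread.)
    var lastCallbackTime: CFAbsoluteTime = 0

    private var timer: Timer?

    /// Call shortly after starting recording; rebuilds the audio unit if the input callback never fires.
    func startMonitoring(resetAudioUnit: @escaping () -> Void) {
        let startedAt = CFAbsoluteTimeGetCurrent()
        timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: false) { [weak self] _ in
            guard let self = self else { return }
            if self.lastCallbackTime < startedAt {
                // No callbacks since recording started: assume the unit is wedged and rebuild it.
                resetAudioUnit()
            }
        }
    }

    func stopMonitoring() {
        timer?.invalidate()
        timer = nil
    }
}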
I’m getting the following crash when using AVAudioSession with AVAudioEngine. What I don’t understand is why InterruptionListener is listed twice in the stack trace. Does this mean it’s somehow being called again before it has returned? Is this likely to be a concurrency issue?
Crashed: AVAudioSession Notify Thread
0 libEmbeddedSystemAUs.dylib 0x1dbc3333c InterruptionListener(void*, unsigned int, unsigned int, void const*)
1 libEmbeddedSystemAUs.dylib 0x1dbc33270 InterruptionListener(void*, unsigned int, unsigned int, void const*)
2 AudioToolbox 0x1c86e6484 AudioSessionPropertyListeners::CallPropertyListeners(unsigned int, unsigned int, void const*) + 596
3 AudioToolbox 0x1c8740798 HandleAudioSessionCFTypePropertyChangedMessage(unsigned int, unsigned int, void*, unsigned int) + 1144
4 AudioToolbox 0x1c873fec0 ProcessDeferredMessage(unsigned int, __CFData const*, unsigned int, unsigned int) + 2452
5 AudioToolbox 0x1c873f17c ASCallbackReceiver_AudioSessionPingMessage + 632
6 AudioToolbox 0x1c87ad398 _XAudioSessionPingMessage + 44
7 libAudioToolboxUtility.dylib 0x1c8840430 mshMIGPerform + 264
8 CoreFoundation 0x1bd42b174 __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE1_PERFORM_FUNCTION__ + 56
9 CoreFoundation 0x1bd42a880 __CFRunLoopDoSource1 + 444
10 CoreFoundation 0x1bd425634 __CFRunLoopRun + 1888
11 CoreFoundation 0x1bd424ba8 CFRunLoopRunSpecific + 424
12 AVFAudio 0x1ca1f4a2c GenericRunLoopThread::Entry(void*) + 156
13 AVFAudio 0x1ca2457a0 CAPThread::Entry(CAPThread*) + 204
14 libsystem_pthread.dylib 0x1bd1c2d98 _pthread_start + 156
15 libsystem_pthread.dylib 0x1bd1c674c thread_start + 8
I’m getting a flood of these errors in a shipping speech recognition app since users started upgrading to iOS 15. The error is usually returned by the speech recogniser a few seconds after recognition begins.
I can’t find any reference to it anywhere in Apple’s documentation. What is it?
Code: 301
Domain: kLSRErrorDomain
Description: Recognition request was canceled
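For context, the error surfaces in the recognition task’s result handler, roughly like this (simplified):

import Speech

let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
let request = SFSpeechAudioBufferRecognitionRequest()

// A few seconds after this task starts, the handler receives the
// kLSRErrorDomain / code 301 error shown above.
let task = recognizer?.recognitionTask(with: request) { result, error in
    if let error = error as NSError? {
        print("Recognition error: \(error.domain) \(error.code): \(error.localizedDescription)")
    }
}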
I’m using AVAudioEngine to get a stream of AVAudioPCMBuffers from the device’s microphone using the usual installTap(onBus:) setup.
To distribute the audio stream to other parts of the program, I’m sending the buffers to a Combine publisher similar to the following:
private let publisher = PassthroughSubject<AVAudioPCMBuffer, Never>()
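Put together, the setup looks roughly like this (simplified):

import AVFoundation
import Combine

final class MicrophoneStream {
    let publisher = PassthroughSubject<AVAudioPCMBuffer, Never>()
    private let engine = AVAudioEngine()

    func start() throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        // The tap block runs on an internal audio thread; the buffer is
        // forwarded to the publisher as-is, without copying.
        input.installTap(onBus: 0, bufferSize: 4096, format: format) { [publisher = self.publisher] buffer, _ in
            publisher.send(buffer)
        }
        engine.prepare()
        try engine.start()
    }
}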
I’m starting to suspect I have some kind of concurrency or memory management issue with the buffers, because when consuming the buffers elsewhere I’m getting a range of crashes that suggest some internal pointer in a buffer is NULL (specifically, I’m seeing crashes in vDSP.convertElements(of:to:) when I try to read samples from the buffer).
These crashes are in production and fairly rare — I can’t reproduce them locally.
I never modify the audio buffers, only read them for analysis.
My question is: should it be possible to put AVAudioPCMBuffers into a Combine pipeline? Does the AVAudioPCMBuffer class not retain/release the underlying AudioBufferList’s memory the way I’m assuming? Is this a fundamentally flawed approach?
I’m seeing a crash that looks to be related to TaskGroup from Swift Concurrency. I can’t catch it in the debugger; it only happens when running on device. It looks like the crash is actually in the Swift Concurrency library, and I’m not sure how to trace from there back to something in my own code...
Some digging so far suggests the issue only happens:
on iOS 15.0–15.3.x (i.e. it doesn’t appear to happen on 15.4)
on older iPhones such as the 6s, 6s Plus, 7, 8, and X
Where do I start trying to figure out the root cause of this?
I’m seeing a crash in production for a small percentage of users, and have narrowed it down based on logging to happening as or very shortly after an alert is presented using SwiftUI.
This seems to be isolated to iOS 17.5.1, but since it’s a low-volume crash I can’t be sure there aren’t other affected versions. What can I understand from the crash report?
Here’s a simplified version of the code that presents the alert; it seems so simple that I can’t understand why it would crash. The crash trace follows it.
// View (simplified)
@MainActor public struct MyView: View {
    @ObservedObject var model: MyViewModel

    public init(model: MyViewModel) {
        self.model = model
    }

    public var body: some View {
        myViewContent
            .overlay(clearAlert)
    }

    var clearAlert: some View {
        EmptyView().alert(
            "Are You Sure?",
            isPresented: $model.isClearAlertVisible,
            actions: {
                Button("Keep", role: .cancel) { model.clearAlertKeepButtonWasPressed() }
                Button("Delete", role: .destructive) { model.clearAlertDeleteButtonWasPressed() }
            },
            message: {
                Text("This cannot be undone.")
            }
        )
    }
}

// Model (simplified)
@MainActor public final class MyViewModel: ObservableObject {
    @Published var isClearAlertVisible = false

    func clearButtonWasPressed() {
        isClearAlertVisible = true
    }

    func clearAlertKeepButtonWasPressed() {
        // No-op.
    }

    func clearAlertDeleteButtonWasPressed() {
        // Calls other code.
    }
}
Incident Identifier: 36D05FF3-C64E-4327-8589-D8951C8BAFC4
Distributor ID: com.apple.AppStore
Hardware Model: iPhone13,2
Process: My App [379]
Path: /private/var/containers/Bundle/Application/B589E780-96B2-4A5F-8FCD-8B34F2024595/My App.app/My App
Identifier: com.me.MyApp
Version: 1.0 (1)
AppStoreTools: 15F31e
AppVariant: 1:iPhone13,2:15
Code Type: ARM-64 (Native)
Role: Foreground
Parent Process: launchd [1]
Coalition: com.me.MyApp [583]
Date/Time: 2024-06-21 20:09:20.9767 -0500
Launch Time: 2024-06-20 18:41:01.7542 -0500
OS Version: iPhone OS 17.5.1 (21F90)
Release Type: User
Baseband Version: 4.50.06
Report Version: 104
Exception Type: EXC_BREAKPOINT (SIGTRAP)
Exception Codes: 0x0000000000000001, 0x00000001a69998c0
Termination Reason: SIGNAL 5 Trace/BPT trap: 5
Terminating Process: exc handler [379]
Triggered by Thread: 0
Kernel Triage:
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
Thread 0 name:
Thread 0 Crashed:
0 libswiftCore.dylib 0x00000001a69998c0 _assertionFailure(_:_:file:line:flags:) + 264 (AssertCommon.swift:144)
1 AttributeGraph 0x00000001d0cd61a4 Attribute.init<A>(body:value:flags:update:) + 352 (Attribute.swift:473)
2 SwiftUI 0x00000001ac034054 closure #1 in Attribute.init<A>(_:) + 128 (<compiler-generated>:0)
3 SwiftUI 0x00000001ac033cac partial apply for closure #1 in Attribute.init<A>(_:) + 32 (<compiler-generated>:0)
4 libswiftCore.dylib 0x00000001a6ad0450 withUnsafePointer<A, B>(to:_:) + 28 (LifetimeManager.swift:128)
5 SwiftUI 0x00000001ad624d14 closure #2 in UIKitDialogBridge.startTrackingUpdates(actions:) + 268 (UIKitDialogBridge.swift:370)
6 SwiftUI 0x00000001ad624ae0 UIKitDialogBridge.startTrackingUpdates(actions:) + 248 (UIKitDialogBridge.swift:369)
7 SwiftUI 0x00000001ad6250cc closure #4 in UIKitDialogBridge.showNewAlert(_:id:) + 72 (UIKitDialogBridge.swift:471)
8 SwiftUI 0x00000001abfdd050 thunk for @escaping @callee_guaranteed () -> () + 36 (:-1)
9 UIKitCore 0x00000001aa5722e4 -[UIPresentationController transitionDidFinish:] + 1096 (UIPresentationController.m:651)
10 UIKitCore 0x00000001aa571d88 __56-[UIPresentationController runTransitionForCurrentState]_block_invoke.114 + 320 (UIPresentationController.m:1390)
11 UIKitCore 0x00000001aa5cb9ac -[_UIViewControllerTransitionContext completeTransition:] + 116 (UIViewControllerTransitioning.m:304)
12 UIKitCore 0x00000001aa34a91c __UIVIEW_IS_EXECUTING_ANIMATION_COMPLETION_BLOCK__ + 36 (UIView.m:16396)
13 UIKitCore 0x00000001aa34a800 -[UIViewAnimationBlockDelegate _didEndBlockAnimation:finished:context:] + 624 (UIView.m:16429)
14 UIKitCore 0x00000001aa349518 -[UIViewAnimationState sendDelegateAnimationDidStop:finished:] + 436 (UIView.m:0)
15 UIKitCore 0x00000001aa356b14 -[UIViewAnimationState animationDidStop:finished:] + 192 (UIView.m:2400)
16 UIKitCore 0x00000001aa356b84 -[UIViewAnimationState animationDidStop:finished:] + 304 (UIView.m:2422)
17 QuartzCore 0x00000001a96f8c50 run_animation_callbacks(void*) + 132 (CALayer.mm:7714)
18 libdispatch.dylib 0x00000001aff61dd4 _dispatch_client_callout + 20 (object.m:576)
19 libdispatch.dylib 0x00000001aff705a4 _dispatch_main_queue_drain + 988 (queue.c:7898)
20 libdispatch.dylib 0x00000001aff701b8 _dispatch_main_queue_callback_4CF + 44 (queue.c:8058)
21 CoreFoundation 0x00000001a808f710 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16 (CFRunLoop.c:1780)
22 CoreFoundation 0x00000001a808c914 __CFRunLoopRun + 1996 (CFRunLoop.c:3149)
23 CoreFoundation 0x00000001a808bcd8 CFRunLoopRunSpecific + 608 (CFRunLoop.c:3420)
24 GraphicsServices 0x00000001ecf3c1a8 GSEventRunModal + 164 (GSEvent.c:2196)
25 UIKitCore 0x00000001aa6c490c -[UIApplication _run] + 888 (UIApplication.m:3713)
26 UIKitCore 0x00000001aa7789d0 UIApplicationMain + 340 (UIApplication.m:5303)
27 SwiftUI 0x00000001ac27c148 closure #1 in KitRendererCommon(_:) + 168 (UIKitApp.swift:51)
28 SwiftUI 0x00000001ac228714 runApp<A>(_:) + 152 (UIKitApp.swift:14)
29 SwiftUI 0x00000001ac2344d0 static App.main() + 132 (App.swift:114)
30 My App 0x00000001001e7bfc static MyApp.$main() + 52 (MyApp.swift:0)
31 My App 0x00000001001e7bfc main + 64
32 dyld 0x00000001cb73de4c start + 2240 (dyldMain.cpp:1298)