My background audio app stops updating its Live Activity after the iPhone locks, and doesn't resume updating the activity after tapping the screen, or even after Face ID unlocks the device (without opening the lock screen).
My Live Activity requests a ContentState update, and iOS updates the content for the activity as shown below:
Task {
    log.debug("LiveActivityManager.updateLiveActivity() with new ContentState")
    await liveActivity.update(ActivityContent(state: contentState, staleDate: nil))
}
Below is what my log looks like:
<<<<SWIPE LOCK SCREEN DOWN>>>>
DEBUG: LiveActivityManager.updateLiveActivity() with new ContentState
iOS: Updating content for activity 0A519263-1E46-4BB6-BA4F-F3DDBC081AB4
DEBUG: LiveActivityManager.updateLiveActivity() with new ContentState
iOS: Updating content for activity 0A519263-1E46-4BB6-BA4F-F3DDBC081AB4
<<<<PRESS LOCK BUTTON->Lock iPhone>>>>
INFO: --------protectedDataWillBecomeUnavailableNotification--------
DEBUG: LiveActivityManager.updateLiveActivity() with new ContentState
iOS: Updating content for activity 0A519263-1E46-4BB6-BA4F-F3DDBC081AB4
DEBUG: LiveActivityManager.updateLiveActivity() with new ContentState
DEBUG: LiveActivityManager.updateLiveActivity() with new ContentState
DEBUG: LiveActivityManager.updateLiveActivity() with new ContentState
<<<<LOOK AT & TAP LOCK SCREEN->Unlock iPhone without swiping up>>>>
INFO: --------protectedDataDidBecomeAvailableNotification-----------
DEBUG: LiveActivityManager.updateLiveActivity() with new ContentState
DEBUG: LiveActivityManager.updateLiveActivity() with new ContentState
DEBUG: LiveActivityManager.updateLiveActivity() with new ContentState
As shown in the log, iOS normally updates the content for my activity after each liveActivity.update request.
This works fine in the Dynamic Island, and also after switching apps and swiping down to see the lock screen without locking the phone.
However, once I lock the phone, iOS stops updating the Live Activity content, and doesn't resume updates until the app has regained the foreground at least once.
Has anyone else encountered this behavior? Is this a setting that I'm missing, or a bug?
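For completeness, my understanding is that passing a staleDate lets the system mark the content as outdated if updates stop arriving; a variant of the update call would look like this (a sketch using the same names as above; the 60-second interval is arbitrary):

Task {
    // Sketch: same update call, but with a staleDate so the system can mark
    // the content stale if no further update arrives in time.
    let content = ActivityContent(state: contentState,
                                  staleDate: Date().addingTimeInterval(60))
    await liveActivity.update(content)
}

This doesn't resume updates, but it would at least let the UI signal that its data is stale.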
I'm hoping someone can help me understand some unexpected behavior in a @MainActor function which internally creates a Task that calls a method on a background actor.
Normally, the function starts the task, the task runs to completion, and then the function finishes.
However, when the function is annotated @MainActor, the internal task appears to become detached and execute asynchronously, finishing after the @MainActor function returns.
The code below demonstrates this behavior in a playground:
actor SeparateActor {
    func actorFunc(_ str: String) {
        print("\tActorFunc(\(str))")
    }
}

class MyClass {
    var sa = SeparateActor()

    @MainActor func mainActorFunctionWithTask() {
        print("mainActorFunctionWithTask Start")
        Task {
            await self.sa.actorFunc("mainActorFunctionWithTask")
        }
        print("mainActorFunctionWithTask End")
    }

    func normalFuncWithTask() {
        print("normalFuncWithTask Start")
        Task {
            await self.sa.actorFunc("normalFuncWithTask")
        }
        print("normalFuncWithTask End")
    }
}

Task {
    let mc = MyClass()
    print("\nCalling normalFuncWithTask")
    mc.normalFuncWithTask()
    print("\nCalling mainActorFunctionWithTask")
    await mc.mainActorFunctionWithTask()
}
I would expect both normalFuncWithTask and mainActorFunctionWithTask to behave the same, with ActorFunc printing before the function's End line, but instead my @MainActor function completes before the task runs:
Calling normalFuncWithTask
normalFuncWithTask Start
ActorFunc(normalFuncWithTask)
normalFuncWithTask End
Calling mainActorFunctionWithTask
mainActorFunctionWithTask Start
mainActorFunctionWithTask End
ActorFunc(mainActorFunctionWithTask)
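The one workaround sketch I can offer restores the ordering I expected, assuming that an unstructured Task inherits the enclosing actor context: awaiting the task's .value makes the function suspend until the actor call completes. The return-value version of actorFunc below is mine, just to make the ordering observable:

```swift
import Foundation

actor SeparateActor {
    func actorFunc(_ str: String) -> String {
        "actorFunc(\(str))"
    }
}

final class MyClass {
    let sa = SeparateActor()

    // Hypothetical workaround: awaiting the unstructured Task's value suspends
    // this function until the actor call finishes, restoring the ordering
    // Start -> actorFunc -> End even on the main actor.
    @MainActor
    func mainActorFunctionAwaitingTask() async -> [String] {
        var events = ["Start"]
        let result = await Task {
            await self.sa.actorFunc("mainActorFunctionAwaitingTask")
        }.value
        events.append(result)
        events.append("End")
        return events
    }
}
```

Note that this changes the function's signature to async, which may not be acceptable in every call site.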
I'm testing my app in the Xcode 14 beta (released at WWDC22) on iOS 16, and it seems that AVSpeechSynthesisVoice is not working correctly.
The following code always returns an empty array:
AVSpeechSynthesisVoice.speechVoices()
Additionally, attempting to initialize AVSpeechSynthesisVoice returns nil for all of the following:
AVSpeechSynthesisVoice(language: AVSpeechSynthesisVoice.currentLanguageCode())
AVSpeechSynthesisVoice(language: "en")
AVSpeechSynthesisVoice(language: "en-US")
AVSpeechSynthesisVoice(identifier: AVSpeechSynthesisVoiceIdentifierAlex)
AVSpeechSynthesisVoice.speechVoices().first
I've noticed that XCTMemoryMetric & XCTCPUMetric seem to record empty or nonsensical data when running a UI test flow for an XCUIApplication.
I'm attempting to test the memory and CPU footprint for a SwiftUI iOS app using the following code:
func testBasicFlowMemory() throws {
    let app = XCUIApplication()
    app.launch()

    var metrics: [XCTMetric] = []
    metrics.append(XCTClockMetric())
    metrics.append(XCTMemoryMetric(application: app))
    metrics.append(XCTCPUMetric(application: app))

    self.measure(metrics: metrics) {
        // Method which uses the XCUI API to exercise the app instance
        self.runBasicFlowTest(app: app)
    }
}
When I run the test above, I notice that runBasicFlowTest is executed 6 times (even though the metrics only record 5 values).
Of the three metrics I wanted to track, only XCTClockMetric returned meaningful data:
[Clock Monotonic Time, s] values: [114.728229, 114.944770, 121.813337, 116.394432, 117.491242]
XCTMemoryMetric mostly recorded 0.0 or nonsense data:
[Memory Physical, kB] values: [3596.288000, 0.000000, 0.000000, 0.000000, 0.000000]
[Memory Peak Physical, kB] values: [0.000000, 0.000000, 0.000000, 0.000000, 0.000000]
XCTCPUMetric likewise recorded 0.0 or nonsense data:
[CPU Instructions Retired, kI] values: [0.000000, 206223944.266000, 0.000000, 0.000000, 211895544.471000]
[CPU Cycles, kC] values: [0.000000, 252096240.472000, 0.000000, 0.000000, 257352232.305000],
[CPU Time, s] values: [0.000000, 86.585296, 0.000000, 0.000000, 0.000000]
I'm on Xcode Version 12.4 (12D4e), and my app is targeting iOS 14.4 on a simulated iPhone 11 Pro.
Has anyone had any luck using XCTMetrics with UI Tests?
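One comparison I can think of trying (API as I understand it): the parameterless metric initializers, which measure the test-runner process rather than the target app, to see whether the zeros come from attaching the metrics to the XCUIApplication:

// Hypothetical comparison test: XCTMemoryMetric() and XCTCPUMetric() with no
// application argument measure the test-runner process. If these report
// nonzero values, the zeros above likely come from the app attachment.
func testBasicFlowMemoryOfRunner() throws {
    let app = XCUIApplication()
    app.launch()
    self.measure(metrics: [XCTMemoryMetric(), XCTCPUMetric()]) {
        self.runBasicFlowTest(app: app)
    }
}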
Testing my app in an iOS 15.5 simulator with Xcode 14 beta, it crashed hard, locking up my entire laptop. After restarting, it seems that the iOS 15.5 simulator runtime disappeared, and I'm now getting the following error:
The com.apple.CoreSimulator.SimRuntime.iOS-15-5 simulator runtime is not available.
runtime profile not found
Download the com.apple.CoreSimulator.SimRuntime.iOS-15-5 simulator runtime from the Components section in Xcode's Preferences.
Unfortunately, Xcode seems to think that I already have the runtime, and no longer gives me the option to download it.
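In case it helps, the command-line steps I plan to try next (assuming these simctl subcommands behave the way I expect) to see what CoreSimulator thinks is installed and to clear out anything unavailable:

# List the runtimes CoreSimulator believes are installed
xcrun simctl runtime list

# Remove simulator devices whose runtimes are no longer available
xcrun simctl delete unavailable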
Has anyone figured out how to get SwiftUI to release the memory of a complex view struct which is not being shown?
I'm working on a SwiftUI app where I'd like to switch the root view, from a complex view (which uses a fair bit of memory), to a simpler one (which uses much less memory).
To illustrate this, I've created a basic SwiftUI 2.0 app below which starts with a complex root view and then uses the scene phase transition to switch to a simple "Hello World" view as the app moves into the background.
In simulation, the app starts out using 18.5 MB of memory to show the complex view. When I send the app to the background, the root view is switched to the simple view, which should lower the app's memory use. Instead, the memory footprint goes UP to 19.6 MB, which suggests that the complex view isn't being released from memory.
import SwiftUI

@main struct TestAppBackgroundingApp: App {
    @Environment(\.scenePhase) var scenePhase: ScenePhase
    @State var showLongList = true

    var body: some Scene {
        WindowGroup {
            // Change root view based on bool
            if showLongList {
                LongList()
            } else {
                SimpleView()
            }
        }
        .onChange(of: scenePhase) { newScenePhase in
            print("Scene Phase Transition: \(scenePhase) -- \(newScenePhase)")
            if scenePhase == .active && newScenePhase == .inactive {
                // Toggle root view as the app moves to the background
                print("Hiding long list")
                DispatchQueue.main.async {
                    self.showLongList.toggle()
                }
            }
        }
    }
}

struct SimpleView: View {
    var body: some View {
        Text("SimpleView").font(.largeTitle)
    }
}

struct LongList: View {
    var body: some View {
        VStack {
            Text("Long List").font(.largeTitle)
            List {
                ForEach(1...20_000, id: \.self) { i in
                    HStack {
                        Image(systemName: "gift.circle.fill")
                            .renderingMode(.original)
                            .font(.largeTitle)
                        Text("Row#\(i)").bold()
                    }
                }
            }
        } // VStack
    } // body
}
I'm hoping that someone could point me towards some concrete steps for reducing the memory footprint of a SwiftUI app as it moves to the background.
I've read Preparing Your UI to Run in the Background (https://developer.apple.com/documentation/uikit/app_and_environment/scenes/preparing_your_ui_to_run_in_the_background); however, it is written for UIKit, and I'm not sure how to translate it into SwiftUI.
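The closest SwiftUI translation I can come up with (a sketch; I have not confirmed it actually releases the memory) listens for the UIKit background notification via Combine's NotificationCenter publisher and swaps the view there:

// Sketch: swap the root content when UIKit posts its didEnterBackground
// notification, instead of relying on scenePhase. Uses the SimpleView and
// LongList types from the example above.
struct RootView: View {
    @State private var showLongList = true

    var body: some View {
        Group {
            if showLongList {
                LongList()
            } else {
                SimpleView()
            }
        }
        .onReceive(NotificationCenter.default.publisher(
            for: UIApplication.didEnterBackgroundNotification)) { _ in
            showLongList = false
        }
    }
}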
I'm working on an app that generates audio, and I would like it to continue doing so if the user locks their phone or opens another app. My app usually keeps working in the background, but seems to stop when I open a more demanding app in the foreground.
In simulator and device testing, I notice that my application memory remains constant as the app moves to and from the foreground. (I also note that my app's memory is above the 50 MB threshold recommended in WWDC20 session 10078.)
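Since the 50 MB threshold seems relevant, one thing I plan to try (DispatchSource API as I understand it; the cache-dropping step is a placeholder) is reacting to memory-pressure events so buffers can be released before the system terminates the backgrounded app:

// Sketch: observe system memory pressure so caches and audio buffers can
// be dropped before the system terminates the (backgrounded) app.
let source = DispatchSource.makeMemoryPressureSource(
    eventMask: [.warning, .critical],
    queue: .main
)
source.setEventHandler {
    print("Memory pressure event: \(source.data)")
    // Placeholder: release caches / trim audio buffers here.
}
source.resume()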