Posts

Post not yet marked as solved
1 Reply
1.9k Views
I've been wrestling with a problem related to AVAudioEngine. I've posted a couple of questions recently to other forums, but haven't had much luck, so I'm guessing not many people are encountering this, or the questions are unclear, or perhaps I'm not asking in the most appropriate subforums. So, I thought I'd try here in the 'Concurrency' forum, as using concurrency would be one way to solve the problem.

The specific problem is that AVAudioPlayerNode.play() takes a long time to execute. The execution time seems to be proportional to the value of AVAudioSession.ioBufferDuration, and can range from a few milliseconds at low buffer durations to over 20 milliseconds at the default buffer duration. These execution times can be an issue in real-time applications such as games.

An obvious solution would be to move such operations to a background thread using GCD, and I've seen various posts and articles that do this. Here's a code example showing what I mean:

import AVFoundation
import UIKit

class ViewController: UIViewController {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let queue = DispatchQueue(label: "", qos: .userInitiated)

    override func viewDidLoad() {
        super.viewDidLoad()
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: nil)
        try! engine.start()
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        queue.async {
            if self.player.isPlaying {
                self.player.stop()
            } else {
                self.player.play()
            }
        }
    }
}

In this scenario, all AVAudioEngine-related operations would be serial, never concurrent or parallel. The audio system would never be accessed simultaneously from multiple threads, only serially.

My concern is that I don't know whether it's safe to use AVAudioEngine in this way. More generally, I'm not sure what should be assumed about any API for which nothing specific is said about thread safety. In such cases, can it be assumed that access from multiple threads is safe as long as only one thread is active at any given time? (The 'Thread Programming Guide' touches on this, but doesn't appear to address audio frameworks specifically.)

The narrowest version of my question is whether it's safe to use GCD with AVAudioEngine, provided all access is serial. The broader question is what assumptions should or should not be made about APIs and thread safety when the documentation doesn't specifically address it.

Any input on either of these issues would be greatly appreciated.
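In case anyone wants to reproduce the timing behavior, here's a minimal measurement sketch (my own test harness, not from Apple's documentation; the 5 ms buffer duration is just an example value to vary):

import AVFoundation
import QuartzCore

// Timing sketch: measure a single play() call at a requested I/O buffer
// duration. Vary the 0.005 s value to see how the execution time of
// play() tracks the buffer duration.
func measurePlayCall() {
    let session = AVAudioSession.sharedInstance()
    try? session.setCategory(.playback)
    try? session.setPreferredIOBufferDuration(0.005)
    try? session.setActive(true)

    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: nil)
    try? engine.start()

    let start = CACurrentMediaTime()
    player.play()
    print("play() took \((CACurrentMediaTime() - start) * 1000.0) ms")
}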
Posted by Jesse1.
Post not yet marked as solved
0 Replies
966 Views
Using AVAudioPlayerNode, you can schedule multiple buffers and files, and they will (or at least should) play in sequence. However, I'm finding that some sequences don't work. If you schedule a file after two or more buffers, the file doesn't play. Other combinations work: for example, scheduling a file after only one buffer works, as does scheduling buffers after files, and so on.

Here's a self-contained example (except for the audio file) that demonstrates the behavior (it can be pasted over the contents of the 'ViewController.swift' file in the Xcode iOS 'App' template):

import AVFoundation
import UIKit

class ViewController: UIViewController {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private var file: AVAudioFile! = nil
    private var buffer: AVAudioPCMBuffer! = nil

    override func viewDidLoad() {
        super.viewDidLoad()

        let url = Bundle.main.resourceURL!.appendingPathComponent("audio.m4a")

        file = try! AVAudioFile(forReading: url)
        buffer = AVAudioPCMBuffer(
            pcmFormat: file.processingFormat,
            frameCapacity: AVAudioFrameCount(file.length)
        )

        try! file.read(into: buffer)

        engine.attach(player)
        engine.connect(
            player, to: engine.mainMixerNode, format: file.processingFormat
        )

        try! engine.start()

        player.play()
        player.scheduleBuffer(buffer)
        player.scheduleBuffer(buffer)
        player.scheduleFile(file, at: nil)
    }
}

It seems like this should play the sound three times, but it only plays it twice. Is this a bug? Or is there something about the API I'm not understanding?
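In case it's useful to anyone hitting the same thing, the workaround I'm currently using (a sketch only, not a confirmed fix) is to avoid the buffers-then-file sequence entirely. Since the file has already been read into a buffer above, the same audio can be scheduled as a third buffer:

// Workaround sketch: schedule the file's contents as a buffer instead of
// scheduling the file after two buffers (the sequence that fails above).
player.play()
player.scheduleBuffer(buffer)
player.scheduleBuffer(buffer)
player.scheduleBuffer(buffer) // replaces player.scheduleFile(file, at: nil)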
Posted by Jesse1.
Post marked as solved
4 Replies
4.6k Views
I have a launch screen storyboard that's simply a single solid color. Once the app launches, I begin rendering with Metal, clearing the screen using the same color. The goal is for the colors to match exactly between the launch screen and Metal. However, the launch screen color that's actually rendered on the device is different from what's specified in the storyboard, causing the color to differ between the launch screen and Metal.

As an example, say the color is (102, 255, 0). If I use this as the clear color for Metal, take a screenshot on the device, and check the color in an image editing program, I get (102, 255, 0), as desired. However, the same color specified in the storyboard using the sRGB color profile yields an actual color in the screenshot of (101, 254, 0). Obviously the difference is small, but nevertheless I'd like the values to match exactly.

I understand that iOS's goal here is consistent perceived color, not necessarily preserving the exact numerical color values. In this case, though, I'm not particularly concerned with differences in perceived color between devices; I just want the color to match exactly between the launch screen and Metal. I've tried a number of other profiles (Device RGB, Generic RGB, etc.), but they all result in the actual rendered color values differing from those specified in Interface Builder. (This is on an iPad Mini 2 running 12.1.4.)

What I'm looking for is a way to specify a background color for the launch screen storyboard in Interface Builder such that the exact numerical color values are preserved without alteration. Is there a way to do this? Is there a color profile option I've missed that simply conveys the color values exactly without modifying them?
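For completeness, here's a sketch of the Metal side of the comparison (my own setup, assuming an MTKView left at its default .bgra8Unorm color pixel format, where the clear color components are written to the framebuffer without color matching):

import MetalKit

// Sketch: clear an MTKView to (102, 255, 0). With the default
// .bgra8Unorm pixel format, these component values are stored in the
// framebuffer as-is, which is why the Metal screenshot reads back
// (102, 255, 0) exactly.
let metalView = MTKView(frame: .zero, device: MTLCreateSystemDefaultDevice())
metalView.clearColor = MTLClearColor(
    red: 102.0 / 255.0,
    green: 255.0 / 255.0,
    blue: 0.0 / 255.0,
    alpha: 1.0
)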
Posted by Jesse1.
Post not yet marked as solved
1 Reply
453 Views
In the general settings for an Xcode project, you can set 'Status Bar Style' to Default, Dark Content, or Light Content. If I'm not mistaken, whatever you choose there should be applied during the launch screen. For example, if you select Light Content, the status bar should be light during the launch screen, irrespective of whether the system appearance is 'Light' or 'Dark'.

This seems to work correctly on iPhones, but on iPads, the selected status bar style doesn't seem to be applied during the launch screen (most of the time, at least - occasionally it works correctly). For example, if the system appearance is 'Light' and you set the status bar style to Light Content, on iPads you'll get a dark status bar, contrary to the selection. Testing on different iOS/iPadOS versions suggests this behavior is new as of 14.0 (on 13.5, the status bar style is applied correctly).

Is this a UIKit bug? Or does it sound like user error? Can anyone confirm or disconfirm this behavior?
Posted by Jesse1.
Post not yet marked as solved
0 Replies
683 Views
If I create a project from the iOS 'Game' template and build it, I get the following warning: "A launch configuration or launch storyboard or xib must be provided unless the app requires full screen."

The short version of my question is: assuming you're not using SwiftUI, isn't a launch storyboard more or less required irrespective of whether the app requires fullscreen? And why is the launch storyboard missing in the Xcode game template?

In case it's helpful, here's some more detail. I'll admit I'm not sure what 'launch configuration' refers to in the warning message. In any case, Apple says (https://developer.apple.com/design/human-interface-guidelines/ios/visual-design/launch-screen/) every app should have a launch screen, and recommends that a storyboard be used for this (again, assuming no SwiftUI). But the way the warning is worded seems to suggest that a fullscreen app doesn't need a launch storyboard. Why would that be? Is the warning worded incorrectly?

Apple also says (https://developer.apple.com/documentation/uikit/app_and_environment/responding_to_the_launch_of_your_app) that "Xcode projects automatically include a default launch storyboard for you to customize". So why doesn't the game template include one? Is it an error in the template?

As far as I can tell, the problem is easily fixed by creating a launch screen storyboard file and an appropriate 'Launch screen interface file base name' plist entry. But I'd like to understand better what the warning means and why the game template is configured the way it is. Thanks for any help.
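For reference, here's what that plist fix looks like in source form ('Launch screen interface file base name' is the display name for the raw UILaunchStoryboardName key; the 'LaunchScreen' value assumes the new storyboard file is named LaunchScreen.storyboard):

<key>UILaunchStoryboardName</key>
<string>LaunchScreen</string>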
Posted by Jesse1.
Post not yet marked as solved
0 Replies
919 Views
I'm using Xcode 11.2.1 and macOS 10.15.1. Using the following self-contained example code (this replaces the contents of 'ViewController.swift' in Xcode's 'Single View App' template):

import UIKit

class CustomThread: Thread {
    override func main() {
        while true {
            NSLog("test")
        }
    }
}

class ViewController: UIViewController {
    private let thread0 = CustomThread()
    private let thread1 = CustomThread()

    override func viewDidLoad() {
        super.viewDidLoad()
        thread0.start()
        thread1.start()
    }
}

I occasionally get output like this in the Xcode console:

20192019-12-08 14:39:42.525873-0600 my_app[80118:12471303] test
-12-08 14:39:42.525186-0600 my_app[80118:12471302] test

wherein messages logged from different threads are interleaved with one another. This is on the simulator. I also tried it with a device running iPadOS 13.2.3, but wasn't able to reproduce the behavior there. If I use print() instead of NSLog(), I don't see any interleaving.

The documentation for print() doesn't appear to address thread safety. In the Thread Programming Guide, NSLog() is listed as being 'generally considered to be thread-safe'. In the documentation for NSLogv(), it says:

"Output from NSLogv is serialized, in that only one thread in a process can be doing the writing/logging described above at a time. All attempts at writing/logging a message complete before the next thread can begin its attempts.

The effects of NSLogv are not serialized with subsystems other than those discussed above (such as the standard I/O package) and do not produce side effects on those subsystems (such as causing buffered output to be flushed, which may be undesirable)."

I'm not entirely clear as to what the second paragraph quoted above means, but I'm wondering if NSLog() being serialized and thread-safe doesn't necessarily mean its output is guaranteed to be atomic in the Xcode console.

Is the interleaving in the Xcode console expected behavior? If so, what's the recommended way to log atomically to the Xcode console? Should I use a lock? Or is print() guaranteed to be atomic and therefore preferable?
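In case it helps frame the question, here's the kind of lock-based approach I have in mind (a sketch of my own, not a pattern from the docs):

import Foundation

// Sketch: serialize console output with a lock so each message is written
// atomically. This only helps if every part of the program logs through
// this one object; anything that bypasses it can still interleave.
final class AtomicLogger {
    static let shared = AtomicLogger()
    private let lock = NSLock()

    func log(_ message: String) {
        lock.lock()
        defer { lock.unlock() }
        print(message)
    }
}

// Usage from any thread: AtomicLogger.shared.log("test")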
Posted by Jesse1.
Post not yet marked as solved
1 Reply
484 Views
I understand that apps should launch quickly and spend as little time as is practical in functions like application(_:didFinishLaunchingWithOptions:) and viewDidLoad(). Not counting audio setup, the amount of synchronous work I'm doing during launch is probably less than 100 milliseconds, which seems acceptable (although maybe I'm wrong and even that's too much).

However, I'm encountering a problem with audio setup due to the AVAudioPlayerNode.play() function. AVAudioPlayerNode.play() takes a long time to execute; it depends on the environment of course, but it seems to take roughly 10-30 milliseconds per call. If you have many AVAudioPlayerNode instances, this adds up. On my test device (an iPad Mini 2 running iOS 12.4.2), starting 16 players takes around 370 milliseconds. This is an old device of course, but on the simulator it still takes around 200 milliseconds. My computer is fairly old as well, but other AVAudioEngine-related operations seem to perform fine; it seems to be specifically AVAudioPlayerNode.play() that's the problem.

(For those who might not be familiar with AVAudioEngine, AVAudioPlayerNode.play() isn't a function you call every time you want to play a sound - it simply 'activates' the player, and therefore only needs to be called during initial setup and in a few other isolated cases.)

For clarity, here's some example code showing what I'm talking about:

import AVFoundation
import QuartzCore
import UIKit

class ViewController: UIViewController {
    private let engine = AVAudioEngine()
    private let players = (0 ..< 16).map { _ in AVAudioPlayerNode() }

    override func viewDidLoad() {
        super.viewDidLoad()

        let format = AVAudioFormat(standardFormatWithSampleRate: 44100.0, channels: 2)

        for player in players {
            engine.attach(player)
            engine.connect(player, to: engine.mainMixerNode, format: format)
        }

        try! engine.start()

        let startTime = CACurrentMediaTime()

        for player in players {
            player.play()
        }

        print("\(CACurrentMediaTime() - startTime)")
    }
}

In my app, the time it takes to start the players, plus the other work that needs to be done, could add up to around 400 milliseconds of synchronous work during launch.

- Can I get away with doing 400 milliseconds of synchronous work during launch? Or is that too much?
- If it's acceptable to do this much work, would it be best to do it in application(_:didFinishLaunchingWithOptions:), viewDidLoad(), or in some other location?
- Is there any other straightforward way I could address the AVAudioPlayerNode.play() execution time problem? (There might be heavyweight solutions, like moving the entire audio system to a separate thread, but I'm trying to avoid such complexities if I can.)
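One idea I've considered (only a sketch - whether serialized background-queue access to AVAudioEngine is actually safe is the open question from my other post) is to defer the play() calls so they don't block launch. Continuing from the example above, the synchronous play() loop would become:

// Sketch: start the players on a serial background queue after the
// engine is running, instead of synchronously in viewDidLoad().
let audioQueue = DispatchQueue(label: "audio-setup", qos: .userInitiated)
audioQueue.async {
    for player in self.players {
        player.play()
    }
}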
Posted by Jesse1.
Post not yet marked as solved
0 Replies
771 Views
I'd like to move some operations involving AVAudioPlayer - and possibly AVAudioEngine, although that's less important - off the main thread using a serial dispatch queue or similar mechanism. Here's a simple self-contained example of what I'm talking about:

import AVFoundation
import UIKit

class ViewController: UIViewController {
    let player = try! AVAudioPlayer(
        contentsOf: Bundle.main.resourceURL!.appendingPathComponent("audio.m4a")
    )
    let queue = DispatchQueue(label: "", qos: .userInteractive)

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        queue.async {
            if self.player.isPlaying {
                self.player.stop()
                self.player.currentTime = 0.0
            } else {
                self.player.play()
            }
        }
    }
}

In this example, an AVAudioPlayer instance is created on the main thread, and then subsequent operations are performed on other threads using a serial queue. Is it safe to use AVAudioPlayer, or related tools like AVAudioEngine, in this way? Again, all operations on these objects would be serial, never concurrent.
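For comparison, here's a stricter variant I've been considering (a sketch under the assumption that serial confinement is what matters): create the player on the queue as well, so the object is never touched from more than one thread.

import AVFoundation

// Sketch: confine an AVAudioPlayer entirely to one serial queue,
// including its creation, so no two threads ever access it.
final class ConfinedPlayer {
    private let queue = DispatchQueue(label: "audio", qos: .userInteractive)
    private var player: AVAudioPlayer?

    init(url: URL) {
        queue.async {
            self.player = try? AVAudioPlayer(contentsOf: url)
        }
    }

    func toggle() {
        queue.async {
            guard let player = self.player else { return }
            if player.isPlaying {
                player.stop()
                player.currentTime = 0.0
            } else {
                player.play()
            }
        }
    }
}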
Posted by Jesse1.