
AVAudioPlayerNode: File not played if scheduled after two or more buffers
Using AVAudioPlayerNode, you can schedule multiple buffers and files, and they will (or at least should) play in sequence. However, I'm finding that some sequences don't work. If you schedule a file after two or more buffers, the file doesn't play. Other combinations work: for example, scheduling a file after only one buffer works, as does scheduling buffers after files, and so on.

Here's a self-contained example (except for the audio file) that demonstrates the behavior (it can be pasted over the contents of the 'ViewController.swift' file in the Xcode iOS 'App' template):

import AVFoundation
import UIKit

class ViewController: UIViewController {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private var file: AVAudioFile! = nil
    private var buffer: AVAudioPCMBuffer! = nil

    override func viewDidLoad() {
        super.viewDidLoad()

        // Load the audio file and read it into a buffer.
        let url = Bundle.main.resourceURL!.appendingPathComponent("audio.m4a")
        file = try! AVAudioFile(forReading: url)
        buffer = AVAudioPCMBuffer(
            pcmFormat: file.processingFormat,
            frameCapacity: AVAudioFrameCount(file.length)
        )
        try! file.read(into: buffer)

        // Set up and start the engine.
        engine.attach(player)
        engine.connect(
            player, to: engine.mainMixerNode, format: file.processingFormat
        )
        try! engine.start()

        // Schedule two buffers followed by a file.
        player.play()
        player.scheduleBuffer(buffer)
        player.scheduleBuffer(buffer)
        player.scheduleFile(file, at: nil)
    }
}

It seems like this should play the sound three times, but it only plays it twice. Is this a bug? Or is there something about the API I'm not understanding?
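
For what it's worth, one workaround I've been experimenting with (just a sketch; I haven't confirmed it addresses the underlying issue) is to defer scheduling the file until the second buffer's completion handler fires, so the file is never queued behind two pending buffers at once:

// Sketch of a possible workaround (unverified): schedule the file from the
// second buffer's completion handler instead of queuing it up front.
player.play()
player.scheduleBuffer(buffer)
player.scheduleBuffer(buffer) { [weak self] in
    // Called on an arbitrary thread once the buffer has been consumed.
    guard let self = self else { return }
    self.player.scheduleFile(self.file, at: nil)
}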
Replies: 0 · Boosts: 0 · Views: 1.1k · Activity: Nov ’20
Status bar style not applied during launch on iPad - UIKit bug?
In the general settings for an Xcode project, you can set 'Status Bar Style' to Default, Dark Content, or Light Content. If I'm not mistaken, whatever you choose there should be applied during the launch screen. For example, if you select Light Content, the status bar should be light during the launch screen, irrespective of whether the system appearance is 'Light' or 'Dark'.

This seems to work correctly on iPhones, but on iPads the selected status bar style doesn't seem to be applied during the launch screen most of the time (occasionally it works correctly). For example, if the system appearance is 'Light' and you set the status bar style to Light Content, on iPads you'll get a dark status bar, contrary to the selection. Testing on different iOS/iPadOS versions suggests this behavior is new as of 14.0; on 13.5, the status bar style is applied correctly.

Is this a UIKit bug, or does it sound like user error? Can anyone confirm or disconfirm this behavior?
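
In case it's relevant: once the app is actually running, I can control the style per view controller with the usual override (sketch below, assuming 'View controller-based status bar appearance' is enabled in Info.plist). The issue is only the system-rendered launch screen, which can't run code, so this doesn't help there:

// Per-view-controller status bar style once the app is running (sketch).
// Requires 'View controller-based status bar appearance' = YES in Info.plist.
class ViewController: UIViewController {
    override var preferredStatusBarStyle: UIStatusBarStyle {
        return .lightContent
    }
}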
Replies: 1 · Boosts: 0 · Views: 522 · Activity: Oct ’20
Xcode iOS game template missing launch storyboard?
If I create a project from the iOS 'Game' template and build it, I get the following warning: "A launch configuration or launch storyboard or xib must be provided unless the app requires full screen."

The short version of my question is: assuming you're not using SwiftUI, isn't a launch storyboard more or less required irrespective of whether the app requires full screen? And why is the launch storyboard missing from the Xcode game template?

In case it's helpful, here's some more detail. I'll admit I'm not sure what 'launch configuration' refers to in the warning message. In any case, the Human Interface Guidelines (https://developer.apple.com/design/human-interface-guidelines/ios/visual-design/launch-screen/) say every app should have a launch screen, and recommend that a storyboard be used for this (again, assuming no SwiftUI). But the way the warning is worded seems to suggest that a full-screen app doesn't need a launch storyboard. Why would that be? Is the warning worded incorrectly? Apple's documentation (https://developer.apple.com/documentation/uikit/app_and_environment/responding_to_the_launch_of_your_app) also says that "Xcode projects automatically include a default launch storyboard for you to customize". So why doesn't the game template include one? Is it an error in the template?

As far as I can tell, the problem is easily fixed by creating a launch screen storyboard file and an appropriate 'Launch screen interface file base name' plist entry (see the entry sketched below). But I'd like to understand better what the warning means and why the game template is configured the way it is. Thanks for any help.
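
For reference, the plist side of that fix is just the following entry ('Launch screen interface file base name' corresponds to the UILaunchStoryboardName key; 'LaunchScreen' here is simply whatever the storyboard file is named):

<key>UILaunchStoryboardName</key>
<string>LaunchScreen</string>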
Replies: 0 · Boosts: 0 · Views: 785 · Activity: Oct ’20
NSLog() producing interleaved output in Xcode console in a multithreaded context
I'm using Xcode 11.2.1 and macOS 10.15.1, with the following self-contained example code (this replaces the contents of 'ViewController.swift' in Xcode's 'Single View App' template):

import UIKit

class CustomThread: Thread {
    override func main() {
        while true {
            NSLog("test")
        }
    }
}

class ViewController: UIViewController {
    private let thread0 = CustomThread()
    private let thread1 = CustomThread()

    override func viewDidLoad() {
        super.viewDidLoad()
        thread0.start()
        thread1.start()
    }
}

I occasionally get output like this in the Xcode console:

20192019-12-08 14:39:42.525873-0600 my_app[80118:12471303] test
-12-08 14:39:42.525186-0600 my_app[80118:12471302] test

Wherein messages logged from different threads are interleaved with one another. This is on the simulator. I also tried it with a device running iPadOS 13.2.3, but wasn't able to reproduce the behavior there. If I use print() instead of NSLog(), I don't see any interleaving.

The documentation for print() doesn't appear to address thread safety. In the Thread Programming Guide, NSLog() is listed as being 'generally considered to be thread-safe'. The documentation for NSLogv() says:

"Output from NSLogv is serialized, in that only one thread in a process can be doing the writing/logging described above at a time. All attempts at writing/logging a message complete before the next thread can begin its attempts.

The effects of NSLogv are not serialized with subsystems other than those discussed above (such as the standard I/O package) and do not produce side effects on those subsystems (such as causing buffered output to be flushed, which may be undesirable)."

I'm not entirely clear as to what the second paragraph quoted above means, but I'm wondering whether NSLog() being serialized and thread-safe necessarily means its output is guaranteed to be atomic in the Xcode console.

Is the interleaving in the Xcode console expected behavior? If so, what's the recommended way to log atomically to the Xcode console? Should I use a lock? Or is print() guaranteed to be atomic and therefore preferable?
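
In case it clarifies what I mean by 'use a lock', here's roughly what I had in mind (a sketch; I don't know whether it's actually necessary, or whether it guarantees atomicity in the Xcode console): funnel all output through one serial queue so each message is emitted as a unit, regardless of which thread produced it.

import Foundation

// Sketch of the 'use a lock' idea: serialize all console output through a
// single queue so messages from different threads can't interleave.
enum Log {
    private static let queue = DispatchQueue(label: "logging")

    static func write(_ message: String) {
        queue.sync {
            print(message)
        }
    }
}

// e.g., in CustomThread.main(): Log.write("test")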
Replies: 1 · Boosts: 0 · Views: 1.1k · Activity: Dec ’19
AVAudioEngine thread-safety
I've been wrestling with a problem related to AVAudioEngine. I've posted a couple of questions recently to other forums, but haven't had much luck, so I'm guessing not many people are encountering this, or the questions are unclear, or perhaps I'm not asking in the most appropriate subforums. So, I thought I'd try here in the 'Concurrency' forum, as using concurrency would be one way to solve the problem.

The specific problem is that AVAudioPlayerNode.play() takes a long time to execute. The execution time seems to be proportional to the value of AVAudioSession.ioBufferDuration, and can range from a few milliseconds at low buffer durations to over 20 milliseconds at the default buffer duration. These execution times can be an issue in real-time applications such as games.

An obvious solution would be to move such operations to a background thread using GCD, and I've seen various posts and articles that do this. Here's a code example showing what I mean:

import AVFoundation
import UIKit

class ViewController: UIViewController {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let queue = DispatchQueue(label: "", qos: .userInitiated)

    override func viewDidLoad() {
        super.viewDidLoad()
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: nil)
        try! engine.start()
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        queue.async {
            if self.player.isPlaying {
                self.player.stop()
            } else {
                self.player.play()
            }
        }
    }
}

In this scenario, all AVAudioEngine-related operations would be serial and never concurrent/parallel. The audio system would never be accessed simultaneously from multiple threads, only serially.

My concern is that I don't know whether it's safe to use AVAudioEngine in this way. More generally, I'm not sure what should be assumed about any API for which nothing specific is said about thread safety. In such cases, can it be assumed that access from multiple threads is safe as long as only one thread is active at any given time? (The 'Thread Programming Guide' touches on this, but doesn't appear to address audio frameworks specifically.)

The narrowest version of my question is whether it's safe to use GCD with AVAudioEngine, provided all access is serial. The broader question is what assumptions should or should not be made about APIs and thread safety when the documentation doesn't specifically address them.

Any input on either of these issues would be greatly appreciated.
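
To make the serial-access assumption airtight, one variation I've considered (again, just a sketch, and exactly the thing I'm unsure is safe) is to route the setup calls through the same queue as play()/stop(), so every AVAudioEngine call happens on that one queue:

override func viewDidLoad() {
    super.viewDidLoad()
    // Sketch: perform setup on the same serial queue used for play()/stop(),
    // so the engine is never touched from two threads at once.
    queue.async {
        self.engine.attach(self.player)
        self.engine.connect(self.player, to: self.engine.mainMixerNode, format: nil)
        try! self.engine.start()
    }
}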
Replies: 1 · Boosts: 2 · Views: 2.3k · Activity: Sep ’19
Matching colors between launch screen storyboard and Metal
I have a launch screen storyboard that's simply a single solid color. Once the app launches, I begin rendering with Metal, clearing the screen using the same color. The goal is for the colors to match exactly between the launch screen and Metal. However, the launch screen color that's actually rendered on the device is different from what's specified in the storyboard, causing the color to differ between the launch screen and Metal.

As an example, say the color is:

102 255 0

If I use this as the clear color for Metal, take a screenshot on the device, and check the color in an image editing program, I get (102 255 0), as desired. However, the same color specified in the storyboard using the sRGB color profile yields an actual color in the screenshot of (101 254 0). Obviously the difference is small, but nevertheless I'd like the values to match exactly.

I understand that iOS's goal here is consistent perceived color and not necessarily preserving the exact numerical color values. In this case though, I'm not particularly concerned with differences in perceived color between devices. I just want the color to match exactly between the launch screen and Metal.

I've tried a number of other profiles (Device RGB, Generic RGB, etc.), but they all result in the actual rendered color values being different from those specified in Interface Builder. (This is on an iPad Mini 2 running 12.1.4.)

What I'm looking for is to be able to specify a background color for the launch screen storyboard in Interface Builder such that the exact numerical color values are preserved without alteration. Is there a way to do this? Is there a color profile option I've missed that simply conveys the color values exactly without modifying them?
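
For reference, this is roughly how the Metal side of the comparison is set up (a sketch, assuming an MTKView with the default .bgra8Unorm pixel format, where the clear color components are quantized directly to 0-255 without any sRGB encoding, which is why the screenshot shows exactly 102 255 0):

import MetalKit

// Metal side of the comparison (sketch): clear to 102 255 0.
func configureClearColor(of view: MTKView) {
    view.colorPixelFormat = .bgra8Unorm
    view.clearColor = MTLClearColor(
        red: 102.0 / 255.0,
        green: 255.0 / 255.0,
        blue: 0.0 / 255.0,
        alpha: 1.0
    )
}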
Replies: 4 · Boosts: 0 · Views: 5.0k · Activity: Mar ’19