
How does Swift's #if tag work?
I've written some code that can be compiled differently depending on a compilation condition, using:

    #if ***
    ...
    #else
    ...
    #endif

I then set up two different project targets. In one target, under Swift Compiler - Custom Flags / Active Compilation Conditions, I define ***; in the other, I don't. Using the two project targets, I can compile the program two different ways.

However, if I introduce an error into a section of code that should be ignored, Xcode reports an error and won't compile. Does the compiler truly ignore code in a failed #if block, or does the code end up in the compiled binary with a runtime check so that it doesn't run?
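For illustration, a minimal sketch of the pattern being described, with a hypothetical condition name FEATURE_X standing in for the elided flag:

    // FEATURE_X is a hypothetical flag, defined for one target under
    // Swift Compiler - Custom Flags / Active Compilation Conditions.
    #if FEATURE_X
    func buildVariant() -> String {
        return "feature build"
    }
    #else
    func buildVariant() -> String {
        return "standard build"
    }
    #endif

One detail worth noting: unlike the C preprocessor, Swift still parses inactive #if branches (they must be syntactically valid, even though they aren't type-checked or emitted), which would explain why a syntax error in the "ignored" branch still stops the build.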
Replies: 1 · Boosts: 0 · Views: 2.0k · Apr ’22
How do you free an AudioBufferList?
In the AudioBufferList extension, there is a comment above the allocate function:

    /// The memory should be freed with `free()`.
    public static func allocate(maximumBuffers: Int) -> UnsafeMutableAudioBufferListPointer

But when I try to call free on the returned pointer,

    free(buffer)

Xcode complains:

    Cannot convert value of type 'UnsafeMutableAudioBufferListPointer' to expected argument type 'UnsafeMutableRawPointer?'

How should the pointer be freed? I tried

    free(&buffer)

Xcode didn't complain, but when I ran the code, I got an error in the console:

    malloc: *** error for object 0x16fdfee70: pointer being freed was not allocated

I know the call to allocate was successful. Thanks, Mark
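A sketch of one plausible fix, on the assumption that it is the wrapper's underlying allocation that has to be released: UnsafeMutableAudioBufferListPointer exposes the raw AudioBufferList pointer through its unsafeMutablePointer property, and that is a value free() will accept.

    import AudioToolbox
    import Darwin

    let bufferList = AudioBufferList.allocate(maximumBuffers: 2)
    // ... use bufferList ...

    // Free the underlying AudioBufferList allocation rather than the
    // UnsafeMutableAudioBufferListPointer wrapper itself.
    free(bufferList.unsafeMutablePointer)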
Replies: 1 · Boosts: 0 · Views: 1.5k · Apr ’22
How to handle different sample rates in AVAudioEngine?
This is on an M1 Mac mini with macOS Monterey. I am trying to write an audio network using AVAudioEngine as opposed to AUGraph (which I understand is deprecated in favor of AVAudioEngine). My code works properly with AUGraph.

The input is a microphone with a sample rate of 8 kHz. In the render proc, the data is written to a ring buffer. Debugging shows that the render proc is called every 0.064 seconds and writes 512 samples (8000 × 0.064 = 512).

The program creates an AVAudioSourceNode. The render block for that node pulls data from the above ring buffer. But debugging shows that it is trying to take 512 samples about every 0.0107 seconds. That works out to 48,000 samples per second, which is the output device sample rate. Obviously the ring buffer can't keep up. In the statement connecting the source node to the engine's mixer node, I specify (at least I think I am) a sample rate of 8000, but it still seems to be running at 48000:

    let inputFormat = AVAudioFormat(
        commonFormat: outputFormat.commonFormat,
        sampleRate: 8000,
        channels: 1,
        interleaved: outputFormat.isInterleaved)

    engine.connect(srcNode, to: mixerNode, fromBus: 0, toBus: 0, format: inputFormat)

Also, looking at the microphone input using Audio MIDI Setup shows that the microphone format is 8000 Hz, 1 channel, 16-bit integer, but when I examine the input format of the node it is reported as 8000 Hz, 1 channel, 32-bit float. The input node is using HAL. Obviously, somewhere in the internals of the node the samples are being converted from 16-bit ints to 32-bit floats. Is there a way to also have the sample rate changed? Am I doing this wrong? The HAL node was used with AUGraph. Is there a different node that should be used with AVAudioEngine? I see that AVAudioEngine has an input node, but it seems that if I connect it to the microphone, the input goes straight to the hardware output without going through the mixer node (where I want to mix in other audio sources).

The original AUGraph code was modeled after the code in "Learning Core Audio" by Adamson & Avila, which, although it is old (pre-dating Swift and AVAudioEngine), is the only detailed reference on Core Audio that I have been able to find. Is there a newer reference? Thanks, Mark
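One approach that may be worth trying (a sketch, untested, and the pull-rate behavior is my assumption): give the AVAudioSourceNode the 8 kHz format at creation time, so its render block is asked for frames at the input rate and the connection to the mixer performs the rate conversion downstream.

    import AVFoundation

    let engine = AVAudioEngine()

    // 8 kHz mono float, matching what the ring buffer holds.
    let inputFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                    sampleRate: 8000,
                                    channels: 1,
                                    interleaved: false)!

    // Creating the source node *with* this format should mean the render
    // block is pulled at 8 kHz; the engine then converts to the output
    // device rate on the way to the mixer.
    let srcNode = AVAudioSourceNode(format: inputFormat) { _, _, frameCount, audioBufferList in
        // Fill audioBufferList with frameCount frames from the ring buffer here.
        return noErr
    }

    engine.attach(srcNode)
    engine.connect(srcNode, to: engine.mainMixerNode, format: inputFormat)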
Replies: 1 · Boosts: 0 · Views: 1.5k · Apr ’22
Why is Apple demo code throwing -10878 (invalid parameter)?
I have downloaded the signal generator example code for WWDC 2019 session 510, "What's New in AVAudioEngine," at link. When I run it in Xcode 13.2 on macOS 12.3 on an M1 Mac mini, line 99,

    let mainMixer = engine.mainMixerNode

produces nine lines like this in the console output:

    2022-03-30 21:09:19.288011-0400 SignalGenerator[52247:995478] throwing -10878

(-10878 is kAudioUnitErr_InvalidParameter.) But the program seems to run as expected. Can this just be ignored, or does it indicate improper setup?
Replies: 0 · Boosts: 0 · Views: 677 · Mar ’22
Are the SDK .h files available in Xcode?
I'm teaching myself Core Audio programming on the Mac (I have a need). Sometimes the documentation refers to a .h file. Can those files be viewed in Xcode? If so, how do I find them? For example, the help page for AudioUnitParameter says this in the Overview:

    This data structure is used by functions declared in the AudioToolbox/AudioUnitUtilities.h header file in macOS.
Replies: 1 · Boosts: 0 · Views: 433 · Mar ’22
Determine Core Audio Parameter Type
I'm trying to figure out how to set the volume of a Core Audio audio unit. I found the parameter kHALOutputParam_Volume, but I can't find anything about it. I called AudioUnitGetPropertyInfo, which told me that the parameter is 4 bytes long and writeable. How can I find out whether it is an Int32, UInt32, Float32, or some other type, and what the acceptable values are and mean? I used AudioUnitGetProperty and read it as either Int32 (512) or Float32 (7.17e-43). Is there any documentation on this and other parameters?
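If kHALOutputParam_Volume behaves like other audio unit parameters (an assumption on my part, not confirmed by documentation), then it would be read with AudioUnitGetParameter rather than AudioUnitGetProperty, and its value is an AudioUnitParameterValue, which is a Float32. A sketch:

    import AudioToolbox

    // Sketch: `unit` is assumed to be an initialized output audio unit;
    // the Global scope is also an assumption.
    func readOutputVolume(of unit: AudioUnit) -> AudioUnitParameterValue? {
        var volume: AudioUnitParameterValue = 0   // AudioUnitParameterValue is a Float32
        let status = AudioUnitGetParameter(unit,
                                           kHALOutputParam_Volume,
                                           kAudioUnitScope_Global,
                                           0,        // element
                                           &volume)
        return status == noErr ? volume : nil
    }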
Replies: 1 · Boosts: 0 · Views: 696 · Mar ’22
What is the replacement for OSAtomicCompareAndSwap32Barrier?
I am converting the example code in Learning Core Audio by Adamson & Avila to Swift. In one of the examples, they use Apple's CARingBuffer C++ code. In trying to get that working, I get a warning:

    'OSAtomicCompareAndSwap32Barrier' is deprecated: first deprecated in macOS 10.12 - Use std::atomic_compare_exchange_strong() from <atomic> instead

I'm not familiar with C++, and I'm having trouble figuring out how to use atomic_compare_exchange_strong(). I've also had trouble figuring out what OSAtomicCompareAndSwap32Barrier is supposed to do. The only place it is called in CARingBuffer is:

    void CARingBuffer::SetTimeBounds(SampleTime startTime, SampleTime endTime)
    {
        UInt32 nextPtr = mTimeBoundsQueuePtr + 1;
        UInt32 index = nextPtr & kGeneralRingTimeBoundsQueueMask;

        mTimeBoundsQueue[index].mStartTime = startTime;
        mTimeBoundsQueue[index].mEndTime = endTime;
        mTimeBoundsQueue[index].mUpdateCounter = nextPtr;

        CAAtomicCompareAndSwap32Barrier(mTimeBoundsQueuePtr, mTimeBoundsQueuePtr + 1, (SInt32*)&mTimeBoundsQueuePtr);
    }

The call to CAAtomicCompareAndSwap32Barrier directly calls OSAtomicCompareAndSwap32Barrier. Even with the deprecation warning, the code performs as expected, but I'd like to eliminate the warning.
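Since the surrounding project is being converted to Swift anyway, one possible route is the swift-atomics package (my suggestion, not the deprecation message's, which points at C++ std::atomic). A sketch of the same compare-and-swap in Swift:

    import Atomics  // from the swift-atomics package: https://github.com/apple/swift-atomics

    let timeBoundsQueuePtr = ManagedAtomic<UInt32>(0)

    // Equivalent in spirit to OSAtomicCompareAndSwap32Barrier(old, old + 1, &ptr):
    // store old + 1 only if the value is still old, with full-barrier
    // (sequentially consistent) ordering.
    let old = timeBoundsQueuePtr.load(ordering: .sequentiallyConsistent)
    let (exchanged, _) = timeBoundsQueuePtr.compareExchange(
        expected: old,
        desired: old &+ 1,
        ordering: .sequentiallyConsistent)
    // `exchanged` is false if another thread updated the value first.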
Replies: 4 · Boosts: 0 · Views: 1k · Mar ’22
Security Warning
I'm using Xcode 13.2 to write SwiftUI apps for personal use on my M1 Mac mini with Monterey 12.2. After upgrading to 12.2, the first time I launched one of my apps, I got two pop-up screens (see attachments). I'm using a personal team account for development. Rebuilding the app seems to clear things up. Do I have to rebuild apps after every OS upgrade? Mark
Replies: 1 · Boosts: 0 · Views: 400 · Feb ’22
Location Services
With Xcode 13.2, every time I rebuild my Mac application, the first time I run it, it asks for permission to use my current location (the app does use that). With Xcode 13.1, it only asked the first time I ran the app, not after every rebuild; now it seems to treat every rebuild as a new app. I looked in Privacy settings and the app is only listed once. Is my Mac filling up with permissions to use my location for each build of my app?
Replies: 0 · Boosts: 0 · Views: 321 · Dec ’21
self is used before all properties are initialized (Swift)
    struct ModelDemo {
        let callback: () -> ()
    }

    final class ViewModelDemo {
        let modelDemo: ModelDemo

        init() {
            modelDemo = ModelDemo(callback: self.modelCallback)
        }

        private func modelCallback() {
        }
    }

The above generates the error "'self' used before all stored properties are initialized." I understand the error, but this is a common programming pattern for me in other languages. Is there a preferred way to do something like this in Swift? My current workaround is to make the callback property of ModelDemo an optional initialized to nil, then set it from the ViewModelDemo init() after it (and other ViewModel properties) have been initialized. FWIW, the intent of the code is to give the Model a way to inform the ViewModel that something in the Model has changed. Since the Model is supposed to be isolated from the View, I don't think I should use ObservableObject, as that's a SwiftUI feature.
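For what it's worth, one pattern that sidesteps the error without making the callback optional is a lazy property: a lazy initializer runs on first access, after self is fully initialized, so it is allowed to reference self. A sketch (the weak capture is there to avoid a retain cycle between the two objects):

    struct ModelDemo {
        let callback: () -> ()
    }

    final class ViewModelDemo {
        // Initialized on first access, after self is fully initialized,
        // so capturing self here is allowed.
        lazy var modelDemo: ModelDemo = ModelDemo(callback: { [weak self] in
            self?.modelCallback()
        })

        private func modelCallback() {
        }
    }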
Replies: 4 · Boosts: 0 · Views: 3.8k · Nov ’21
Swift switch statement optimization
When a Swift program executes a switch statement, does it look at each case in turn to see which one to execute? Would it make sense to put the cases most likely to be chosen closer to the top of the statement? With a switch statement on an enum with a large number of cases (over 100), would it make sense to replace the switch with a dictionary of closures?
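For concreteness, a minimal sketch of the dictionary-of-closures alternative mentioned above (the enum and handlers are hypothetical):

    enum Command: Hashable {
        case start, stop, reset
    }

    // Built once up front; dispatch is then a hash lookup on the enum case
    // rather than whatever the compiler generates for the switch.
    let handlers: [Command: () -> Void] = [
        .start: { print("starting") },
        .stop:  { print("stopping") },
        .reset: { print("resetting") },
    ]

    func handle(_ command: Command) {
        handlers[command]?()
    }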
Replies: 2 · Boosts: 0 · Views: 385 · Nov ’21
CPU Usage Mapkit
I wrote a SwiftUI app for the Mac that uses MapKit to display a map for a given address. It uses geocoding to convert a street address to latitude and longitude, and I place a pin on the map at that location. I have noticed that the pin appears to move slightly back and forth or up and down while it is displayed, even though the program isn't doing any other calculations. I have also noticed that when the map is not displayed, Activity Monitor shows the application using no CPU, but when the map is displayed, it shows around 10%. I'm not sure what that 10% means, because Activity Monitor says the User is only using 1.5 - 2%. What is the CPU doing? Is it waiting for the user to move or zoom the map? Is there a way to disable that and have just a static map display?
Replies: 0 · Boosts: 0 · Views: 441 · Nov ’21