Post | Replies | Boosts | Views | Activity

How to add an exception to Sandbox?
My macOS program is printing a message in the terminal window in Xcode:

networkd_settings_read_from_file Sandbox is preventing this process from reading networkd settings file at "/Library/Preferences/com.apple.networkd.plist", please add an exception.

How do I do this? I don't see a way to add arbitrary File Access on the Signing & Capabilities tab under the app setup. Xcode 13.3.1, macOS Monterey 12.3.1, Mac mini M1
6
2
7.2k
Apr ’22
Are there application classes where UIKit is a better choice than SwiftUI?
I am developing a Mac app that will control an external device (an amateur radio transceiver). The device is connected via Ethernet and continually sends data describing various parameters, and the display needs to update continually as those parameters change. I have partially implemented this in SwiftUI, using the Combine framework to update views.

My issue is that, as I understand SwiftUI, whenever a view is updated, SwiftUI recreates its sibling views. For example, if a stack contains 2 views, each corresponding to a different parameter, then when one of the parameters is updated both views are recreated. This isn't a problem with 2 views, but what if there are a large number of parameters (and views)? Is there a way to design the app so that only the view for an individual parameter is recreated? The app would be designed so that the size of a view never changes, so the sibling views would not be moving. Or am I misunderstanding how SwiftUI works? Would UIKit be better suited for this, or does it also need to reevaluate constraints (or something else) each time a single view updates?

Another issue I haven't been able to figure out with SwiftUI is how to have multiple windows. I'd like to have separate windows for related sets of parameters that the user can open, close, and position on the screen as desired. The best I've been able to find in SwiftUI is different windows for different "documents", where the documents are similar to each other. Thanks in advance for any advice. Mark
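For concreteness, a minimal sketch of one per-parameter approach; the Parameter type and property names here are hypothetical stand-ins, not the actual app's model:

import SwiftUI
import Combine

// Hypothetical per-parameter model: each parameter publishes its own changes,
// so only the view observing that particular object re-evaluates its body.
final class Parameter: ObservableObject, Identifiable {
    let id: String
    @Published var value: Double = 0
    init(id: String) { self.id = id }
}

// A leaf view that watches exactly one parameter.
struct ParameterView: View {
    @ObservedObject var parameter: Parameter
    var body: some View {
        Text("\(parameter.id): \(parameter.value, specifier: "%.1f")")
    }
}

// The container only lays out the leaves; its body does not observe any
// parameter values, so it is not re-evaluated when one value changes.
struct RadioPanel: View {
    let parameters: [Parameter]
    var body: some View {
        VStack(alignment: .leading) {
            ForEach(parameters) { ParameterView(parameter: $0) }
        }
    }
}

This is only a sketch under the assumption that each parameter is a reference-type ObservableObject; with that layout, changing one Parameter's value invalidates only the ParameterView observing it.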
2
0
442
Apr ’22
CoreAudio Project Posted to GitHub
I have posted a learning project on GitHub. I have gone through the book "Learning Core Audio" by Adamson & Avila and converted the examples from Objective-C to Swift 5. I hope that this helps others trying to learn CoreAudio, and if anyone sees issues in my code, please let me know. Thanks, Mark
0
0
1.1k
Apr ’22
Is there a bug in Developer Help in Xcode 13?
With a recent update to Xcode 13, something has happened to Developer Help. I am running two identical monitors on my development machine, an M1 Mac mini (Xcode 13.2, macOS 12.3.1).

On most apps, if you hold the mouse button down on the green dot button at the top left of the window, there is a dropdown that gives these choices:

Enter Full Screen
Tile Window to Left of Screen
Tile Window to Right of Screen
Move to CB282K(1)

CB282K(1) refers to one of my monitors that is set up as an extended display. CB282K(2) is the other one, set up as the main display, the one currently displaying the window. Usually, selecting the Move to CB282K(1) option moves the window to the other monitor. This works for most apps, including Xcode.

But if I bring up Developer Help, selecting this option does not move the Developer Help window. If I manually drag the Developer Help window to the other monitor and select the Move to CB282K(2) option, it moves back to the main monitor. If I then select Move to CB282K(1), the Developer Help window moves, and it continues to move between monitors as needed. After closing the Developer Help window, if I open it again, it again refuses to move to the second monitor until being dragged manually.

As I said, I think this used to work. Has anyone else seen this behavior, and is there some setting I can change to get it working again?

Also, while not relevant to the question, is there a way to change the name of the monitor that shows up in the list? Is there a way I can change the menu to have the options "Move to Extended Display" and "Move to Main Display"? Thanks, Mark
0
0
282
Apr ’22
How to set a breakpoint in library code?
When running my code, I got the following messages in the console:

malloc: *** error for object 0x16fdfee70: pointer being freed was not allocated
malloc: *** set a breakpoint in malloc_error_break to debug

I understand the first message is coming from a bug in my code. The second is trying to help me figure out the issue. The question is: how do I set this helpful breakpoint? FWIW, I've seen similar "set a breakpoint in ..." messages before in other cases.
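For reference, the usual way to act on that hint (a sketch, not specific to this particular crash): set a symbolic breakpoint on the symbol named in the message, either in Xcode via Debug > Breakpoints > Create Symbolic Breakpoint… with malloc_error_break as the symbol, or from the lldb console:

(lldb) breakpoint set --name malloc_error_break

On the next run the debugger should stop inside malloc at the point where that message would be printed, with the offending free call visible in the backtrace.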
1
0
2.2k
Apr ’22
How does Swift's #if directive work?
I've written some code that can be compiled differently depending on #if *** ... #else ... #endif. I then set up two different project targets. In one target, under Swift Compiler - Custom Flags / Active Compilation Conditions, I define ***. In the other, I don't. Using the two project targets, I can compile the program two different ways. However, if I introduce an error into a section of code that is going to be ignored, Xcode reports an error and won't compile. Does the compiler truly ignore code in an inactive #if block, or does the code end up in the compiled program with a runtime check to skip it?
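For concreteness, a minimal sketch of the setup described above, using a hypothetical condition name EXTENDED_UI in place of the elided flag:

// EXTENDED_UI is defined under Build Settings > Swift Compiler - Custom Flags >
// Active Compilation Conditions in one target only (the name is a stand-in).
#if EXTENDED_UI
let panelCount = 8   // compiled only when EXTENDED_UI is defined
#else
let panelCount = 2   // compiled in the other target
#endif

As far as I understand it, #if is resolved entirely at compile time, so nothing from the inactive branch ends up in the binary and there is no runtime check; the inactive branch does still have to be syntactically valid Swift, which is one reason an error there can stop the build.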
1
0
1.8k
Apr ’22
How do you free an AudioBufferList?
In the AudioBufferList extension, there is a comment above the allocate function:

    /// The memory should be freed with `free()`.
    public static func allocate(maximumBuffers: Int) -> UnsafeMutableAudioBufferListPointer

But when I try to call free on the returned pointer,

    free(buffer)

Xcode complains: Cannot convert value of type 'UnsafeMutableAudioBufferListPointer' to expected argument type 'UnsafeMutableRawPointer?'. How should the pointer be freed? I tried

    free(&buffer)

Xcode didn't complain, but when I ran the code, I got an error in the console:

malloc: *** error for object 0x16fdfee70: pointer being freed was not allocated

I know the call to allocate was successful. Thanks, Mark
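For what it's worth, a minimal sketch of a pattern that avoids the type mismatch, based on the assumption that the wrapper's unsafeMutablePointer property is the allocation the comment wants handed back to free():

import AudioToolbox

let abl = AudioBufferList.allocate(maximumBuffers: 2)
// ... fill in and use the buffers ...
// free() expects the underlying AudioBufferList allocation, not the Swift
// wrapper struct, so pass the wrapper's unsafeMutablePointer rather than
// the wrapper itself or its address.
free(abl.unsafeMutablePointer)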
1
0
1.4k
Apr ’22
How to handle different sample rates in AVAudioEngine?
This is on a Mac mini M1 with macOS Monterey. I am trying to write an audio network using AVAudioEngine as opposed to AUGraph (which I understand is deprecated in favor of AVAudioEngine). My code works properly with AUGraph.

The input is a microphone which has a sample rate of 8 kHz. In the render proc, the data is written to a ring buffer. Debugging shows that the render proc is called every 0.064 seconds and writes 512 samples (8000 * 0.064 = 512).

The program creates an AVAudioSourceNode. The render block for that node pulls data from the above ring buffer. But debugging shows that it is trying to take 512 samples about every 0.0107 seconds. That works out to 48000 samples per second, which is the output device sample rate. Obviously the ring buffer can't keep up. In the statement connecting the above source node to the AVAudioEngine's mixer node, I specify (at least I think I do) a sample rate of 8000, but it still seems to be running at 48000.

let inputFormat = AVAudioFormat(
    commonFormat: outputFormat.commonFormat,
    sampleRate: 8000,
    channels: 1,
    interleaved: outputFormat.isInterleaved)

engine.connect(srcNode, to: mixerNode, fromBus: 0, toBus: 0, format: inputFormat)

Also, looking at the microphone input using Audio MIDI Setup shows that the microphone format is 8000 Hz, 1 channel, 16-bit integer, but when I examine the input format of the audio node it is reported as 8000 Hz, 1 channel, 32-bit float. The input node is using HAL. Obviously, somewhere in the internals of the node the samples are being converted from 16-bit ints to 32-bit floats. Is there a way to also have the sample rate changed?

Am I doing this wrong? The HAL node was used with AUGraph. Is there a different node that should be used with AVAudioEngine? I see that AVAudioEngine has an input node, but it seems that if I connect it to the microphone, the input goes straight to the hardware output without going through the mixer node (where I want to mix in other audio sources).

The original AUGraph code was modeled after the code in "Learning Core Audio" by Adamson & Avila, which, although it is old (pre-dating Swift and AVAudioEngine), is the only detailed reference on CoreAudio that I have been able to find. Is there a newer reference? Thanks, Mark
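For reference, a sketch of one thing to try, under the assumption that an AVAudioSourceNode's render block runs at the format the node is created with (rather than the connection format), so the 8 kHz to 48 kHz conversion happens downstream of the node; names like ringFormat are placeholders:

import AVFoundation

let engine = AVAudioEngine()

// Hypothetical 8 kHz mono float format matching what the ring buffer holds.
let ringFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                               sampleRate: 8000,
                               channels: 1,
                               interleaved: false)!

// Passing the format at init means the render block is asked for 8 kHz data;
// the engine converts the rate on the way to the mixer.
let srcNode = AVAudioSourceNode(format: ringFormat) { _, _, frameCount, audioBufferList in
    // pull `frameCount` frames from the ring buffer into audioBufferList here
    return noErr
}

engine.attach(srcNode)
engine.connect(srcNode, to: engine.mainMixerNode, format: ringFormat)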
1
0
1.4k
Apr ’22
Why is Apple demo code throwing -10878 (invalid parameter)?
I have downloaded the WWDC signal generator example code for 2019 session 510, "What's New in AVAudioEngine", at link. When I run it in Xcode 13.2 on macOS 12.3 on an M1 Mac mini, on line 99,

let mainMixer = engine.mainMixerNode

I get 9 lines in the console output:

2022-03-30 21:09:19.288011-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288351-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288385-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288415-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288440-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288467-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288491-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288534-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288598-0400 SignalGenerator[52247:995478] throwing -10878

-10878 is invalid parameter. But the program seems to run as expected. Can this just be ignored, or does it indicate improper setup?
0
0
651
Mar ’22
Are the SDK .h files available in Xcode?
I'm teaching myself CoreAudio programming on the Mac (I have a need). Sometimes the documentation refers to a .h file. Can those files be viewed in Xcode? If so, how do I find them? For example the help page for AudioUnitParameter says this in the Overview: This data structure is used by functions declared in the AudioToolbox/AudioUnitUtilities.h header file in macOS.
1
0
377
Mar ’22
Determine Core Audio Parameter Type
I'm trying to figure out how to set the volume of a CoreAudio AudioUnit. I found the parameter kHALOutputParam_Volume, but I can't find anything about it. I called AudioUnitGetPropertyInfo, and that told me that the parameter is 4 bytes long and writeable. How can I find out whether it is an Int32, UInt32, Float32, or some other type, and what the acceptable values are and what they mean? I used AudioUnitGetProperty and read it as either an Int32 (512) or a Float32 (7.17e-43). Is there any documentation on this and other parameters?
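For comparison, a sketch of parameter-style access, on the assumption that kHALOutputParam_Volume is an audio unit parameter rather than a property; parameters are read and written as AudioUnitParameterValue (a Float32), and the audio unit here is passed in by the caller:

import AudioToolbox

// Read the current volume parameter; returns nil if the call fails.
func outputVolume(of outputUnit: AudioUnit) -> AudioUnitParameterValue? {
    var value: AudioUnitParameterValue = 0
    let err = AudioUnitGetParameter(outputUnit, kHALOutputParam_Volume,
                                    kAudioUnitScope_Global, 0, &value)
    return err == noErr ? value : nil
}

// Write the volume parameter; typical volume parameters take 0.0 ... 1.0,
// but check the returned OSStatus rather than assuming the range.
func setOutputVolume(of outputUnit: AudioUnit,
                     to value: AudioUnitParameterValue) -> OSStatus {
    return AudioUnitSetParameter(outputUnit, kHALOutputParam_Volume,
                                 kAudioUnitScope_Global, 0, value, 0)
}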
1
0
612
Mar ’22
What is the replacement for OSAtomicCompareAndSwap32Barrier?
I am converting the example code in Learning Core Audio by Adamson & Avila to Swift. In one of the examples, they use Apple's CARingBuffer C++ code. In trying to get that working, I get a warning:

'OSAtomicCompareAndSwap32Barrier' is deprecated: first deprecated in macOS 10.12 - Use std::atomic_compare_exchange_strong() from <atomic> instead

I'm not familiar with C++, and I'm having trouble figuring out how to use atomic_compare_exchange_strong(). I've also had trouble figuring out what OSAtomicCompareAndSwap32Barrier is supposed to do. The only place it is called in CARingBuffer is:

void CARingBuffer::SetTimeBounds(SampleTime startTime, SampleTime endTime)
{
    UInt32 nextPtr = mTimeBoundsQueuePtr + 1;
    UInt32 index = nextPtr & kGeneralRingTimeBoundsQueueMask;

    mTimeBoundsQueue[index].mStartTime = startTime;
    mTimeBoundsQueue[index].mEndTime = endTime;
    mTimeBoundsQueue[index].mUpdateCounter = nextPtr;

    // Atomically advance mTimeBoundsQueuePtr from its current value to
    // current + 1, with a full memory barrier.
    CAAtomicCompareAndSwap32Barrier(mTimeBoundsQueuePtr, mTimeBoundsQueuePtr + 1, (SInt32*)&mTimeBoundsQueuePtr);
}

The call to CAAtomicCompareAndSwap32Barrier directly calls OSAtomicCompareAndSwap32Barrier. Even with the deprecation warning, the code performs as expected, but I'd like to eliminate the warning.
4
0
894
Mar ’22
Security Warning
I'm using Xcode 13.2 to write SwiftUI apps for personal use on my M1 Mac mini with Monterey 12.2. After upgrading to 12.2, the first time I launched one of my apps, I got two pop-up screens (see attachments). I am using a personal team account for development. Rebuilding the app seems to clear things up. Do I have to rebuild apps after every OS upgrade? Mark
1
0
353
Feb ’22