Post not yet marked as solved
Sometimes when I call AudioWorkIntervalCreate, the call hangs with the following stack trace. The call is made on the main thread.
mach_msg2_trap 0x00007ff801f0b3ce
mach_msg2_internal 0x00007ff801f19d80
mach_msg_overwrite 0x00007ff801f12510
mach_msg 0x00007ff801f0b6bd
HALC_Object_AddPropertyListener 0x00007ff8049ea43e
HALC_ProxyObject::HALC_ProxyObject(unsigned int, unsigned int, unsigned int, unsigned int) 0x00007ff8047f97f2
HALC_ProxyObjectMap::_CreateObject(unsigned int, unsigned int, unsigned int, unsigned int) 0x00007ff80490f69c
HALC_ProxyObjectMap::CopyObjectByObjectID(unsigned int) 0x00007ff80490ecd6
HALC_ShellPlugIn::_ReconcileDeviceList(bool, bool, std::__1::vector<unsigned int, std::__1::allocator<unsigned int>>&, std::__1::vector<unsigned int, std::__1::allocator<unsigned int>>&) 0x00007ff8045d68cf
HALB_CommandGate::ExecuteCommand(void () block_pointer) const 0x00007ff80492ed14
HALC_ShellObject::ExecuteCommand(void () block_pointer) const 0x00007ff80470f554
HALC_ShellPlugIn::ReconcileDeviceList(bool, bool) 0x00007ff8045d6414
HALC_ShellPlugIn::ConnectToServer() 0x00007ff8045d74a4
HAL_HardwarePlugIn_InitializeWithObjectID(AudioHardwarePlugInInterface**, unsigned int) 0x00007ff8045da256
HALPlugInManagement::CreateHALPlugIn(HALCFPlugIn const*) 0x00007ff80442f828
HALSystem::InitializeDevices() 0x00007ff80442ebc3
HALSystem::CheckOutInstance() 0x00007ff80442b696
AudioObjectAddPropertyListener_mac_imp 0x00007ff80469b431
auoop::WorkgroupManager_macOS::WorkgroupManager_macOS() 0x00007ff8040fc3d5
auoop::gWorkgroupManager() 0x00007ff8040fc245
AudioWorkIntervalCreate 0x00007ff804034a33
Post not yet marked as solved
Hey all!
I'm building a camera app using AVFoundation, and I am using the AVCaptureVideoDataOutput and AVCaptureAudioDataOutput delegates. (I cannot use AVCaptureMovieFileOutput because I do some processing in between.)
When recording the audio CMSampleBuffers to the AVAssetWriter, I noticed that, compared to the stock iOS camera app, they are mono, not stereo.
I wonder how recording in stereo audio works; are there any guides or documentation available for that?
Is a stereo audio frame still one CMSampleBuffer, or will it be multiple CMSampleBuffers? Do I need to synchronize them? Do I need to set up the AVAssetWriter/AVAssetWriterInput differently?
This is my Audio Session code:
func configureAudioSession(configuration: CameraConfiguration) throws {
  ReactLogger.log(level: .info, message: "Configuring Audio Session...")
  // Prevent iOS from automatically configuring the Audio Session for us
  audioCaptureSession.automaticallyConfiguresApplicationAudioSession = false
  let enableAudio = configuration.audio != .disabled

  // Check microphone permission
  if enableAudio {
    let audioPermissionStatus = AVCaptureDevice.authorizationStatus(for: .audio)
    if audioPermissionStatus != .authorized {
      throw CameraError.permission(.microphone)
    }
  }

  // Remove all current inputs
  for input in audioCaptureSession.inputs {
    audioCaptureSession.removeInput(input)
  }
  audioDeviceInput = nil

  // Audio Input (Microphone)
  if enableAudio {
    ReactLogger.log(level: .info, message: "Adding Audio input...")
    guard let microphone = AVCaptureDevice.default(for: .audio) else {
      throw CameraError.device(.microphoneUnavailable)
    }
    let input = try AVCaptureDeviceInput(device: microphone)
    guard audioCaptureSession.canAddInput(input) else {
      throw CameraError.parameter(.unsupportedInput(inputDescriptor: "audio-input"))
    }
    audioCaptureSession.addInput(input)
    audioDeviceInput = input
  }

  // Remove all current outputs
  for output in audioCaptureSession.outputs {
    audioCaptureSession.removeOutput(output)
  }
  audioOutput = nil

  // Audio Output
  if enableAudio {
    ReactLogger.log(level: .info, message: "Adding Audio Data output...")
    let output = AVCaptureAudioDataOutput()
    guard audioCaptureSession.canAddOutput(output) else {
      throw CameraError.parameter(.unsupportedOutput(outputDescriptor: "audio-output"))
    }
    output.setSampleBufferDelegate(self, queue: CameraQueues.audioQueue)
    audioCaptureSession.addOutput(output)
    audioOutput = output
  }
}
This is how I activate the audio session just before I start recording:
let audioSession = AVAudioSession.sharedInstance()
try audioSession.updateCategory(AVAudioSession.Category.playAndRecord,
                                mode: .videoRecording,
                                options: [.mixWithOthers,
                                          .allowBluetoothA2DP,
                                          .defaultToSpeaker,
                                          .allowAirPlay])
if #available(iOS 14.5, *) {
  // prevents the audio session from being interrupted by a phone call
  try audioSession.setPrefersNoInterruptionsFromSystemAlerts(true)
}
if #available(iOS 13.0, *) {
  // allow system sounds (notifications, calls, music) to play while recording
  try audioSession.setAllowHapticsAndSystemSoundsDuringRecording(true)
}
audioCaptureSession.startRunning()
And this is how I set up the AVAssetWriter:
let audioSettings = audioOutput.recommendedAudioSettingsForAssetWriter(writingTo: options.fileType)
let format = audioInput.device.activeFormat.formatDescription
audioWriter = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings, sourceFormatHint: format)
audioWriter!.expectsMediaDataInRealTime = true
assetWriter.add(audioWriter!)
ReactLogger.log(level: .info, message: "Initialized Audio AssetWriter.")
The rest is trivial - I receive CMSampleBuffers of the audio in my delegate's callback, write them to the audioWriter, and it ends up in the .mov file - but it is not stereo, it's mono.
Is there anything I'm missing here?
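One thing I have been wondering about (a minimal sketch of the idea, not something I have confirmed): whether I need to ask the shared audio session for more input channels before starting the capture session. Stereo availability is hardware-dependent, so the channel count may well stay at 1:
import AVFoundation

// Sketch: request a 2-channel input from the shared audio session before
// calling audioCaptureSession.startRunning(). The hardware may still be mono-only.
let session = AVAudioSession.sharedInstance()
if session.maximumInputNumberOfChannels >= 2 {
    try session.setPreferredInputNumberOfChannels(2)
}
print("Input channels: \(session.inputNumberOfChannels)")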
Post not yet marked as solved
I'm attempting to record from a device's microphone (under iOS) using AVAudioRecorder. The examples are all quite simple, and I'm following the same method. But I'm getting error messages on attempts to record, and the resulting M4A file (after several seconds of recording) is only 552 bytes long and won't load. Here's the recorder usage:
func startRecording()
{
    let settings = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 22050,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]
    do
    {
        recorder = try AVAudioRecorder(url: tempFileURL(), settings: settings)
        recorder?.delegate = self
        recorder!.record()
        recording = true
    }
    catch
    {
        recording = false
        recordingFinished(success: false)
    }
}
The immediate sign of trouble appears to be the following, in the console. Note the 0 bits per channel and irrelevant 8K sample rate:
AudioQueueObject.cpp:1580 BuildConverter: AudioConverterNew returned -50 from: 0 ch, 8000 Hz, .... (0x00000000) 0 bits/channel, 0 bytes/packet, 0 frames/packet, 0 bytes/frame to: 1 ch, 8000 Hz, Int16
A subsequent attempt to load the file into AVAudioPlayer results in:
MP4_BoxParser.cpp:1089 DataSource read failed
MP4AudioFile.cpp:4365 MP4Parser_PacketProvider->GetASBD() failed
AudioFileObject.cpp:105 OpenFromDataSource failed
AudioFileObject.cpp:80 Open failed
But that's not surprising given that it's only 500+ bytes and we had the earlier error. Anybody have an idea here? Every example on the Web shows essentially this exact method.
I've also tried constructing the recorder with
let audioFormat = AVAudioFormat.init(standardFormatWithSampleRate: 44100, channels: 1)
if audioFormat == nil
{
    print("Audio format failed.")
}
else
{
    do
    {
        recorder = try AVAudioRecorder(url: tempFileURL(), format: audioFormat!)
        ...
with mostly the same result. In that case the instantiation error message was the following, which at least mentions the requested sample rate:
AudioQueueObject.cpp:1580 BuildConverter: AudioConverterNew returned -50 from: 0 ch, 44100 Hz, .... (0x00000000) 0 bits/channel, 0 bytes/packet, 0 frames/packet, 0 bytes/frame to: 1 ch, 44100 Hz, Int32
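One precondition I should mention (an assumption on my part, not a confirmed cause): AVAudioRecorder records through the shared audio session, and a session that was never set to a record-capable category is a commonly cited source of -50 errors and near-empty files. A minimal sketch of activating the session before record():
import AVFoundation

// Sketch: activate a record-capable audio session before creating the recorder.
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord, mode: .default)
try session.setActive(true)
// ...then construct AVAudioRecorder and call record() as in startRecording() above.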
Post not yet marked as solved
How do I add AudioToolbox in Xcode 15.2?
Post not yet marked as solved
I'm developing an iOS application that uses Core Audio. When I run the app on an Apple Silicon MacBook, the first time I call AudioUnitSetProperty the following error is logged:
CARP violation: using HAL semantics (AUIOImpl_Base)
Are others getting this, and is it part of the normal process?
I'm also getting AQMEIO_HAL.cpp:862 kAudioDevicePropertyMute returned err 2003332927 when I set kAudioOutputUnitProperty_EnableIO for input.
Post not yet marked as solved
I'm trying to add a USB mic to my Mac mini running the latest Sonoma release, but the audio is full of crackles. Why isn't it clean?
Post not yet marked as solved
There is a CustomPlayer class that uses MTAudioProcessingTap to modify the audio buffer.
Say there are two instances, A and B, of the CustomPlayer class.
While both A and B are running, when A finishes its operation and its instance is terminated, B's MTAudioProcessingTap process callback stops and B's finalize callback fires, even though B still has work left to do.
The same code in the same project does not behave this way on iOS 17.0 or lower: there, when A is terminated, B completes its task without any impact.
What changed in iOS 17.1 to produce these results? I'd appreciate an answer on how to avoid these issues.
let audioMix = AVMutableAudioMix()
var audioMixParameters: [AVMutableAudioMixInputParameters] = []
try composition.tracks(withMediaType: .audio).forEach { track in
    let inputParameter = AVMutableAudioMixInputParameters(track: track)
    inputParameter.trackID = track.trackID
    var callbacks = MTAudioProcessingTapCallbacks(
        version: kMTAudioProcessingTapCallbacksVersion_0,
        clientInfo: UnsafeMutableRawPointer(
            Unmanaged.passRetained(clientInfo).toOpaque()
        ),
        init: { tap, clientInfo, tapStorageOut in
            tapStorageOut.pointee = clientInfo
        },
        finalize: { tap in
            Unmanaged<ClientInfo>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).release()
        },
        prepare: nil,
        unprepare: nil,
        process: { tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut in
            var timeRange = CMTimeRange.zero
            let status = MTAudioProcessingTapGetSourceAudio(tap,
                                                            numberFrames,
                                                            bufferListInOut,
                                                            flagsOut,
                                                            &timeRange,
                                                            numberFramesOut)
            if noErr == status {
                ....
            }
        })
    var tap: Unmanaged<MTAudioProcessingTap>?
    let status = MTAudioProcessingTapCreate(kCFAllocatorDefault,
                                            &callbacks,
                                            kMTAudioProcessingTapCreationFlag_PostEffects,
                                            &tap)
    guard noErr == status else {
        return
    }
    inputParameter.audioTapProcessor = tap?.takeUnretainedValue()
    audioMixParameters.append(inputParameter)
    tap?.release()
}
audioMix.inputParameters = audioMixParameters
return audioMix
Post not yet marked as solved
Is there a way to generate tones, etc. in code?
Regards, Patrick
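P.S. To make the question concrete, here is a minimal sketch of the kind of thing I mean, using AVAudioSourceNode to render a sine tone (the 440 Hz frequency and 0.25 amplitude are arbitrary choices):
import AVFoundation

// Sketch: render a continuous 440 Hz sine tone with AVAudioSourceNode.
let engine = AVAudioEngine()
let sampleRate = engine.outputNode.outputFormat(forBus: 0).sampleRate
var phase = 0.0
let phaseIncrement = 2.0 * Double.pi * 440.0 / sampleRate

let sourceNode = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
    let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
    for frame in 0..<Int(frameCount) {
        let sample = Float(sin(phase)) * 0.25 // keep the amplitude modest
        phase += phaseIncrement
        if phase >= 2.0 * Double.pi { phase -= 2.0 * Double.pi }
        for buffer in buffers {
            let typedBuffer = UnsafeMutableBufferPointer<Float>(buffer)
            typedBuffer[frame] = sample
        }
    }
    return noErr
}

engine.attach(sourceNode)
engine.connect(sourceNode, to: engine.mainMixerNode, format: nil)
try engine.start()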
Post not yet marked as solved
Hello,
I've encountered a recurring issue while trying to play back live streams using AVPlayer in an iOS app. The video stream is being delivered via Amazon Kinesis Video Streams (KVS) using HLS.
The specific issue is that audio frequently gets interrupted during playback. The video continues to play just fine, but the audio stops. This issue seems to occur only on iOS devices and not on other platforms or players.
When I check the console logs, I see a number of error messages that may be related to the issue:
2023-05-11 20:57:27.494719+0200 Development[53868:24121620] HALPlugIn::DeviceGetCurrentTime: got an error from the plug-in routine, Error: 1937010544 (stop)
2023-05-11 20:57:27.534340+0200 Development[53868:24121620] [aqme] AQMEIO.cpp:199 timed out after 0.011s (6269 6269); suspension count=0 (IOSuspensions: )
2023-05-11 20:57:30.592067+0200 Development[53868:24122309] HALPlugIn::DeviceGetCurrentTime: got an error from the plug-in routine, Error: 1937010544 (stop)
2023-05-11 20:57:30.592400+0200 Development[53868:24122309] HALPlugIn::DeviceGetCurrentTime: got an error from the plug-in routine, Error: 1937010544 (stop)
I've attempted to troubleshoot this issue in various ways, including trying different iOS devices and networks. I've also attempted to use VLC's player on iOS, which doesn't have the audio interruption issue, but it does encounter other problems.
I believe there might be some compatibility issue between AVPlayer and KVS. I've posted a similar issue on the Amazon KVS GitHub repo but I am reaching out here to see if anyone has faced a similar issue with AVPlayer and has found a solution or can provide some guidance.
Has anyone encountered this issue before, or does anyone have suggestions on how to address it? Any help would be greatly appreciated!
Post not yet marked as solved
We've been doing the following in our app for years without issues:
[[NSSound soundNamed:@"Basso"] play]
Suddenly we're seeing hundreds of crashes from macOS 14.0 users and we're not sure what's causing this. There are no memory leaks within the app and all the stack traces are around NSSound:
0 AudioToolbox 0x1f558 MEDeviceStreamClient::RemoveRunningClient(AQIONodeClient&, bool, bool) + 3096
1 AudioToolbox 0x1e8fc AQMEDevice::RemoveRunningClient(AQIONodeClient&, bool) + 108
2 AudioToolbox 0x1e854 AQMixEngine_Base::RemoveRunningClient(AQIONodeClient&, bool) + 76
3 AudioToolbox 0xcdd78 AudioQueueObject::StopRunning(AQIONode*, bool) + 244
4 AudioToolbox 0xcbdd0 AudioQueueObject::Stop(bool, bool, int*) + 736
5 AudioToolbox 0xf1840 AudioQueueXPC_Server::Stop(unsigned int, bool) + 172
6 AudioToolbox 0x1418b4 ___ZN20AudioQueueXPC_Bridge4StopEjb_block_invoke + 72
7 libdispatch.dylib 0x3910 _dispatch_client_callout + 20
8 libdispatch.dylib 0x130f8 _dispatch_sync_invoke_and_complete_recurse + 64
9 AudioToolbox 0x141844 AudioQueueXPC_Bridge::Stop(unsigned int, bool) + 184
10 AudioToolbox 0xa09b0 AQ::API::V2Impl::AudioQueueStop(OpaqueAudioQueue*, unsigned char) + 492
11 AVFAudio 0xbe12c AVAudioPlayerCpp::disposeQueue(bool) + 188
12 AVFAudio 0x341dc -[AudioPlayerImpl dealloc] + 72
13 AVFAudio 0x358a0 -[AVAudioPlayer dealloc] + 36
14 AppKit 0x1b13b4 -[NSAVAudioPlayerSoundEngine dealloc] + 44
15 AppKit 0x1b132c -[NSSound dealloc] + 164
16 libobjc.A.dylib 0xf418 AutoreleasePoolPage::releaseUntil(objc_object**) + 196
17 libobjc.A.dylib 0xbaf0 objc_autoreleasePoolPop + 260
18 CoreFoundation 0x3c57c _CFAutoreleasePoolPop + 32
19 Foundation 0x30e88 -[NSAutoreleasePool drain] + 140
20 Foundation 0x31f94 _NSAppleEventManagerGenericHandler + 92
21 AE 0xbd8c _AppleEventsCheckInAppWithBlock + 13808
22 AE 0xb6b4 _AppleEventsCheckInAppWithBlock + 12056
23 AE 0x4cc4 aeProcessAppleEvent + 488
24 HIToolbox 0x402d4 AEProcessAppleEvent + 68
25 AppKit 0x3a29c _DPSNextEvent + 1440
26 AppKit 0x80db94 -[NSApplication(NSEventRouting) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 716
27 AppKit 0x2d43c -[NSApplication run] + 476
28 AppKit 0x4708 NSApplicationMain + 880
29 ??? 0x180739058 (Missing)
Post not yet marked as solved
This crash occurs only on iOS 17.0.
Crash Stack:
#136 Thread
SIGSEGV
SEGV_ACCERR
0 AudioToolbox AQ::API::V2Impl::AllocateBuffer(OpaqueAudioQueue*, void*, unsigned int, AudioStreamPacketDescription*, unsigned int, AudioQueueBuffer**) + 772
1 AudioToolbox AQ::API::V2Impl::AllocateBuffer(OpaqueAudioQueue*, void*, unsigned int, AudioStreamPacketDescription*, unsigned int, AudioQueueBuffer**) + 584
Related code:
- (instancetype)init {
    if (self = [super init]) {
        // settings
        int Channels = 2;
        int bytesPerFrame = 2 * Channels;
        AudioStreamBasicDescription streamDesc;
        streamDesc.mSampleRate = 44100;
        streamDesc.mFormatID = kAudioFormatLinearPCM;
        streamDesc.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
        streamDesc.mBytesPerPacket = bytesPerFrame;
        streamDesc.mFramesPerPacket = 1;
        streamDesc.mBytesPerFrame = bytesPerFrame;
        streamDesc.mChannelsPerFrame = Channels;
        streamDesc.mBitsPerChannel = 16;
        streamDesc.mReserved = 0;
        // queue
        AudioQueueNewOutput(&streamDesc, AEAudioQueueOutputCallback, (__bridge void * _Nullable)(self), nil, nil, 0, &_playQueue);
        AudioQueueSetParameter(_playQueue, kAudioQueueParam_Volume, 1.0);
        // buffers
        for (int i = 0; i < QUEUE_BUFFER_SIZE; i++) {
            AudioQueueAllocateBuffer(_playQueue, MIN_SIZE_PER_FRAME, _bufferList + i);
        }
    }
    return self;
}
Post not yet marked as solved
Hi!
I am working on an audio application on iOS. This is how I retrieve the workgroup from the remote I/O audio unit (ioUnit). The unit is initialized and working fine (meaning that it is regularly called by the system).
os_workgroup_t os_workgroup{nullptr};
uint32_t os_workgroup_index_size = sizeof(os_workgroup); // the in/out size must be initialized before the call
if (status = AudioUnitGetProperty(ioUnit, kAudioOutputUnitProperty_OSWorkgroup, kAudioUnitScope_Global, 0,
                                  &os_workgroup, &os_workgroup_index_size);
    status != noErr)
{
    throw runtime_error("AudioUnitGetProperty kAudioOutputUnitProperty_OSWorkgroup - Failed with OSStatus: " +
                        to_string(status));
}
However, the resulting os_workgroup value is 0x40, which does not seem correct. No wonder I cannot join any other realtime threads to the workgroup. The returned status, however, is a solid 0.
Can anyone help?
Post not yet marked as solved
Hello developers,
we have an issue with opening an Apple MPEG-4 audio file that apparently has a correct header but no actual audio data. The file is 594 bytes, completely freezes the app's main thread, and never returns from any of these calls:
NSURL *fileURL = [NSURL fileURLWithPath:filePath];
NSError *error;
AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:&error]; // freezes (call stack below)
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:fileURL error:&error]; // freezes
AudioFileID audioFileID;
OSStatus result = AudioFileOpenURL((__bridge CFURLRef)fileURL, kAudioFileReadPermission, 0, &audioFileID); // freezes
Putting the debugger in pause reveals where it is stuck:
#0 0x00007ff81b7683f9 in MP4BoxParser_Track::GetSampleTableBox() ()
#1 0x00007ff81b76785a in MP4BoxParser_Track::GetInfoFromTrackSubBoxes() ()
#2 0x00007ff81b93fde5 in MP4AudioFile::UseAudioTrack(void*, unsigned int, unsigned int) ()
#3 0x00007ff81b93ab2c in MP4AudioFile::OpenFromDataSource() ()
#4 0x00007ff81b72ee85 in AudioFileObject::Open(__CFURL const*, signed char, int) ()
#5 0x00007ff81b72ed9d in AudioFileObject::DoOpen(__CFURL const*, signed char, int) ()
#6 0x00007ff81b72e1f0 in AudioFileOpenURL ()
#7 0x00007ffa382e8183 in -[AVAudioPlayer initWithContentsOfURL:fileTypeHint:error:] ()
With each of the three calls the call stack is slightly different, but all of them end up stuck forever in MP4BoxParser_Track::GetSampleTableBox().
I'm attaching the offending audio file to the post (just rename it back to .m4a):
Audio_21072023_10462282.crash
How can we avoid this and verify that an audio file is openable and playable? Previously, we checked whether a file we believed to be audio contained any data; if so, we created an AVAudioPlayer with it and checked that it returned no errors and that the duration was greater than 0. This bug breaks that fundamental logic, so for now we added a hotfix hack that checks whether the data is at least 600 bytes long. How do we solve this correctly, given that none of the methods above return an error but instead all hang?
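The containment we are considering in the meantime (a sketch of the idea, in Swift; the 2-second limit is arbitrary, and a stuck worker thread is merely abandoned, not cancelled):
import Foundation
import AudioToolbox

// Sketch: probe the file off the main thread with a timeout so a parser
// hang cannot freeze the UI.
func fileSeemsPlayable(_ fileURL: URL) -> Bool {
    let done = DispatchSemaphore(value: 0)
    var opened = false
    DispatchQueue.global(qos: .utility).async {
        var fileID: AudioFileID?
        if AudioFileOpenURL(fileURL as CFURL, .readPermission, 0, &fileID) == noErr,
           let fileID = fileID {
            AudioFileClose(fileID)
            opened = true
        }
        done.signal()
    }
    // Give up after 2 seconds and treat the file as unplayable.
    return done.wait(timeout: .now() + 2) == .success && opened
}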
Post not yet marked as solved
I keep getting this error and I've been stuck for days; I cannot figure it out. Any advice?
Asset validation failed: Missing Info.plist value. A value for the Info.plist key 'CFBundleIconName' is missing in the bundle. Apps built with the iOS 11 or later SDK must supply app icons in an asset catalog and must also provide a value for this Info.plist key.
Post not yet marked as solved
I am trying to setup a kAudioUnitSubType_VoiceProcessingIO audio unit for a VoIP macOS app. I tried kAudioUnitSubType_HALOutput first, but it was not suitable due to echo and noise, which makes sense.
kAudioUnitSubType_VoiceProcessingIO seemed promising. However, when using it with a Bluetooth headset for output and the built-in mic for input, the volume gets really low compared to other device setups, such as internal input and output together, or Bluetooth headset input and output together. The alternative setups seem to work fine, but we can't ask users to avoid specific setups. This makes kAudioUnitSubType_VoiceProcessingIO unusable for a macOS app.
Is kAudioUnitSubType_VoiceProcessingIO production ready for current macOS apps? Is there any way to avoid the volume issues?
Adding manual gain is not a workaround because voice becomes really distorted.
Thanks.
Post not yet marked as solved
Hi, I'm working hard in Logic Pro, and this is the fourth time the application has crashed. This is the report I receive. What can I do to fix it?
Thank you in advance
Translated Report (Full Report Below)
Process: Logic Pro X [1433]
Path: /Applications/Logic Pro X.app/Contents/MacOS/Logic Pro X
Identifier: com.apple.logic10
Version: 10.7.7 (5762)
Build Info: MALogic-5762000000000000~2 (1A85)
App Item ID: 634148309
App External ID: 854029738
Code Type: X86-64 (Native)
Parent Process: launchd [1]
User ID: 501
Date/Time: 2023-07-01 09:16:42.7422 +0200
OS Version: macOS 13.3.1 (22E261)
Report Version: 12
Bridge OS Version: 7.4 (20P4252)
Anonymous UUID: F5E0021C-707D-3E26-12BC-6E1D779A746A
Time Awake Since Boot: 2700 seconds
System Integrity Protection: enabled
Crashed Thread: 0 Dispatch queue: com.apple.main-thread
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x0000000000000010
Exception Codes: 0x0000000000000001, 0x0000000000000010
Termination Reason: Namespace SIGNAL, Code 11 Segmentation fault: 11
Terminating Process: exc handler [1433]
VM Region Info: 0x10 is not in any region. Bytes before following region: 140737486778352
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
UNUSED SPACE AT START
--->
shared memory 7fffffe7f000-7fffffe80000 [ 4K] r-x/r-x SM=SHM
Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 Logic Pro X 0x108fe6972 0x108a75000 + 5708146
1 Logic Pro X 0x108def2d3 0x108a75000 + 3646163
2 Foundation 0x7ff80e4b3f35 __NSFireDelayedPerform + 440
3 CoreFoundation 0x7ff80d623478 CFRUNLOOP_IS_CALLING_OUT_TO_A_TIMER_CALLBACK_FUNCTION + 20
4 CoreFoundation 0x7ff80d622ff3 __CFRunLoopDoTimer + 807
5 CoreFoundation 0x7ff80d622c19 __CFRunLoopDoTimers + 285
6 CoreFoundation 0x7ff80d608f79 __CFRunLoopRun + 2206
7 CoreFoundation 0x7ff80d608071 CFRunLoopRunSpecific + 560
8 HIToolbox 0x7ff817070fcd RunCurrentEventLoopInMode + 292
9 HIToolbox 0x7ff817070dde ReceiveNextEventCommon + 657
10 HIToolbox 0x7ff817070b38 _BlockUntilNextEventMatchingListInModeWithFilter + 64
11 AppKit 0x7ff81069a7a0 _DPSNextEvent + 858
12 AppKit 0x7ff81069964a -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 1214
13 Logic Pro X 0x10a29885d 0x108a75000 + 25311325
14 MAToolKit 0x1117f0e37 0x1116ec000 + 1068599
15 MAToolKit 0x1117f64ae 0x1116ec000 + 1090734
16 AppKit 0x7ff8108864b1 -[NSWindow(NSEventRouting) _handleMouseDownEvent:isDelayedEvent:] + 4330
17 AppKit 0x7ff8107fdcef -[NSWindow(NSEventRouting) _reallySendEvent:isDelayedEvent:] + 404
18 AppKit 0x7ff8107fd93f -[NSWindow(NSEventRouting) sendEvent:] + 345
19 Logic Pro X 0x108ebf486 0x108a75000 + 4498566
20 AppKit 0x7ff8107fc319 -[NSApplication(NSEvent) sendEvent:] + 345
21 Logic Pro X 0x10a2995f4 0x108a75000 + 25314804
22 Logic Pro X 0x10a2990c9 0x108a75000 + 25313481
23 Logic Pro X 0x10a29337f 0x108a75000 + 25289599
24 Logic Pro X 0x10a29962e 0x108a75000 + 25314862
25 Logic Pro X 0x10a2990c9 0x108a75000 + 25313481
26 AppKit 0x7ff810ab6bbe -[NSApplication _handleEvent:] + 65
27 AppKit 0x7ff81068bcdd -[NSApplication run] + 623
28 AppKit 0x7ff81065fed2 NSApplicationMain + 817
29 Logic Pro X 0x10956565d 0x108a75000 + 11470429
30 dyld 0x7ff80d1d441f start + 1903
Thread 1:: caulk.messenger.shared:17
0 libsystem_kernel.dylib 0x7ff80d4ef52e semaphore_wait_trap + 10
1 caulk 0x7ff816da707e caulk::semaphore::timed_wait(double) + 150
2 caulk 0x7ff816da6f9c caulk::concurrent::details::worker_thread::run() + 30
3 caulk 0x7ff816da6cb0 void* caulk::thread_proxy<std::__1::tuple<caulk::thread::attributes, void (caulk::concurrent::details::worker_thread::*)(), std::__1::tuple<caulk::concurrent::details::worker_thread*>>>(void*) + 41
4 libsystem_pthread.dylib 0x7ff80d52e1d3 _pthread_start + 125
5 libsystem_pthread.dylib 0x7ff80d529bd3 thread_start + 15
Thread 2:: com.apple.NSEventThread
0 libsystem_kernel.dylib 0x7ff80d4ef5b2 mach_msg2_trap + 10
1 libsystem_kernel.dylib 0x7ff80d4fd72d mach_msg2_internal + 78
2 libsystem_kernel.dylib 0x7ff80d4f65e4 mach_msg_overwrite + 692
3 libsystem_kernel.dylib 0x7ff80d4ef89a mach_msg + 19
4 SkyLight 0x7ff81219f7ac CGSSnarfAndDispatchDatagrams + 160
5 SkyLight 0x7ff8124b8cfd SLSGetNextEventRecordInternal + 284
6 SkyLight 0x7ff8122d8360 SLEventCreateNextEvent + 9
7 HIToolbox 0x7ff81707bfea PullEventsFromWindowServerOnConnection(unsigned int, unsigned char, __CFMachPortBoost*) + 45
8 HIToolbox 0x7ff81707bf8b MessageHandler(__CFMachPort*, void*, long, void*) + 48
9 CoreFoundation 0x7ff80d637e66 __CFMachPortPerform + 244
10 CoreFoundation 0x7ff80d60a5a3 CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE1_PERFORM_FUNCTION + 41
11 CoreFoundation 0x7ff80d60a4e3 __CFRunLoopDoSource1 + 540
12 CoreFoundation 0x7ff80d609161 __CFRunLoopRun + 2694
13 CoreFoundation 0x7ff80d608071 CFRunLoopRunSpecific + 560
14 AppKit 0x7ff8107fa909 _NSEventThread + 132
15 libsystem_pthread.dylib 0x7ff80d52e1d3 _pthread_start + 125
16 libsystem_pthread.dylib 0x7ff80d529bd3 thread_start + 15
Post not yet marked as solved
To my knowledge, you can use the AudioUnitSetProperty function to set the kAUVoiceIOProperty_VoiceProcessingEnableAGC property to disable AGC in AUv2.
However, there is no equivalent functionality available in AUv3. The closest option I found is the shouldBypassEffect property.
How can I disable AGC using the AUv3 API?
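For what it's worth, when voice processing is enabled through AVAudioEngine rather than a raw AUv3 instance, AVAudioInputNode exposes an AGC toggle. This is an adjacent API, not the AUv3 property itself, so it may not apply to every setup:
import AVFoundation

// Sketch: AVAudioEngine's voice-processing input node has an AGC switch.
let engine = AVAudioEngine()
try engine.inputNode.setVoiceProcessingEnabled(true)
engine.inputNode.isVoiceProcessingAGCEnabled = false // disable automatic gain control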
Post not yet marked as solved
I'm building an audio streamer app for macOS to record system audio and other inputs/outputs, and I'm seeking guidance on which frameworks and APIs to use, as well as suggestions for capturing and saving audio data efficiently. Any recommendations or insights?
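For the input-device side, this is the kind of minimal AVAudioEngine sketch I have in mind: tap the default input and append buffers to a file (capturing system output audio needs a separate mechanism and is the part I'm unsure about):
import AVFoundation

// Sketch: tap the input node and write each buffer to a CAF file.
let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)
let url = FileManager.default.temporaryDirectory.appendingPathComponent("capture.caf")
let file = try AVAudioFile(forWriting: url, settings: format.settings)

input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    // The tap callback runs on an audio-owned thread; keep the work minimal.
    try? file.write(from: buffer)
}
try engine.start()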
Post not yet marked as solved
How can I do FFT processing in the AudioQueue output callback (AudioQueueOutputCallback)?
Something like this function:
static void audioQueueOutputCallBack(void *input, AudioQueueRef inQueue, AudioQueueBufferRef outQueueBuffer)
{
    SYAudioQueue *aq = (__bridge SYAudioQueue *)input;
    dispatch_semaphore_wait(aq->m_mutex, DISPATCH_TIME_FOREVER);
    [aq enterQueue:inQueue withBuffer:outQueueBuffer];
    dispatch_semaphore_signal(aq->m_mutex);
}
I know that AVAudioEngine can do FFT processing on an AVAudioPCMBuffer.
Alternatively, how can I convert an AudioQueueBufferRef to an AVAudioPCMBuffer?
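In case it helps frame an answer, here is a sketch in Swift of the conversion I have in mind; the format values (44.1 kHz, 16-bit interleaved stereo) are assumptions that would have to match the queue's actual ASBD:
import AVFoundation

// Sketch: copy an AudioQueueBufferRef's bytes into an AVAudioPCMBuffer for FFT work.
func pcmBuffer(from queueBuffer: AudioQueueBufferRef) -> AVAudioPCMBuffer? {
    guard let format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                     sampleRate: 44100,
                                     channels: 2,
                                     interleaved: true) else { return nil }
    let bytesPerFrame = Int(format.streamDescription.pointee.mBytesPerFrame)
    let byteCount = Int(queueBuffer.pointee.mAudioDataByteSize)
    let frameCount = AVAudioFrameCount(byteCount / bytesPerFrame)
    guard let pcm = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return nil }
    pcm.frameLength = frameCount
    // For an interleaved format, all samples live behind channel pointer 0.
    memcpy(pcm.int16ChannelData![0], queueBuffer.pointee.mAudioData, byteCount)
    return pcm
}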