Posts

Post not yet marked as solved · 1 reply · 1.4k views
I need to get the microphone output in a format that differs from the hardware format. To do this, I'm creating an AVAudioMixerNode that will have the desired format as its output. However, I never receive any buffers after installing a tap on the mixer node. I thought audio from the inputNode would flow on to the mixer? Am I doing something wrong? Note that I'm not using AudioUnits or AudioQueues directly because I need to do some frequency filtering on the actual audio stream, and this seemed like the easiest way to do it.

Here's the code:

mixerNode = [[AVAudioMixerNode alloc] init];

// Attach the node to the engine
[theEngine attachNode:mixerNode];

// Then connect the inputNode (mic) to the mixer node
[theEngine connect:theEngine.inputNode
                to:mixerNode
            format:[theEngine.inputNode outputFormatForBus:0]];

[theEngine startAndReturnError:&theError];

// Now set up the actual audio format I want
AudioStreamBasicDescription audioFormat;
audioFormat.mSampleRate       = 8000;
audioFormat.mChannelsPerFrame = numberOfChannels;
audioFormat.mFormatID         = kAudioFormatLinearPCM;
audioFormat.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger;
audioFormat.mBitsPerChannel   = 16;
audioFormat.mBytesPerPacket   = audioFormat.mBytesPerFrame =
    (audioFormat.mBitsPerChannel / 8) * audioFormat.mChannelsPerFrame;
audioFormat.mFramesPerPacket  = 1;

// Now install the tap on the mixer so we get buffers in that format
[mixerNode installTapOnBus:0
                bufferSize:4096
                    format:[[AVAudioFormat alloc] initWithStreamDescription:&audioFormat]
                     block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
    NSLog(@"got buff");
}];
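For reference, here is a minimal sketch of the alternative I'm considering: tap the mixer in its own output format (as far as I can tell, a tap's format can't change the sample rate of a connected bus) and convert each buffer to 8 kHz Int16 with an AVAudioConverter. This assumes mono, interleaved output; the names are illustrative.

AVAudioFormat *tapFormat = [mixerNode outputFormatForBus:0];
AVAudioFormat *targetFormat = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatInt16
                                                               sampleRate:8000.0
                                                                 channels:1
                                                              interleaved:YES];
AVAudioConverter *converter = [[AVAudioConverter alloc] initFromFormat:tapFormat
                                                              toFormat:targetFormat];

[mixerNode installTapOnBus:0
                bufferSize:4096
                    format:tapFormat   // match the bus format; convert afterwards
                     block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
    // Size the output buffer for the sample-rate ratio (8000 / hardware rate)
    AVAudioFrameCount capacity = (AVAudioFrameCount)
        (buffer.frameLength * targetFormat.sampleRate / tapFormat.sampleRate) + 1;
    AVAudioPCMBuffer *converted = [[AVAudioPCMBuffer alloc] initWithPCMFormat:targetFormat
                                                                frameCapacity:capacity];
    __block BOOL consumed = NO;
    NSError *error = nil;
    [converter convertToBuffer:converted
                         error:&error
            withInputFromBlock:^AVAudioBuffer * _Nullable(AVAudioPacketCount inNumberOfPackets,
                                                          AVAudioConverterInputStatus * _Nonnull outStatus) {
        // Hand the tap buffer to the converter exactly once per callback
        if (consumed) { *outStatus = AVAudioConverterInputStatus_NoDataNow; return nil; }
        consumed = YES;
        *outStatus = AVAudioConverterInputStatus_HaveData;
        return buffer;
    }];
    // `converted` now holds 8 kHz, 16-bit samples (check `error` in real code)
}];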
Posted by jml5qh.
Post not yet marked as solved · 1 reply · 1.8k views
I am trying to use AudioUnits to take the mic input, convert it to a different sample rate, and filter out any frequencies above 4000 Hz. To do this, I set up the following chain:

RemoteIO -> convert to the effect's format -> filter frequencies using kAudioUnitSubType_LowPassFilter -> convert to the final format

I will then take the output and send it as a network stream over RTP. Because of this, I want to use the render callback. My confusion is: what do I set the callback on, and what do I call AudioUnitRender on? From what I can tell, I have to set the callback on the RemoteIO unit. But then, do I call AudioUnitRender on that RemoteIO unit, or on one of the other units? When I call it on the other units, I get errors. When I call it on the RemoteIO unit, I do get audio, but it's not clear to me whether anything is actually being filtered.

Here is my code:

// This is the component for the mic input
AudioComponentDescription desc;
desc.componentType         = kAudioUnitType_Output;
desc.componentSubType      = kAudioUnitSubType_RemoteIO;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
desc.componentFlags        = 0;
desc.componentFlagsMask    = 0;
AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc);
CheckError(AudioComponentInstanceNew(inputComponent, &theAudioUnit), "Instance AU");

// Enable input on element 1 (the mic)
UInt32 flag = 1;
CheckError(AudioUnitSetProperty(theAudioUnit, kAudioOutputUnitProperty_EnableIO,
                                kAudioUnitScope_Input, 1, &flag, sizeof(flag)),
           "EnableIO");

// This is the final audio format I will use
AudioStreamBasicDescription audioFormat;
audioFormat.mSampleRate       = 8000;   // need a sample rate of 8000
audioFormat.mChannelsPerFrame = numberOfChannels;
audioFormat.mFormatID         = kAudioFormatLinearPCM;
audioFormat.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger;
audioFormat.mBitsPerChannel   = 16;
audioFormat.mBytesPerPacket   = audioFormat.mBytesPerFrame =
    (audioFormat.mBitsPerChannel / 8) * audioFormat.mChannelsPerFrame;
audioFormat.mFramesPerPacket  = 1;

// Here is the effect filter for frequencies above 4000 Hz
AudioComponentDescription lowPass;
lowPass.componentType         = kAudioUnitType_Effect;
lowPass.componentSubType      = kAudioUnitSubType_LowPassFilter;
lowPass.componentManufacturer = kAudioUnitManufacturer_Apple;
lowPass.componentFlags        = 0;
lowPass.componentFlagsMask    = 0;

// This is the converter from mic input to the effect format
AudioComponentDescription converter;
converter.componentType         = kAudioUnitType_FormatConverter;
converter.componentSubType      = kAudioUnitSubType_AUConverter;
converter.componentManufacturer = kAudioUnitManufacturer_Apple;
converter.componentFlags        = 0;
converter.componentFlagsMask    = 0;

AudioComponent converterComponent = AudioComponentFindNext(nil, &converter);
CheckError(AudioComponentInstanceNew(converterComponent, &converterUnit), "Converter inst");

AudioComponent effectComponent = AudioComponentFindNext(nil, &lowPass);
CheckError(AudioComponentInstanceNew(effectComponent, &lowPassUnit), "Effect inst");

// Here is the audio format from the mic
AudioStreamBasicDescription theFormat;
memset(&theFormat, 0, sizeof(theFormat));
UInt32 sizeofDesc = sizeof(AudioStreamBasicDescription);
CheckError(AudioUnitGetProperty(theAudioUnit, kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Output, 1, &theFormat, &sizeofDesc),
           "Get IO format");

// Now I set it as the input format for the converter
CheckError(AudioUnitSetProperty(converterUnit, kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Input, 0, &theFormat,
                                sizeof(AudioStreamBasicDescription)),
           "Set converter in format");

// Here is the audio format from the effect
AudioStreamBasicDescription effectFormat;
memset(&effectFormat, 0, sizeof(effectFormat));
CheckError(AudioUnitGetProperty(lowPassUnit, kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Input, 0, &effectFormat, &sizeofDesc),
           "Get low-pass format");

// Now I set that as the output format for the converter
CheckError(AudioUnitSetProperty(converterUnit, kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Output, 0, &effectFormat,
                                sizeof(AudioStreamBasicDescription)),
           "Set converter out format");

// Here I connect the mic input (element 1's output) to the converter
AudioUnitConnection connection;
connection.sourceAudioUnit    = theAudioUnit;
connection.sourceOutputNumber = 1;
connection.destInputNumber    = 0;
CheckError(AudioUnitSetProperty(converterUnit, kAudioUnitProperty_MakeConnection,
                                kAudioUnitScope_Input, 0, &connection,
                                sizeof(AudioUnitConnection)),
           "Make conn audio to converter");

// Here I connect the converter to the effect filter
AudioUnitConnection lowPassConnection;
lowPassConnection.sourceAudioUnit    = converterUnit;
lowPassConnection.sourceOutputNumber = 0;
lowPassConnection.destInputNumber    = 0;
CheckError(AudioUnitSetProperty(lowPassUnit, kAudioUnitProperty_MakeConnection,
                                kAudioUnitScope_Input, 0, &lowPassConnection,
                                sizeof(AudioUnitConnection)),
           "Make conn converter to low pass");

// Here I set the frequency cutoff (the last argument is a buffer offset in frames)
Float32 lowPassFreq = 3999;
CheckError(AudioUnitSetParameter(lowPassUnit, kLowPassParam_CutoffFrequency,
                                 kAudioUnitScope_Global, 0, lowPassFreq, 0),
           "Set low-pass freq");

// Here is the converter from the filter to the final output format I need
AudioComponentDescription outConverter;
outConverter.componentType         = kAudioUnitType_FormatConverter;
outConverter.componentSubType      = kAudioUnitSubType_AUConverter;
outConverter.componentManufacturer = kAudioUnitManufacturer_Apple;
outConverter.componentFlags        = 0;
outConverter.componentFlagsMask    = 0;
AudioComponent outConverterComponent = AudioComponentFindNext(nil, &outConverter);
CheckError(AudioComponentInstanceNew(outConverterComponent, &outConverterUnit), "Out converter inst");

// Set the input of this to be the effect format
CheckError(AudioUnitSetProperty(outConverterUnit, kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Input, 0, &effectFormat,
                                sizeof(AudioStreamBasicDescription)),
           "Out converter set input format");

// Set the output of this to be the final audio format I need
CheckError(AudioUnitSetProperty(outConverterUnit, kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Output, 0, &audioFormat,
                                sizeof(AudioStreamBasicDescription)),
           "Out converter output format");

// Set the connection between the effect and the final converter
AudioUnitConnection outConnection;
outConnection.sourceAudioUnit    = lowPassUnit;
outConnection.sourceOutputNumber = 0;
outConnection.destInputNumber    = 0;
CheckError(AudioUnitSetProperty(outConverterUnit, kAudioUnitProperty_MakeConnection,
                                kAudioUnitScope_Input, 0, &outConnection,
                                sizeof(AudioUnitConnection)),
           "Make connection low pass to out converter");

// Initialize everything
AudioUnitInitialize(theAudioUnit);
AudioUnitInitialize(converterUnit);
AudioUnitInitialize(lowPassUnit);
AudioUnitInitialize(outConverterUnit);

// I need a render callback, so I set it on the mic input
AURenderCallbackStruct callbackStruct;
callbackStruct.inputProc       = recordingCallback;
callbackStruct.inputProcRefCon = (__bridge void * _Nullable)(self);
CheckError(AudioUnitSetProperty(theAudioUnit, kAudioOutputUnitProperty_SetInputCallback,
                                kAudioUnitScope_Global, 1, &callbackStruct,
                                sizeof(AURenderCallbackStruct)),
           "Set input callback");

// And start the mic input
CheckError(AudioOutputUnitStart(theAudioUnit), "Audio unit start");

And here is the render callback:

AudioBufferList bufferList;
bufferList.mNumberBuffers    = 1;
bufferList.mBuffers[0].mData = NULL;
NSLog(@"Number frames: %u, bus number: %u", (unsigned int)inNumberFrames, (unsigned int)inBusNumber);

// This is where I'm confused. What should the unit be here? The RemoteIO unit? A converter unit?
OSStatus status = AudioUnitRender(theAudioUnit, ioActionFlags, inTimeStamp, 1, inNumberFrames, &bufferList);
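For reference, here is a sketch of the pull model as I currently understand it, which is exactly the part I'm unsure about: the input callback only signals that mic data is available, and calling AudioUnitRender on the last unit in the chain (outConverterUnit, output element 0) should pull audio back through the low-pass filter and both converters. This assumes outConverterUnit is visible from the callback; it is a sketch, not a verified fix.

// Assumed visible here, set up as in the code above
static AudioUnit outConverterUnit;

static OSStatus recordingCallback(void                       *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp       *inTimeStamp,
                                  UInt32                      inBusNumber,
                                  UInt32                      inNumberFrames,
                                  AudioBufferList            *ioData)
{
    AudioBufferList bufferList;
    bufferList.mNumberBuffers              = 1;
    bufferList.mBuffers[0].mNumberChannels = 1;
    bufferList.mBuffers[0].mDataByteSize   = 0;
    bufferList.mBuffers[0].mData           = NULL; // let the unit supply its own buffer

    // Pull from the end of the chain on its output element (0); the request
    // propagates back through lowPassUnit and converterUnit to the mic input.
    // Note: inNumberFrames is at the mic's sample rate, so the frame count
    // may need scaling when the output rate (8 kHz) differs.
    return AudioUnitRender(outConverterUnit, ioActionFlags,
                           inTimeStamp, 0, inNumberFrames, &bufferList);
}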
Posted by jml5qh.
Post not yet marked as solved · 1 reply · 901 views
I was working with some Apple engineers today, but we ran out of time in the session. They asked that I post the feedback number with some information: FB7763992. We were trying to figure out why I was seeing slow incremental compilation. In our testing, we found 1.4 GB of .d files; the Apple engineer believed that processing these files was slowing down the Xcode build process.
Posted by jml5qh.
Post not yet marked as solved · 1 reply · 381 views
I was working with two Apple Engineers during the labs today but ran out of time. They asked that I post this Feedback number with some information from our session. They believed there was an Xcode issue where skipped compile jobs were showing up incorrectly in the build output.
Posted by jml5qh.
Post not yet marked as solved · 1 reply · 504 views
On all iOS 13 versions, I'm able to execute a "Restricted While Locked" Siri Shortcut even while the device is locked. Is this a known issue in iOS 13? I added a sample project to FB7381961 to demonstrate this issue.
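For reference, here is how the intent is marked as restricted in the Intents extension's Info.plist; the intent class name below is illustrative, not from my sample project:

<key>NSExtension</key>
<dict>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>IntentsSupported</key>
        <array>
            <string>MyOrderIntent</string>
        </array>
        <key>IntentsRestrictedWhileLocked</key>
        <array>
            <!-- Intents listed here should require an unlocked device -->
            <string>MyOrderIntent</string>
        </array>
    </dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.intents-service</string>
    <key>NSExtensionPrincipalClass</key>
    <string>IntentHandler</string>
</dict>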
Posted by jml5qh.
Post not yet marked as solved · 3 replies · 3.2k views
According to https://developer.apple.com/videos/play/wwdc2019/705/, apps that want to support region monitoring should request Always permission on iOS 13. The user will initially be prompted only to choose "When in Use"; they should then receive a second alert once the device crosses a region, confirming that they want the app to use their location in the background. However, on all iOS 13 betas, I never get that dialog, which means our app never receives the region crossing.

To reproduce:
1. Request Always location permission on iOS 13.
2. Accept the prompt with "While in Use" and start monitoring a region.
3. Cross the region boundary with the device.
4. Expected behavior: you eventually get a dialog asking the user to allow the app to use their location in the background.
5. Current behavior: you never receive the dialog, and the app never receives the region-crossing delegate callbacks.

A stripped-down version of the setup is sketched below. I have attached a basic sample project in FB7216407. I am on the latest iOS 13 beta and Xcode 10.2.1.
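Minimal sketch of the monitoring setup described above (assumes self conforms to CLLocationManagerDelegate and keeps a strong reference to the manager; the coordinates and identifier are illustrative, not from the actual sample project):

#import <CoreLocation/CoreLocation.h>

- (void)startMonitoring {
    self.locationManager = [[CLLocationManager alloc] init];
    self.locationManager.delegate = self;

    // On iOS 13 this initially shows only the "While in Use" choice; the
    // Always upgrade prompt is supposed to arrive later, in the background.
    [self.locationManager requestAlwaysAuthorization];

    // Illustrative region
    CLLocationCoordinate2D center = CLLocationCoordinate2DMake(37.3349, -122.0090);
    CLCircularRegion *region = [[CLCircularRegion alloc] initWithCenter:center
                                                                 radius:100.0
                                                             identifier:@"TestRegion"];
    region.notifyOnEntry = YES;
    region.notifyOnExit  = YES;
    [self.locationManager startMonitoringForRegion:region];
}

// Expected after a crossing, but never called in my testing:
- (void)locationManager:(CLLocationManager *)manager didEnterRegion:(CLRegion *)region {
    NSLog(@"Entered region %@", region.identifier);
}

- (void)locationManager:(CLLocationManager *)manager didExitRegion:(CLRegion *)region {
    NSLog(@"Exited region %@", region.identifier);
}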
Posted by jml5qh.