Unstable FPS with Remote I/O Audio Unit on iPhone 7

Hi!


[Sorry, posting again. I was unable to reply in the original post due to a forum internal error.]


We have a strange problem with sound playback in our game. We use a single Remote I/O audio unit for sound output and mix all the sounds ourselves for cross-platform reasons.


We then discovered strange FPS spikes when the game renders an almost static frame on iPhone 7 with iOS 10.1.1. If we disable sound completely (no audio units created at all), the spikes disappear. After that we re-enabled sound but removed all the code from the unit's render callback, leaving only a memset to fill the buffers with silence:


  AudioBuffer *pAudioUnitBuffer0 = &ioData->mBuffers[0];
  AudioBuffer *pAudioUnitBuffer1 = &ioData->mBuffers[1];
  int Size = pAudioUnitBuffer0->mDataByteSize;
  memset(pAudioUnitBuffer0->mData, 0, Size);
  memset(pAudioUnitBuffer1->mData, 0, Size);
  return noErr;
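
For context, that snippet is the entire body of the unit's render callback. A minimal sketch of the surrounding callback (the function name is ours, not from the original code) looks like this:


  static OSStatus SilenceRenderCallback(void *inRefCon,
                                        AudioUnitRenderActionFlags *ioActionFlags,
                                        const AudioTimeStamp *inTimeStamp,
                                        UInt32 inBusNumber,
                                        UInt32 inNumberFrames,
                                        AudioBufferList *ioData)
  {
    // Fill both non-interleaved channel buffers with silence.
    AudioBuffer *pAudioUnitBuffer0 = &ioData->mBuffers[0];
    AudioBuffer *pAudioUnitBuffer1 = &ioData->mBuffers[1];
    int Size = pAudioUnitBuffer0->mDataByteSize;
    memset(pAudioUnitBuffer0->mData, 0, Size);
    memset(pAudioUnitBuffer1->mData, 0, Size);
    return noErr;
  }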


The spikes appeared again. Then we disabled the audio unit and instead added code to play a long single buffer with OpenAL: no spikes!

How is it possible that a high-level API gives less overhead than the lowest-level API, the Remote I/O Audio Unit?


For us, the bug reproduces only on an iPhone 7 running iOS 10.1.1. We're not sure yet whether this is an iOS 10.1.1 bug or an iPhone 7 device bug.

Here is the Remote I/O stream format we use:


  AudioStreamBasicDescription FormatDesc;
  memset(&FormatDesc, 0, sizeof(FormatDesc));
  FormatDesc.mFormatID = kAudioFormatLinearPCM;
  FormatDesc.mFormatFlags = 0;
  FormatDesc.mSampleRate = Freq;
  FormatDesc.mChannelsPerFrame = 2;
  FormatDesc.mBitsPerChannel = 32;
  FormatDesc.mFramesPerPacket = 1;
  // Non-interleaved: bytes per packet/frame describe a single channel's samples.
  FormatDesc.mBytesPerPacket = FormatDesc.mBitsPerChannel / 8;
  FormatDesc.mBytesPerFrame = FormatDesc.mBitsPerChannel / 8;
  // "We're back to non-interleaved 32-bit floats on both platforms"
  // http://asciiwwdc.com/2014/sessions/501
  FormatDesc.mFormatFlags |= kAudioFormatFlagIsFloat;
  FormatDesc.mFormatFlags |= kLinearPCMFormatFlagIsNonInterleaved;
  FormatDesc.mFormatFlags |= kAudioFormatFlagsNativeEndian;
  FormatDesc.mFormatFlags |= kAudioFormatFlagIsPacked;
  if (AudioUnitSetProperty(
      m_AudioUnit,
      kAudioUnitProperty_StreamFormat,
      kAudioUnitScope_Input,
      0,
      (void *) &FormatDesc,
      sizeof(AudioStreamBasicDescription)) != noErr)

...
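
The rest is the usual Remote I/O boilerplate, roughly along these lines (a sketch for illustration, not the exact code):


  // Sketch of the surrounding Remote I/O setup (illustrative only).
  AudioComponentDescription Desc;
  memset(&Desc, 0, sizeof(Desc));
  Desc.componentType = kAudioUnitType_Output;
  Desc.componentSubType = kAudioUnitSubType_RemoteIO;
  Desc.componentManufacturer = kAudioUnitManufacturer_Apple;

  AudioComponent Comp = AudioComponentFindNext(NULL, &Desc);
  AudioComponentInstanceNew(Comp, &m_AudioUnit);

  // Attach the render callback to the output element (bus 0).
  AURenderCallbackStruct Callback;
  Callback.inputProc = SilenceRenderCallback;
  Callback.inputProcRefCon = NULL;
  AudioUnitSetProperty(m_AudioUnit,
                       kAudioUnitProperty_SetRenderCallback,
                       kAudioUnitScope_Input,
                       0,
                       &Callback,
                       sizeof(Callback));

  AudioUnitInitialize(m_AudioUnit);
  AudioOutputUnitStart(m_AudioUnit);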

Andrew.

Replies

What buffer duration are you requesting in your AudioSession setup (setPreferredIOBufferDuration)? Are you requesting a preferred AudioSession or RemoteIO sample rate different from that of the hardware? (Newer iOS devices seem to support hardware rates of 48k sps, older devices 44.1k sps.)

Hi!


Thanks for reply!


We used 4096 / 48000 (about 85 ms) as the setPreferredIOBufferDuration argument; we almost fixed the bug when we changed it to 512 / 48000 (about 10.7 ms). We always use the native sample rate (hoping that there is no conversion at all anywhere inside Core Audio). The FPS spikes were visible even when we rendered a black screen with OpenGL and a simple FPS graph.
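
For reference, those session preferences are set along these lines (a sketch; error handling omitted):


  #import <AVFoundation/AVFoundation.h>

  // Sketch: request the native sample rate and a short I/O buffer duration
  // (values mirror the ones mentioned above).
  AVAudioSession *Session = [AVAudioSession sharedInstance];
  [Session setPreferredSampleRate:48000.0 error:nil];
  [Session setPreferredIOBufferDuration:512.0 / 48000.0 error:nil];
  [Session setActive:YES error:nil];
  // The values actually granted can be read back from
  // Session.sampleRate and Session.IOBufferDuration.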


Andrew.

Hi!


I have a similar problem to yours. There are spikes on iPhone 7 devices, but other iPhone devices are normal. Could you please tell me how you finally solved this problem?

I'm also getting weird Audio Unit issues with my new app, again only on the iPhone 7 with iOS 10.


In our case, the audio recorded from the mic input comes out at fewer bytes per render callback than on other devices, and the CPU usage is excessive. This both distorts sound playback (the app is a voice chat application) and ruins the app's responsiveness.


I cannot find any other references to this issue anywhere. The code is very standard Audio Unit stuff: kAudioUnitSubType_VoiceProcessingIO, sample rate 44100, kAudioUnitProperty_MaximumFramesPerSlice set to 4096, and a setPreferredIOBufferDuration of 50 ms (0.05 seconds). A sketch of that setup follows below.
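
Roughly (assumed code, not necessarily the app's exact implementation):


  // Sketch of the Voice Processing I/O setup described above (illustrative only).
  AudioComponentDescription desc = {0};
  desc.componentType = kAudioUnitType_Output;
  desc.componentSubType = kAudioUnitSubType_VoiceProcessingIO;
  desc.componentManufacturer = kAudioUnitManufacturer_Apple;

  AudioUnit vpioUnit = NULL;
  AudioComponentInstanceNew(AudioComponentFindNext(NULL, &desc), &vpioUnit);

  UInt32 maxFrames = 4096;
  AudioUnitSetProperty(vpioUnit,
                       kAudioUnitProperty_MaximumFramesPerSlice,
                       kAudioUnitScope_Global,
                       0,
                       &maxFrames,
                       sizeof(maxFrames));

  [[AVAudioSession sharedInstance] setPreferredIOBufferDuration:0.05 error:nil];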


Here is the audio format:


  audioFormat.mSampleRate = 44100;
  audioFormat.mFormatID = kAudioFormatLinearPCM;
  audioFormat.mFormatFlags = kAudioFormatFlagsNativeFloatPacked | kAudioFormatFlagIsNonInterleaved;
  audioFormat.mBytesPerPacket = sizeof(Float32);
  audioFormat.mFramesPerPacket = 1;
  audioFormat.mBytesPerFrame = sizeof(Float32);
  audioFormat.mChannelsPerFrame = 1;
  audioFormat.mBitsPerChannel = sizeof(Float32) * 8;


Simon.

The native hardware sample rate on an iPhone 7 (and 6s) appears to be 48000 sps. So you may be seeing the effects of a resampler between the ADC and your audio buffers. As you've found, hardware buffers at 48000 sps become shorter render callback buffers after resampling to 44100 sps.


One solution might be to run the Audio Unit at 48000, and, if needed, resample in your own app in another thread using a DSP library.
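
A quick way to check what the hardware is doing, and to see why the 44.1 kHz callbacks come out shorter, might look like this (a sketch; names are illustrative):


  // Sketch: compare the hardware rate with the client format's rate.
  double hwRate = [AVAudioSession sharedInstance].sampleRate;      // typically 48000 on iPhone 7
  double ioDuration = [AVAudioSession sharedInstance].IOBufferDuration;
  double hwFrames = hwRate * ioDuration;                           // frames per hardware buffer
  double clientFrames = hwFrames * (44100.0 / hwRate);             // frames a 44.1 kHz callback sees
  NSLog(@"hardware: %.0f frames, client: %.0f frames per callback", hwFrames, clientFrames);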