Buffer Size while iOS screen is off?

I understand that the buffer size goes up when the screen goes black on an iOS device, and that the reason is to save energy when there is no need for low latency because there is no user interaction. However, I need low latency even when the screen is locked. Is there some API for that I am missing? I cannot find it anywhere. Example use: a guitar effects app that takes the guitar input and live-monitors it to headphones. If the device screen goes dark, you get ~88 ms of latency.

Accepted Reply

Sounds like you are referencing the discussion from Q&A 1606 < https://developer.apple.com/library/ios/qa/qa1606/_index.html >


This Q&A is talking about the "default" render size values when only output is running. This I/O render size change is not made when you have input running, so it's probably not something you would have noticed in the specific use case mentioned.


As for API, use AVAudioSession's setPreferredIOBufferDuration. When it is set, your preference should be honored even when only output is running. For example, using one of our AUGraph playback-only samples (configured with the audio background key), rendering at 44.1 kHz with a 5 ms buffer duration, the render callback continued to receive requests for 256 frames regardless of screen lock or foreground/background state. When that preference is not set (thereby using the defaults), the same sample showed 1024 frames, changing to 4096 frames as discussed in the Q&A above.
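
For reference, here is a minimal sketch of that session setup (assuming a playback-only app with the audio background key and the 44.1 kHz / 5 ms values from the example above; this is a sketch, not the actual sample code):

    import AVFoundation

    // Request a ~5 ms I/O buffer (~256 frames at 44.1 kHz) before audio I/O starts.
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(AVAudioSessionCategoryPlayback)
        try session.setPreferredSampleRate(44100.0)
        try session.setPreferredIOBufferDuration(0.005)
        try session.setActive(true)
    } catch {
        print("AVAudioSession setup failed: \(error)")
    }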


You should also check out the Q&As about kAudioUnitProperty_MaximumFramesPerSlice < https://developer.apple.com/library/mac/qa/qa1533/_index.html > and about setting AVAudioSession preferences < https://developer.apple.com/library/ios/qa/qa1631/_index.html >.
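
As a rough illustration of the QA1533 point (again only a sketch; renderUnit is a placeholder for whatever AudioUnit sits at the head of your render chain):

    import AudioToolbox

    // QA1533: raise the unit's maximum frames per slice to 4096 so rendering keeps
    // working if the system switches to the larger screen-locked I/O buffer size.
    var maxFrames = UInt32(4096)
    let status = AudioUnitSetProperty(renderUnit,
                                      kAudioUnitProperty_MaximumFramesPerSlice,
                                      kAudioUnitScope_Global,
                                      0,
                                      &maxFrames,
                                      UInt32(sizeof(UInt32)))
    if status != noErr {
        print("Could not set kAudioUnitProperty_MaximumFramesPerSlice: \(status)")
    }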

Replies

Hello,


I am having a similar problem. I want to configure a small buffer size (e.g. 256 frames, roughly 0.005 s), but iOS won't let me. No matter what I do, it sets the outputNode buffer size to 4096. And I am not even locking the screen.


I am setting up the session like this:


  import AVFoundation
  import AudioToolbox

  private let engine = AVAudioEngine()


In init() of my custom class:

        do {
            // Ask for play-and-record and a 5 ms I/O buffer before starting the engine.
            try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord)
            try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(0.005)

            // Read back kAudioUnitProperty_MaximumFramesPerSlice from the output unit.
            var result: OSStatus
            var framesPerSlice = UInt32(0)
            var size = UInt32(sizeof(UInt32))
            result = AudioUnitGetProperty(engine.outputNode.audioUnit,
                                          kAudioUnitProperty_MaximumFramesPerSlice,
                                          kAudioUnitScope_Global,
                                          0,
                                          &framesPerSlice,
                                          &size)
            if result != noErr {
                print("AudioUnitGetProperty for outputNode, result=\(result)")
            }
            print("outputNode bufferSize=\(framesPerSlice)")
        }
        catch {
            fatalError("Cannot set Audio Session category or buffer duration")
        }


This is the console output:

AVAudioSession buffer duration=0.0232199542224407 outputLatency=0.013650793582201
outputNode bufferSize=4096



Previously I tried AVAudioSessionCategoryPlayback instead, but there was no change.

Any idea why the buffer size is so stubborn?


I need to play sounds using AVAudioUnitSampler, so there is no need for recording. However, in the code above I picked AVAudioSessionCategoryPlayAndRecord because I have read that with output only, the system sets its buffer size to 1024. But I need to go as low as 256.
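

For reference, this is roughly the output-only setup I am aiming for (just a sketch, assuming the session is activated before the engine starts and the hardware runs at 44.1 kHz):

        do {
            let session = AVAudioSession.sharedInstance()
            // Output only: playback category plus a ~5 ms (about 256 frame) buffer request.
            try session.setCategory(AVAudioSessionCategoryPlayback)
            try session.setPreferredIOBufferDuration(0.005)
            try session.setActive(true)
            // The duration actually granted is reported by the session; the outputNode's
            // kAudioUnitProperty_MaximumFramesPerSlice is only an upper bound per render.
            print("granted IO buffer duration=\(session.IOBufferDuration)")
        }
        catch {
            print("AVAudioSession setup failed: \(error)")
        }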