I understand that the buffer size goes up when the screen goes black on an iOS device. I understand the reason for this is to save energy when there is no need for low latency because there is no user interaction. I need low latency even when the screen is locked. Is there some API for that I am missing? I cannot find it anywhere. Example use: a guitar effects app that feeds the guitar input through live monitoring to headphones. If the device screen goes dark you get ~88 ms of latency.
Sounds like you are referencing the discussion from Q&A 1606 < https://developer.apple.com/library/ios/qa/qa1606/_index.html >
This Q&A is talking about the "default" render size values when only output is running. This I/O render size change is not made when input is running, so it's probably not something you would have noticed in the specific use case mentioned.
As for API, use the AVAudioSession setPreferredIOBufferDuration preference. When it is set, your preference should be honored even when only output is running. For example, one of our AUGraph playback-only samples (configured with the audio background key), rendering at 44.1 kHz with a preferred buffer duration of 5 ms, showed the render callback continuing to receive requests for 256 frames regardless of screen lock or foreground/background state. When that preference is not set (thereby using the defaults), the same sample showed 1024 frames changing to 4096 frames as discussed in the Q&A above.
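In Swift terms, a minimal sketch of that setup might look like the following (the 5 ms figure comes from the example above; the category choice and error handling are placeholder choices for a play-and-record app):

    import AVFoundation

    // Minimal sketch: request a ~5 ms I/O buffer on the shared session.
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord)          // guitar in, headphones out
        try session.setPreferredIOBufferDuration(0.005)  // ~256 frames at 44.1 kHz
        try session.setActive(true)
        // The preference is a request, not a guarantee; read back the actual value.
        print("Actual I/O buffer duration: \(session.ioBufferDuration) s")
    } catch {
        print("Audio session configuration failed: \(error)")
    }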
You should also check out the Q&As about kAudioUnitProperty_MaximumFramesPerSlice < https://developer.apple.com/library/mac/qa/qa1533/_index.html > and setting AVAudioSession Preferences < https://developer.apple.com/library/ios/qa/qa1631/_index.html >
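For the first of those, here is a hedged Swift sketch of setting kAudioUnitProperty_MaximumFramesPerSlice on an already-created audio unit; the function name, the `ioUnit` parameter, and the 4096 figure (matching the screen-locked size discussed above) are illustrative:

    import AudioToolbox

    // Sketch: allow a unit to render up to 4096 frames per slice, the size
    // the system can switch to when the screen locks. Set this before
    // AudioUnitInitialize is called on the unit.
    func raiseMaximumFramesPerSlice(on ioUnit: AudioUnit,
                                    to frames: UInt32 = 4096) -> OSStatus {
        var maxFrames = frames
        return AudioUnitSetProperty(ioUnit,
                                    kAudioUnitProperty_MaximumFramesPerSlice,
                                    kAudioUnitScope_Global,
                                    0,
                                    &maxFrames,
                                    UInt32(MemoryLayout<UInt32>.size))
    }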