8 Replies
      Latest reply on May 22, 2020 7:01 PM by iami2
      DigitalAudioTom Level 1 (0 points)

        Dear all,

         

        I'm trying to get a full understanding of the AVAudioSession, AUGraph and AudioUnit classes in order to build clean and stable audio apps with precisely defined behaviours.

         

        I'm stuck right now on one point: input and output latency (more specifically, input latency). Basically, my questions are the following:

        1. Where do the latencies come from?

        2. On what parameters do they depend?

        3. How can I reduce them?

         

        For now, I have noticed that the audio session mode AVAudioSessionModeMeasurement results in very low latency, but also a very low input volume (and, I guess, less audio input processing), which isn't really usable for a music app.

         

        On an iPad Air 2, with the built-in microphone:

        - with AVAudioSessionModeMeasurement, I obtain an input latency of 0.1 ms!

        - with AVAudioSessionModeDefault, I obtain an input latency of 58 ms!
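
        For reference, here's roughly the setup I'm measuring with (a minimal sketch, error handling omitted; I read the latencies only after activating the session):

        ```swift
        import AVFoundation

        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord)
        try session.setMode(.measurement)   // swap in .default to compare
        try session.setActive(true)

        // Latency values are only meaningful once the session is active.
        print("input latency:  \(session.inputLatency * 1000) ms")
        print("output latency: \(session.outputLatency * 1000) ms")
        print("IO buffer:      \(session.ioBufferDuration * 1000) ms")
        ```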

         

        Any tips on my three questions?

         

        Thank you

         

        Thomas

        • Re: AVAudioSession: understanding and controlling Input/Output latency
          hotpaw2 Level 3 (105 points)

          There are likely audio filters and hardware sample buffers that affect latency, but these are opaque to the app and may differ between device hardware models. 

           

          The AVAudioSession preferredIOBufferDuration setting has an obvious effect on latency.  The actual RemoteIO buffer latency will often vary between foreground and background mode, and with whether any other audio apps are running.  Latency might also be larger if the RemoteIO buffer sample rate differs from the hardware sample rate.  Don't assume that 44.1 kHz is the hardware sample rate on the newest devices.

           

          You might want to measure the actual input-to-output latency (with an oscilloscope, etc.; some reports say 7 to 11 ms actual minimum) to see if and how the latency numbers you obtain correspond.
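
           

          For example, something like this (a sketch; the system may grant a different buffer duration than requested, so always read back the actual value):

          ```swift
          import AVFoundation

          let session = AVAudioSession.sharedInstance()
          // Ask for ~5 ms buffers; the hardware may round to a different frame count.
          try? session.setPreferredIOBufferDuration(0.005)
          try? session.setActive(true)

          // What you actually got, not what you asked for:
          print("granted IO buffer:    \(session.ioBufferDuration) s")
          print("hardware sample rate: \(session.sampleRate) Hz")
          ```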

            • Re: AVAudioSession: understanding and controlling Input/Output latency
              theanalogkid Apple Staff (610 points)

              AVAudioSessionModeMeasurement removes (or minimizes according to the header comments) all system-supplied signal processing for I/O which makes sense since an application wanting to do any type of measurement requires the cleanest signal. Different routes and modes will indeed change things so don't assume anything about the audio system.

               

              On the system, latency is measured by:

              • Audio Device I/O Buffer Frame Size + Output Safety Offset + Output Stream Latency + Output Device Latency

               

              If you're trying to calculate total roundtrip latency you can add:

              • Input Latency + Input Safety Offset to the above.


              The timestamp you see at the render proc accounts for the buffer frame size and the safety offset, but the stream and device latencies are not accounted for.

               

              iOS gives you access to the most important of the above information via AVAudioSession and as mentioned you can also use the "preferred" session settings - setPreferredIOBufferDuration and preferredIOBufferDuration for further control.


              /* The current hardware input latency in seconds. */

              @property(readonly) NSTimeInterval inputLatency  NS_AVAILABLE_IOS(6_0);


              /* The current hardware output latency in seconds. */

              @property(readonly) NSTimeInterval outputLatency  NS_AVAILABLE_IOS(6_0);


              /* The current hardware IO buffer duration in seconds. */

              @property(readonly) NSTimeInterval IOBufferDuration  NS_AVAILABLE_IOS(6_0);
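
               

              Putting those together, a rough round-trip estimate from the session looks like this (a sketch in Swift for brevity; the safety offsets are not exposed on iOS, so this will underestimate the true figure):

              ```swift
              import AVFoundation

              let session = AVAudioSession.sharedInstance()
              let roundTrip = session.inputLatency
                            + session.ioBufferDuration
                            + session.outputLatency
              print("estimated round trip: \(roundTrip * 1000) ms")
              ```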

               

              Audio Units also have the kAudioUnitProperty_Latency property you can query.
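
               

              For example (a sketch, assuming you already have a valid AudioUnit instance; the property is a Float64 in seconds, in the global scope):

              ```swift
              import AudioToolbox

              func reportedLatency(of unit: AudioUnit) -> Float64 {
                  var latency: Float64 = 0
                  var size = UInt32(MemoryLayout<Float64>.size)
                  let status = AudioUnitGetProperty(unit,
                                                    kAudioUnitProperty_Latency,
                                                    kAudioUnitScope_Global,
                                                    0,
                                                    &latency,
                                                    &size)
                  return status == noErr ? latency : 0
              }
              ```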