Apple's documentation (e.g. WWDC 2015: Advanced Touch Input on iOS) claims a maximum touch latency of 2 frames (~33 ms at 60 Hz), but that is simply not what I observe when testing. By "maximum touch latency" I mean the maximum time between physically touching the screen and a result of that touch appearing on the display.
Lacking a 60 Hz video camera, what I've done instead is predict where a touch will be in the future by linearly extrapolating its velocity, and render that predicted position.
Moving my finger across my iPhone 6 at as constant a speed as I can manage, it seems that for the rendered image to stay "glued" to my finger, the raw touch input needs to be extrapolated 4 frames (~67 ms) into the future from when my program is notified of the event. Adding at least 1 more frame for the hardware and OS to collect the multitouch data and make it available to the run loop, that gives 5 frames of latency in total.
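In case it helps, here is roughly what the extrapolation looks like (a simplified sketch, not my actual code; names like `TouchPredictor` and `frameLookahead` are placeholders):

```swift
import UIKit

/// Simplified sketch of the extrapolation described above. Velocity is
/// estimated from the last two touch samples and the position is projected
/// `frameLookahead` frames ahead before rendering.
final class TouchPredictor {
    /// How many 60 Hz frames to extrapolate into the future (4 ≈ 67 ms).
    var frameLookahead: Double = 4
    private let frameDuration: Double = 1.0 / 60.0

    private var lastPoint: CGPoint?
    private var lastTimestamp: TimeInterval?

    /// Feed each touch sample (e.g. from touchesMoved) and get back a
    /// predicted position to render instead of the raw one.
    func predictedPoint(for touch: UITouch, in view: UIView) -> CGPoint {
        let point = touch.location(in: view)
        let timestamp = touch.timestamp

        defer {
            lastPoint = point
            lastTimestamp = timestamp
        }

        guard let previousPoint = lastPoint,
              let previousTimestamp = lastTimestamp,
              timestamp > previousTimestamp else {
            return point // Not enough history yet; render the raw touch.
        }

        // Finger velocity in points per second from the last two samples.
        let dt = CGFloat(timestamp - previousTimestamp)
        let vx = (point.x - previousPoint.x) / dt
        let vy = (point.y - previousPoint.y) / dt

        // Linearly extrapolate the touch `frameLookahead` frames ahead.
        let lookahead = CGFloat(frameLookahead * frameDuration)
        return CGPoint(x: point.x + vx * lookahead,
                       y: point.y + vy * lookahead)
    }
}
```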
Rendering is done using OpenGL ES 2.0 into a CALayer, which, according to WWDC 2015: Advanced Touch Input on iOS, should provide the lowest possible latency (I don't believe OpenGL ES 3.0 or Metal would make any difference).
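For reference, the layer setup is roughly the following (heavily simplified; the framebuffer and drawing code is omitted, and the class name is just a placeholder):

```swift
import UIKit
import OpenGLES

/// Rough sketch of the GL ES 2.0 + CAEAGLLayer setup being used.
final class GLTouchView: UIView {
    override class var layerClass: AnyClass { CAEAGLLayer.self }

    private var context: EAGLContext!
    private var colorRenderbuffer: GLuint = 0

    override init(frame: CGRect) {
        super.init(frame: frame)

        let glLayer = layer as! CAEAGLLayer
        glLayer.isOpaque = true
        // No retained backing: the scene is redrawn every frame, so skipping
        // the extra copy keeps the present path as cheap as possible.
        glLayer.drawableProperties = [
            kEAGLDrawablePropertyRetainedBacking: false,
            kEAGLDrawablePropertyColorFormat: kEAGLColorFormatRGBA8
        ]

        context = EAGLContext(api: .openGLES2)
        EAGLContext.setCurrent(context)

        glGenRenderbuffers(1, &colorRenderbuffer)
        glBindRenderbuffer(GLenum(GL_RENDERBUFFER), colorRenderbuffer)
        // Allocate the renderbuffer's storage directly from the layer.
        context.renderbufferStorage(Int(GL_RENDERBUFFER), from: glLayer)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    /// Called once per frame (e.g. from a CADisplayLink) after drawing.
    func present() {
        glBindRenderbuffer(GLenum(GL_RENDERBUFFER), colorRenderbuffer)
        context.presentRenderbuffer(Int(GL_RENDERBUFFER))
    }
}
```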
Ideally, I would like a way to measure this latency more accurately than with this crude experiment. Can anyone else verify this, or does anyone have data suggesting otherwise?
Thanks