Hi. I'm currently working on a project where I'm trying to achieve the best possible UX when rendering an interactive scene with a custom Metal renderer. The problem I'm facing is that, from what I've observed, the timing of touch events (in touchesBegan/Moved/Ended/Cancelled) relative to CADisplayLink callbacks differs between devices running the latest stable iOS. Setting aside special cases (fast touch movement can deliver two touchesMoved plus a single touchesEnded within one frame interval, with awkward timing), the general picture is:

- iPhone 14 Pro: vsync - (8 ms of nothing) - touches - vsync
- iPhone 11 Pro Max: vsync - touches - (16 ms of nothing) - vsync

To minimize input-to-presentation latency and fit the CPU + GPU work into one frame, I'd like to schedule drawing as soon as the CADisplayLink callback arrives. The problem is that at that point, on the 11 Pro Max, I have no way of knowing whether touches will arrive or not, so the best I can do is process those touches and schedule a redraw on the next vsync. That puts almost two frames of latency (33 ms) between an input and the draw that reflects it. On the 14 Pro, touches are dispatched just before the next vsync, so the actual latency is just one 120 Hz frame (8 ms).

What are possible workarounds to achieve the same UX on the iPhone 11 Pro Max?
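For reference, here's a minimal sketch of the instrumentation I used to measure where touches land within the frame interval (the class and names are mine, not part of any API); it just logs the offset of each touch timestamp from the last CADisplayLink callback:

```swift
import UIKit

/// Logs where touch events land inside the CADisplayLink frame interval.
final class InputTimingProbe: UIView {
    private var displayLink: CADisplayLink?
    private var lastVsyncTimestamp: CFTimeInterval = 0

    override func didMoveToWindow() {
        super.didMoveToWindow()
        if window == nil {
            displayLink?.invalidate()
            displayLink = nil
        } else if displayLink == nil {
            let link = CADisplayLink(target: self, selector: #selector(onVsync(_:)))
            link.add(to: .main, forMode: .common)
            displayLink = link
        }
    }

    @objc private func onVsync(_ link: CADisplayLink) {
        lastVsyncTimestamp = link.timestamp
        // This is where I'd like to encode and schedule the frame,
        // but on the 11 Pro Max this frame's touches haven't arrived yet.
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        // UITouch.timestamp and CADisplayLink.timestamp share the same
        // time base (CACurrentMediaTime), so they can be compared directly.
        let offsetMs = (touch.timestamp - lastVsyncTimestamp) * 1000
        print(String(format: "touch arrived %.2f ms after last vsync", offsetMs))
    }
}
```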
Has anyone figured out the algorithm behind how SceneKit applies cameraGrainTexture? I work with raw Metal and can't figure out how to use it properly.
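Here's a sketch of what I've tried so far, assuming the grain is a tiling 3D noise texture whose depth slice is selected by ARFrame.cameraGrainIntensity; Apple doesn't document the actual compositing SceneKit uses, so the blend in the comments is a guess, and names like encodeGrainPass, grainPipeline, tileScale, and strength are mine:

```swift
import ARKit
import Metal

/// Binds the ARKit camera grain texture and intensity for a fullscreen
/// grain pass. The pipeline state and shader are assumed to exist.
func encodeGrainPass(frame: ARFrame,
                     encoder: MTLRenderCommandEncoder,
                     grainPipeline: MTLRenderPipelineState) {
    guard let grain = frame.cameraGrainTexture else { return }
    var intensity = frame.cameraGrainIntensity // 0.0 ... 1.0

    encoder.setRenderPipelineState(grainPipeline)
    encoder.setFragmentTexture(grain, index: 0)
    encoder.setFragmentBytes(&intensity,
                             length: MemoryLayout<Float>.size,
                             index: 0)

    // Fullscreen triangle. The fragment shader (MSL, sketched here as a
    // comment) samples the 3D texture with the intensity as the depth
    // coordinate and blends it onto the image, e.g.:
    //   float3 coord = float3(fract(screenUV * tileScale), intensity);
    //   float4 noise = grainTex.sample(s, coord);
    //   color.rgb += (noise.rgb - 0.5) * strength; // assumed blend
    encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 3)
}
```

This produces something grain-like, but it doesn't match what ARSCNView renders, so I'd appreciate any insight into the actual slice selection and blend.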