I have seen this question asked in various contexts since ARKit was first released, but have never found a straight answer. I am using ARKit 4 on the iPadOS 14 beta. The issue I am struggling with is what happens when I pull the AR camera's current transform (which is updated for every frame), as follows:
ARSession.currentFrame.camera.transform
...it returns a 4x4 matrix containing rotation and translation, but only the translation changes. The rotation is identical for every frame, regardless of camera yaw, pitch, and roll. From reading the ARWorldTrackingConfiguration documentation, I was under the impression that the ARCamera transform conveys full 6DOF, i.e. rotation plus translation. Clearly that is not what I am seeing. Here are some values:
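For context, here is roughly how I am reading the pose each frame (a sketch; the delegate wiring is standard ARSessionDelegate, and the class name is mine):

```swift
import ARKit
import simd

class PoseLogger: NSObject, ARSessionDelegate {
    // Called by ARKit once per captured frame.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let t: simd_float4x4 = frame.camera.transform
        // Columns 0-2 hold the 3x3 rotation; column 3 holds the translation.
        let rotation = simd_float3x3(columns: (
            simd_make_float3(t.columns.0),
            simd_make_float3(t.columns.1),
            simd_make_float3(t.columns.2)
        ))
        let translation = simd_make_float3(t.columns.3)
        print("R:", rotation, "T:", translation)
    }
}
```

It is the `rotation` part above that never changes between frames, while `translation` updates as expected.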
FRAME 1:
 0.9986219    0.034667697   0.039401535
-0.019907176  0.9448786    -0.3268153
-0.048559602  0.3255805     0.94426656
-0.016681878 -0.017129783  -0.0130903125
FRAME 2:
 0.9986219    0.034667697   0.039401535
-0.019907176  0.9448786    -0.3268153
-0.048559602  0.3255805     0.94426656
 0.10855365  -0.06680643    0.9586861
FRAME 3:
 0.9986219    0.034667697   0.039401535
-0.019907176  0.9448786    -0.3268153
-0.048559602  0.3255805     0.94426656
 0.8597136   -1.5562118     0.4360383
These three frames were captured looking at the same object from three different positions (front, right, and left), yet the rotation block is identical: no heading change at all. I am obviously missing something fundamental. Why would the AR session return the same rotation matrix for every single frame, regardless of camera tilt and heading? This makes absolutely no sense.
There is an eulerAngles property I could use, but I would rather avoid it: as you are likely aware, each platform applies Euler angles in a different order, and the order is not always consistent with the documentation. A transform matrix is cleaner and easier to port to any other system.
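What I would prefer is to work with the rotation part of the matrix directly, e.g. pulling a quaternion out of the 4x4, which sidesteps any Euler-order ambiguity entirely. A sketch using simd (the 90-degree yaw matrix here is a stand-in for the camera transform, which in my case never changes):

```swift
import simd

// Stand-in for camera.transform: a 90-degree yaw about +Y for illustration.
let yaw = simd_float4x4(simd_quatf(angle: .pi / 2, axis: SIMD3<Float>(0, 1, 0)))

// Extract the rotation as a quaternion straight from the 4x4 matrix.
// If the camera transform were actually updating, this would track heading.
let q = simd_quatf(yaw)
print(q.angle, q.axis)
```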
Perhaps this is a bug in ARKit 4. There is no logical reason for the rotation to remain fixed: even across a sequence that pans a full 360 degrees, the heading/yaw in the rotation matrix does not budge.
I have tried multiplying camera.transform by viewMatrix and/or projectionMatrix just to see if anything would change, and it doesn't. There is literally nothing I can do with the iPad camera in an AR session that yields an updated rotation matrix.