      the_ponpon · Level 1 (0 points) · Oct 12, 2017 9:09 AM



        I have noticed that, on my iPhone 7, the ARCamera intrinsics matrix changes over time during an AR session.

        I can understand how autofocus affects the focal length (and thus the intrinsics matrix) depending on whether the scene you are looking at is close to the device or far from it.

        However, the intrinsics matrix also changes significantly when switching between portrait and landscape orientation, even though I am looking at exactly the same scene from exactly the same position.
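
        For reference, this is roughly how I am watching the intrinsics from frame to frame (a minimal sketch; the ARSession setup and delegate wiring are omitted):

        import ARKit

        final class IntrinsicsLogger: NSObject, ARSessionDelegate {
            // Print the camera intrinsics on every frame so that changes over time
            // (autofocus, portrait/landscape switches) show up in the console.
            func session(_ session: ARSession, didUpdate frame: ARFrame) {
                let K = frame.camera.intrinsics          // 3x3, column-major
                let fx = K[0][0], fy = K[1][1]           // focal lengths in pixels
                let cx = K[2][0], cy = K[2][1]           // principal point in pixels
                print("fx=\(fx) fy=\(fy) cx=\(cx) cy=\(cy) resolution=\(frame.camera.imageResolution)")
            }
        }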


        I cannot find any reason why this happens.


        I am using the camera intrinsics to cast a ray against my own "3D objects" (which are neither anchors nor SceneKit nodes), so I use both the intrinsics and extrinsics matrices provided by the ARCamera to compute the intersection directly.
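
        Simplified, the ray construction looks roughly like this (a sketch only, with my scene-specific intersection code stripped out; the pixel is assumed to be expressed in the captured image's coordinate system, and the sign flips reflect ARKit's camera space with +Y up and -Z forward):

        import ARKit
        import simd

        // Build a world-space ray through a pixel of the captured image, using the
        // ARCamera intrinsics (K) and extrinsics (camera-to-world transform).
        func worldRay(throughPixel pixel: CGPoint, camera: ARCamera) -> (origin: simd_float3, direction: simd_float3) {
            let K = camera.intrinsics
            let fx = K[0][0], fy = K[1][1]
            let cx = K[2][0], cy = K[2][1]

            // Direction in camera space. Image coordinates have +Y down while the
            // camera looks down -Z with +Y up, hence the sign flips.
            let dirCamera = simd_normalize(simd_float3(
                (Float(pixel.x) - cx) / fx,
                -(Float(pixel.y) - cy) / fy,
                -1
            ))

            // Rotate into world space with the upper-left 3x3 of camera.transform.
            let m = camera.transform
            let rotation = simd_float3x3(
                simd_float3(m.columns.0.x, m.columns.0.y, m.columns.0.z),
                simd_float3(m.columns.1.x, m.columns.1.y, m.columns.1.z),
                simd_float3(m.columns.2.x, m.columns.2.y, m.columns.2.z)
            )
            let origin = simd_float3(m.columns.3.x, m.columns.3.y, m.columns.3.z)
            return (origin, simd_normalize(rotation * dirCamera))
        }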

        The resulting intersection of the ray with my "3D object" looks correct in portrait mode.

        However, there is a shift in landscape mode.

        I can understand why, given the observation about the intrinsics above.

        However, that does not help me perform the ray cast correctly.


        Note: this does not happen when running the same code on an iPad Pro. The intrinsics matrix is constant regardless of UI orientation.


        Any clarification on why the intrinsics matrix changes depending on UI orientation, and on which intrinsics I should use to solve my problem, would be really welcome.