Getting the orientation of a featurePoint raycast

Is there a way to get the orientation/normal where a raycast hits a featurePoint? Like how the cursor in Apple's Measure app follows the curvature of world-map objects (not limited to a horizontal or vertical plane): https://www.youtube.com/watch?v=HbLe4rHQI_I


Thanks

Replies

FeaturePoints don't really have orientations by themselves, I believe; they're just points.


When you're doing a raycast such as this one and you pass in something like the following for the types:

[.existingPlaneUsingGeometry, .estimatedVerticalPlane, .estimatedHorizontalPlane]


Then check whether your result is an estimated vertical or horizontal plane. Or, if it's an existing-plane type, check whether its anchor is an ARPlaneAnchor; if it is, check its alignment property.
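
A rough sketch of that check using the ARSCNView hit-test API (the sceneView name and the helper function are my own assumptions, not from the original post):

import ARKit

// Hypothetical helper: returns a plane alignment for a screen point, if any.
func planeAlignment(at point: CGPoint, in sceneView: ARSCNView) -> ARPlaneAnchor.Alignment? {
    let types: ARHitTestResult.ResultType = [
        .existingPlaneUsingGeometry, .estimatedVerticalPlane, .estimatedHorizontalPlane
    ]
    guard let result = sceneView.hitTest(point, types: types).first else { return nil }

    // Estimated planes tell you the alignment directly via the result type.
    if result.type.contains(.estimatedHorizontalPlane) { return .horizontal }
    if result.type.contains(.estimatedVerticalPlane) { return .vertical }

    // For an existing plane, look at the anchor's alignment property.
    if let planeAnchor = result.anchor as? ARPlaneAnchor {
        return planeAnchor.alignment
    }
    return nil
}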


I use something very similar to what's in the video you sent for this open-source project, which is mostly derived from Apple's samples:

https://github.com/maxxfrazer/ARKit-FocusNode

This seems to only reach vertical and horizontal orientations though - is it possible to get the hit-result orientation from the side of a ball? E.g. an acute 45° angle from gravity, instead of just the horizontal and vertical right angles.


It seems to work in their Measure app, but I'm not sure how it's done. Their cursor is so detailed it looks like a featurePoint filter, which, as you say, should only be a cloud of vectors.

I can’t see where in the Measure app it recognises a non-horizontal or non-vertical surface; could you give me a timestamp in the above video? As far as I know, ARKit is not yet set up to do that at all. You could get the orientation on the side of an augmented (virtual) ball by just taking the surface normal at that point, but ARKit currently won’t give you that level of detail for real objects.
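
For the virtual case, a minimal sketch (assuming a SceneKit-backed ARSCNView called sceneView): a SceneKit hit test against your augmented geometry already gives you the surface normal.

import ARKit
import SceneKit

// Hypothetical helper: surface normal of the virtual geometry under a screen
// point, if the hit test strikes any SceneKit node.
func virtualSurfaceNormal(at point: CGPoint, in sceneView: ARSCNView) -> SCNVector3? {
    guard let hit = sceneView.hitTest(point, options: nil).first else { return nil }
    // SCNHitTestResult exposes the normal at the hit location in world coordinates.
    return hit.worldNormal
}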

I did a quick recording myself to show it, here: https://streamable.com/k9mr7


You are probably right, it's not available at this point. Maybe they use the feature points to build a topology mesh and hit-test against that?

So the only way I can think of is getting the feature point where you’re looking, then finding the two closest feature points to it. Take the cross product of the vectors from your point to those two neighbours and you *should* get the normal at the point of interest. I'm not sure exactly how to get the closest feature points to the one you’re interested in, but that would be my best guess. Even with that, I don’t think the results are going to be fantastic, but they don’t look perfect in the video you added anyway.
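
A rough sketch of that idea, assuming you already have a point of interest in world space (e.g. from a .featurePoint hit test) and the current ARFrame; the function name is my own and the nearest-neighbour search is just brute force:

import ARKit
import simd

// Hypothetical helper: estimate a normal at `point` (world space) from the raw
// feature-point cloud by crossing the vectors to its two nearest neighbours.
func estimatedNormal(at point: SIMD3<Float>, in frame: ARFrame) -> SIMD3<Float>? {
    guard let cloud = frame.rawFeaturePoints else { return nil }

    // Brute-force nearest-neighbour search over the raw feature points.
    let neighbours = cloud.points
        .filter { simd_distance($0, point) > 1e-4 }        // skip the point itself
        .sorted { simd_distance($0, point) < simd_distance($1, point) }
    guard neighbours.count >= 2 else { return nil }

    // Cross product of the vectors to the two closest neighbours.
    let normal = simd_cross(neighbours[0] - point, neighbours[1] - point)
    guard simd_length(normal) > 1e-6 else { return nil }   // collinear, no usable normal

    var unit = simd_normalize(normal)
    // Flip it so it points towards the camera rather than away from it.
    let cam = frame.camera.transform.columns.3
    let toCamera = SIMD3<Float>(cam.x, cam.y, cam.z) - point
    if simd_dot(unit, toCamera) < 0 { unit = -unit }
    return unit
}

As noted above, with only the two nearest neighbours the result will be noisy; averaging normals over a few more neighbouring points would likely smooth it out.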