The finger joints are rendered as cubes with 6DoF poses:
"FindSurface Real-Time Preview - Apple Vision Pro"
https://youtu.be/CGjhfKxjpUU
The source code of the AVP app is available:
GitHub Link
Apple Vision Pro
Planes, spheres, cylinders, cones, and tori of object surfaces can now be effortlessly detected and measured in real time at up to 120 fps (found-per-second). The rate may even go up to 300 fps.
Check out the source code of the AVP app on GitHub CurvSurf for more details:
GitHub Link
The right solution would require obtaining the 6DoF motion tracking information of both the user and the vehicle in real time.
The set of points list[] is a collection of 3D points from:
- ARPointCloud: "AR Based on Curved Object Surfaces" (YouTube 4U4FlavRKa4)
- vertices of ARMeshAnchor: "Fast Cylinder Fitting" (YouTube JSNXB3zc4mo)
- ARDepthData: "Real Time Ball Tracking in depthMap" (YouTube zc6GQOtgS7M)
// ViewController.swift
// ln:592: 'else {' should be changed to 'else if (pickIdx < 0) {'
else if (pickIdx < 0) {
// This accelerates the picking process: once a point is found inside the "view cone", i.e., pickIdx >= 0, we no longer care about pickIdxExt.

pickIdx = pickIdxExt = -1;
if (there is at least one point inside the "view cone")
    the point closest to ray_pos (smallest len1) is picked: pickIdx = idx; // pickIdx >= 0
else if (there is at least one point outside the "view cone")
    the point closest to the "view circle" on screen (largest cosine value) is picked: pickIdxExt = idx; // pickIdxExt >= 0
return pickIdx < 0 ? pickIdxExt : pickIdx; // a return value of -1 means no point was picked or list[] is empty.
There are numerous object surfaces around us. Accurately determining their shape, size, position and posture in real time is a major problem. Essentially, this is a problem we are all trying to solve.
The video "Nose extraction - iPhone X, SR300" (https://youtu.be/eCmVYl3GIEY) is a really old one.
Since Apple's LiDAR and depthMap became available in 2020, new technical possibilities have been realized and demonstrated.
YouTube CurvSurf
https://www.youtube.com/CurvSurf
GitHub CurvSurf
https://github.com/CurvSurf
Sorry...
The YouTube video is now visible, i.e., you only need to know the URL.
Furthermore, even in totally dark environments where the RGB images required for motion tracking cannot be obtained, the LiDAR still delivers its 576 points. By processing these 576 points in real time, you can pinpoint the shape, size, location, and orientation of a real-world object's surface in front of you, even in total darkness. And 576 LiDAR points raise no privacy concerns, yet will bring many positive uses.
If the shape, size, location, and orientation of the surface of a real object can be accurately known by processing ARMeshAnchor in real time, various AR applications are possible. Accurate real-time calculations ensure that such real objects can serve as moving anchors for virtual objects.
YouTube "Virtual ads inside Chungmuro station Line 3 - iPhone Pro 12" BmKNmZCiMkw
YouTube "Rolling Ball - Apple iPad Pro LiDAR" sMRfX334blI