
Height above ground
I'm using the ARKit + CoreLocation GitHub project as a starting point for an app that will show the location of an underwater feature when viewed from the beach, so I am projecting marker buoys onto the water. It works: I can project the 2D markers, but they float well above or below ground level, which makes them look like they are in the wrong spot. Is there some trick to locking the bottom of the image to ground level? I'm using elevation = 0, and if I change the 2D icon, the distance above or below ground level changes.
1 reply · 0 boosts · 778 views · Sep ’20
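One approach to the ground-locking question above: raycast from the marker's screen position onto ARKit's estimated horizontal plane and clamp the node's height to the hit point. A minimal sketch, assuming the ARSCNView setup used by ARKit + CoreLocation (the function name is mine; note that plane estimation over open water is unreliable, so this works best near the shoreline):

```swift
import ARKit
import SceneKit

// Sketch: pin a geo-anchored marker node to detected ground level.
// Assumes `sceneView` is the ARSCNView from ARKit + CoreLocation and
// `node` is a marker whose x/z position is already correct.
func pinToGround(_ node: SCNNode, in sceneView: ARSCNView) {
    // Project the node's current world position into screen space.
    let projected = sceneView.projectPoint(node.worldPosition)
    let screenPoint = CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))

    // Raycast from that screen point onto any estimated horizontal plane.
    guard let query = sceneView.raycastQuery(from: screenPoint,
                                             allowing: .estimatedPlane,
                                             alignment: .horizontal),
          let hit = sceneView.session.raycast(query).first else { return }

    // Keep the geo-anchored x/z; take the height from the plane hit so
    // the marker sits on the surface instead of floating.
    node.worldPosition.y = hit.worldTransform.columns.3.y
}
```

If the 2D icon is anchored at its center, you would also offset y by half the billboard's height so the bottom edge, not the midpoint, touches the surface; that would explain why changing the icon changes the apparent height.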
Depth Scale -- Photogrammetry
An earlier post about photogrammetry models not having proper real-world scale includes a response from Apple that says: "Make sure you are actually reading these files in your code to create the PhotogrammetrySample. If you simply keep the *_depth.tiff files in the input folder, but do not read the depthDataMap while creating the PhotogrammetrySample, then these depth files will not be used." The sample photogrammetry app does not create PhotogrammetrySample objects; it only supplies a folder of images with depth data included. How do we go about "reading the depthDataMap"? To be clear, my 3D models are not accurate to real-world scale, although I do have depth maps in the HEIC images and also in separate TIFFs.
2 replies · 0 boosts · 1k views · Oct ’22
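For what it's worth, "reading the depthDataMap" appears to mean building each PhotogrammetrySample yourself: load the depth TIFF into a CVPixelBuffer and assign it to the sample before handing the samples (rather than a folder URL) to the session. A minimal sketch, with my own assumptions about file naming (*_depth.tiff next to each HEIC) and Core Image-based loading; this is not Apple's documented recipe:

```swift
import RealityKit
import CoreImage

let ciContext = CIContext()

// Assumption: load an image file into a CVPixelBuffer via Core Image.
// Whether CIImage preserves full float precision for depth TIFFs is
// worth verifying against your own files.
func pixelBuffer(from url: URL, format: OSType) -> CVPixelBuffer? {
    guard let image = CIImage(contentsOf: url) else { return nil }
    var buffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        Int(image.extent.width), Int(image.extent.height),
                        format, nil, &buffer)
    guard let out = buffer else { return nil }
    ciContext.render(image, to: out)
    return out
}

// Build samples that explicitly carry their depth data.
func makeSamples(in folder: URL) throws -> [PhotogrammetrySample] {
    let imageURLs = try FileManager.default
        .contentsOfDirectory(at: folder, includingPropertiesForKeys: nil)
        .filter { $0.pathExtension.lowercased() == "heic" }
        .sorted { $0.lastPathComponent < $1.lastPathComponent }

    return imageURLs.enumerated().compactMap { index, url in
        guard let image = pixelBuffer(from: url,
                                      format: kCVPixelFormatType_32BGRA)
        else { return nil }
        var sample = PhotogrammetrySample(id: index, image: image)

        // Assumed naming: IMG_0001.heic -> IMG_0001_depth.tiff
        let depthURL = folder.appendingPathComponent(
            url.deletingPathExtension().lastPathComponent + "_depth.tiff")
        // Depth maps use a single-channel 32-bit float format.
        sample.depthDataMap = pixelBuffer(
            from: depthURL, format: kCVPixelFormatType_DepthFloat32)
        return sample
    }
}

// Feed the samples (not the folder URL) to the session so the depth
// maps are actually consumed during reconstruction.
func runSession(folder: URL, output: URL) throws {
    let samples = try makeSamples(in: folder)
    let session = try PhotogrammetrySession(
        input: samples,
        configuration: PhotogrammetrySession.Configuration())
    try session.process(requests: [.modelFile(url: output)])
    // ... then consume session.outputs as in the WWDC21 sample.
}
```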
Updated Object Capture -- needs LiDAR?
I have two apps released -- ReefScan and ReefBuild -- that are based on the WWDC21 sample photogrammetry apps for iOS and macOS. Those run fine without LiDAR and are used mostly for underwater models, where LiDAR does not work at all. It now appears that the updated photogrammetry session requires LiDAR data, and building my app with the current Xcode results in a non-working app. Has the "old" version of the photogrammetry session been broken by this update? It worked very well previously, so I would hate to see a regression to requiring LiDAR; most of my users do not have it.
0 replies · 0 boosts · 65 views · 20h
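I can't say whether the regression is intended, but if the concern is shipping a broken capture path on non-LiDAR devices, a runtime guard is possible. A minimal sketch, assuming the documented isSupported properties reflect device capability at run time (the function name is mine):

```swift
import RealityKit

// Sketch: check which capture paths this device supports, so the app
// can fall back gracefully instead of failing at capture time.
@available(iOS 17.0, *)
@MainActor
func checkCaptureSupport() {
    // The new guided Object Capture UI, which requires LiDAR on iOS 17.
    let guidedCapture = ObjectCaptureSession.isSupported
    // On-device photogrammetry reconstruction.
    let reconstruction = PhotogrammetrySession.isSupported
    print("Guided capture: \(guidedCapture), reconstruction: \(reconstruction)")
}
```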