Reply to Accessing and exporting LiDAR point cloud
I am also trying to use the depth-map data, but in a very simple way: I read just one horizontal line (in portrait orientation) from the LiDAR depth map, which gives 192 pixels/distances. The data is not what I expected. When the iPad is parallel to a flat wall at a distance of, say, 0.3 meter, I would expect to read 0.3 meter only for the data point in the middle of the line (the one directly opposite the camera). Data points at the very left and right ([0] and [191]) should show approximately 0.33 meter. But they do not; every value in that line is 0.3 meter. Once I scan objects other than a wall, the data makes more sense, but it still seems squeezed (a circular column shows up as an ellipse). Is there an explanation for this? Or is there a way to get readings that follow the expected Pythagorean geometry in the parallel-to-wall experiment? Any insight would be highly appreciated.
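
If the buffer stores perpendicular (z-axis) depth rather than ray length, that would explain the flat-wall result, and something along these lines should recover Euclidean distances by back-projecting each pixel through the camera intrinsics. This is only a sketch assuming ARKit's sceneDepth API, with the intrinsics scaled from the captured-image resolution down to the depth-map resolution; radialDistances is just a made-up helper name:

```swift
import ARKit
import CoreVideo

// Sketch: convert one row of the LiDAR depth map from perpendicular (z-axis)
// depth to radial (Euclidean) distance, assuming the buffer stores z-depth.
// `radialDistances(for:row:)` is a hypothetical helper name.
func radialDistances(for frame: ARFrame, row: Int) -> [Float]? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width  = CVPixelBufferGetWidth(depthMap)    // typically 256
    let height = CVPixelBufferGetHeight(depthMap)   // typically 192
    guard row < height,
          let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }

    let bytesPerRow = CVPixelBufferGetBytesPerRow(depthMap)
    let rowPtr = (base + row * bytesPerRow).assumingMemoryBound(to: Float32.self)

    // The intrinsics refer to the full-resolution captured image,
    // so scale them down to the depth-map resolution.
    let imageSize = frame.camera.imageResolution
    let K = frame.camera.intrinsics
    let scaleX = Float(width)  / Float(imageSize.width)
    let scaleY = Float(height) / Float(imageSize.height)
    let fx = K.columns.0.x * scaleX
    let fy = K.columns.1.y * scaleY
    let cx = K.columns.2.x * scaleX
    let cy = K.columns.2.y * scaleY

    var distances = [Float]()
    distances.reserveCapacity(width)
    for x in 0..<width {
        let z  = rowPtr[x]                // perpendicular depth in metres
        let dx = (Float(x) - cx) / fx     // normalised ray direction
        let dy = (Float(row) - cy) / fy
        // Euclidean distance along the ray: z * |(dx, dy, 1)|
        distances.append(z * (dx * dx + dy * dy + 1).squareRoot())
    }
    return distances
}
```

The square-root factor is exactly the Pythagorean correction I described above: the centre pixel keeps its 0.3 m reading while the edge pixels get scaled up.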
Mar ’21