
SwiftUI code preview fatally crashing
Hello! As a beginner at creating and editing views with SwiftUI, I am unsure how to resolve the situation I am presently in and would be grateful for any advice or suggestions on the problem below. For some context, I have recently begun exploring the "Capturing depth using the LiDAR camera" documentation, which uses AVFoundation, and intend to change its ContentView file, which is currently written in SwiftUI. The problem: I am unable to preview the SwiftUI code. When I try to resume the Preview pane on the right…
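In case it is relevant, one workaround I am considering is to substitute placeholder content when the view runs in the preview canvas, since the capture session needs real hardware. This is only a sketch, assuming the commonly cited XCODE_RUNNING_FOR_PREVIEWS environment variable check; CameraContentView is a hypothetical stand-in for the sample's real capture UI:

```swift
import Foundation
import SwiftUI

// Hypothetical stand-in for the sample project's live capture view.
struct CameraContentView: View {
    var body: some View { Text("Live LiDAR capture view") }
}

struct PreviewSafeContentView: View {
    // True when the view is being rendered by the Xcode preview canvas.
    private var isRunningInPreviews: Bool {
        ProcessInfo.processInfo.environment["XCODE_RUNNING_FOR_PREVIEWS"] == "1"
    }

    var body: some View {
        if isRunningInPreviews {
            // The canvas has no camera or LiDAR hardware, so avoid starting
            // the capture session there.
            Text("LiDAR capture is unavailable in the preview canvas")
        } else {
            CameraContentView()
        }
    }
}
```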
Replies: 3 · Boosts: 1 · Views: 758 · Activity: Nov ’23
Acquisition of specific point[pixel] LiDAR depth information
Hello! I have recently begun exploring the "Capturing depth using the LiDAR camera" documentation, which uses AVFoundation, and intend to acquire the depth information of specific points based on touch. I have two main doubts and would be grateful for any clarification:

1. How, and in which format, can I access the information of a specific point/pixel, and how can I ensure it tracks/displays exactly the point I acquire from the touch gesture? (Basically, how are the pixels tagged to their specific data?)
2. What is the unit of measure for the LiDAR depth data, and within what range is data accuracy guaranteed?

It would be great if you could push me in the right direction in the search for answers; I have gone through the documentation in depth. Thanks in advance.
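For reference, here is a minimal sketch of the kind of per-pixel lookup I have in mind, assuming a row-major depth map and a touch point already normalized to 0...1 (the function name and the normalization convention are my own assumptions, not from the sample code). For the non-disparity DepthFloat16 format, the stored values are in meters:

```swift
import AVFoundation
import CoreGraphics
import CoreVideo

// Reads the depth value (in meters) at a point given in normalized
// coordinates, e.g. a touch location divided by the view's size.
func depth(at normalizedPoint: CGPoint, in depthData: AVDepthData) -> Float16? {
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat16)
    let map = converted.depthDataMap
    let width = CVPixelBufferGetWidth(map)
    let height = CVPixelBufferGetHeight(map)
    // Scale the normalized point to pixel coordinates, clamped to the map.
    let x = min(max(Int(normalizedPoint.x * CGFloat(width)), 0), width - 1)
    let y = min(max(Int(normalizedPoint.y * CGFloat(height)), 0), height - 1)

    CVPixelBufferLockBaseAddress(map, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }
    guard let base = CVPixelBufferGetBaseAddress(map) else { return nil }
    // Step by the buffer's bytesPerRow rather than its width: rows may be padded.
    let bytesPerRow = CVPixelBufferGetBytesPerRow(map)
    let row = base.advanced(by: y * bytesPerRow)
    return row.assumingMemoryBound(to: Float16.self)[x]
}
```

Stepping by bytesPerRow matters because CVPixelBuffer rows can be padded for alignment.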
Replies: 4 · Boosts: 0 · Views: 986 · Activity: Nov ’23
Usage of Touch Gesture to acquire DepthData of a specific point using LiDAR
Hello! The aim of my project is as specified in the title, and the code I am currently trying to modify uses CVPixelBufferGetBaseAddress to acquire the depth data from the LiDAR camera. For some context, I made use of the available "Capturing depth using the LiDAR camera" documentation, which uses AVFoundation, and edited the code after referring to a few Q&As on the Developer Forums. I have a few doubts and would be grateful for any insights or a push in the right direction. Regarding the LiDAR depth data:

1. Where is the origin (0,0), and in what order is the data saved? (Basically, how do the row and column correspond to the real-world scene?)
2. How do I add a touch gesture to fill in the X and Y values for "distanceAtXYPoint", so that I can acquire the depth data on user touch input rather than in real time?

The function for reference:

```swift
// New function to show the depth data value in meters.
func depthDataOutput(syncedDepthData: AVCaptureSynchronizedDepthData) {
    let depthData = syncedDepthData.depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat16)
    let depthMapWidth = CVPixelBufferGetWidthOfPlane(depthData.depthDataMap, 0)
    let depthMapHeight = CVPixelBufferGetHeightOfPlane(depthData.depthDataMap, 0)
    CVPixelBufferLockBaseAddress(depthData.depthDataMap, .readOnly)
    if let rowData = CVPixelBufferGetBaseAddress(depthData.depthDataMap)?.assumingMemoryBound(to: Float16.self) {
        // Need to find a way to get the specific depth point (using rowData) on a touch gesture.
        // Currently uses the depth point at row & column = 0.
        let depthPoint = rowData[0]
        for y in 0...depthMapHeight - 1 {
            var distancesLine = [Float16]()
            for x in 0...depthMapWidth - 1 {
                // Note: this indexing assumes the buffer has no row padding.
                let distanceAtXYPoint = rowData[y * depthMapWidth + x]
                distancesLine.append(distanceAtXYPoint)
            }
        }
        print("⭐️ Depth value of the (0,0) point in meters: \(depthPoint)")
    }
    CVPixelBufferUnlockBaseAddress(depthData.depthDataMap, .readOnly)
}
```

The current real-time console log output is as shown below. A slight concern is that the current output at (0,0) sometimes shows a value greater than 1 m even when the real distance is probably a few centimeters; any experience and countermeasures on this would also be greatly helpful. Thanks in advance.
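One way I am considering wiring up the touch side, sketched under the assumption that the UI stays in SwiftUI (the sample's ContentView is SwiftUI) and that iOS 16's location-aware tap gesture is available; TapCaptureOverlay and onNormalizedTap are hypothetical names, not part of the sample:

```swift
import SwiftUI

// Transparent overlay that reports taps in normalized (0...1) coordinates,
// so the capture pipeline can sample the depth map at the tapped point.
struct TapCaptureOverlay: View {
    var onNormalizedTap: (CGPoint) -> Void

    var body: some View {
        GeometryReader { geometry in
            Color.clear
                .contentShape(Rectangle())
                .onTapGesture { location in
                    // Normalize to 0...1; the depth map's resolution usually
                    // differs from the on-screen view's size.
                    onNormalizedTap(CGPoint(x: location.x / geometry.size.width,
                                            y: location.y / geometry.size.height))
                }
        }
    }
}
```

The stored normalized point could then replace the fixed rowData[0] lookup in depthDataOutput: scale it by depthMapWidth and depthMapHeight, clamp to the map bounds, and read rowData[y * depthMapWidth + x].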
Replies: 0 · Boosts: 0 · Views: 533 · Activity: Dec ’23
Using LiDAR DepthData with ARKit and SceneKit
Greetings! I have made use of the Apple ARKit documentation to create a simple ARKit application that utilizes SceneKit (I tried Metal too). I am currently unsure how to make use of smoothedSceneDepth (or sceneDepth) to acquire the depth data from the depth map delivered to the view. Is there a particular method or way I can access this data for displaying the depth? I would be grateful for any inputs or suggestions. Thanks in advance.
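For context, the direction I have been exploring, as a minimal sketch: opt into the smoothedSceneDepth frame semantics and read the depth map from each ARFrame via the session delegate. The class name is hypothetical, and smoothedSceneDepth requires a LiDAR-equipped device; with SceneKit, sceneView.session.delegate would be pointed at an object like this:

```swift
import ARKit
import CoreVideo

// Minimal sketch: opt into scene depth and read the per-frame depth map.
final class DepthReceiver: NSObject, ARSessionDelegate {
    func start(session: ARSession) {
        session.delegate = self
        let configuration = ARWorldTrackingConfiguration()
        // smoothedSceneDepth is only supported on LiDAR-equipped devices.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.smoothedSceneDepth) {
            configuration.frameSemantics.insert(.smoothedSceneDepth)
        }
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // smoothedSceneDepth is temporally filtered; sceneDepth is the raw map.
        guard let depth = frame.smoothedSceneDepth ?? frame.sceneDepth else { return }
        let depthMap = depth.depthMap // CVPixelBuffer of Float32 depth values, in meters
        print("Depth map: \(CVPixelBufferGetWidth(depthMap)) x \(CVPixelBufferGetHeight(depthMap))")
    }
}
```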
Replies: 0 · Boosts: 0 · Views: 601 · Activity: Dec ’23