How can I get depth data (distance) from the iPad Pro 12.9" 2020?

Hi Everyone,
I am working with the iPad Pro 12.9" (2020) and I want to get depth data (distance) from it.
From ARFrame, I can get the depth buffer (a CVPixelBuffer in DepthFloat32 format) via sceneDepth.depthMap, but I can't get the depth data (distance) out of it. Looking around the internet, I have seen claims that Apple doesn't allow developers to access the depth data (distance). Is that right?
Could you help me with the above? I would also appreciate it if anyone has a method for getting the depth data (distance).
Thank you very much.

Answered by DTS Engineer in 670619022
Accepted Answer
Hello,


From ARFrame, I can get the depth buffer (a CVPixelBuffer in DepthFloat32 format) via sceneDepth.depthMap, but I can't get the depth data (distance) out of it. Looking around the internet, I have seen claims that Apple doesn't allow developers to access the depth data (distance). Is that right?

Sorry, I'm not really following: each value in sceneDepth.depthMap already contains depth data (the distance in meters from the rear camera). Is that somehow different from what you describe as "depth data (distance)"?


I'm assuming you want to read the depth data for a certain point within the texture on the CPU?

It's a bit more straightforward in Metal code, but here is a quick extension that might help you:

Code Block swift
import CoreVideo
import Metal
import simd

extension CVPixelBuffer {
    // Requires CVPixelBufferLockBaseAddress(_:_:) to have been called first.
    var data: UnsafeRawBufferPointer? {
        let size = CVPixelBufferGetDataSize(self)
        return .init(start: CVPixelBufferGetBaseAddress(self), count: size)
    }

    var pixelSize: simd_int2 {
        simd_int2(Int32(width), Int32(height))
    }

    var width: Int {
        CVPixelBufferGetWidth(self)
    }

    var height: Int {
        CVPixelBufferGetHeight(self)
    }

    /// Samples the pixel nearest to `location`, a normalized coordinate in [0, 1].
    func sample(location: simd_float2) -> simd_float4? {
        let pixelSize = self.pixelSize
        guard pixelSize.x > 0 && pixelSize.y > 0 else { return nil }
        guard CVPixelBufferLockBaseAddress(self, .readOnly) == noErr else { return nil }
        // Register the unlock before any early return below, so the buffer
        // is never left locked.
        defer { CVPixelBufferUnlockBaseAddress(self, .readOnly) }
        guard let data = data else { return nil }
        let pix = location * simd_float2(pixelSize)
        let clamped = clamp(simd_int2(pix), min: .zero, max: pixelSize &- simd_int2(1, 1))
        let bytesPerRow = CVPixelBufferGetBytesPerRow(self)
        let row = Int(clamped.y)
        let column = Int(clamped.x)
        let rowPtr = data.baseAddress! + row * bytesPerRow
        switch CVPixelBufferGetPixelFormatType(self) {
        case kCVPixelFormatType_DepthFloat32:
            // One 32-bit float per pixel: the distance in meters.
            let typed = rowPtr.assumingMemoryBound(to: Float.self)
            return .init(typed[column], 0, 0, 0)
        case kCVPixelFormatType_32BGRA:
            // Four bytes per pixel, stored in B, G, R, A order.
            let typed = rowPtr.assumingMemoryBound(to: UInt8.self)
            let offset = column * 4
            return simd_float4(Float(typed[offset + 2]),
                               Float(typed[offset + 1]),
                               Float(typed[offset]),
                               Float(typed[offset + 3])) / Float(UInt8.max)
        default:
            return nil
        }
    }
}
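To sanity-check this sampling logic off-device, here is a self-contained sketch that builds a small DepthFloat32 CVPixelBuffer, fills it with made-up distances, and reads one value back the same way the extension does (the 4x4 size and the fill values are assumptions for illustration, not real LiDAR data):

```swift
import CoreVideo

// Create a tiny 4x4 DepthFloat32 buffer to stand in for sceneDepth.depthMap.
var pb: CVPixelBuffer?
CVPixelBufferCreate(kCFAllocatorDefault, 4, 4,
                    kCVPixelFormatType_DepthFloat32, nil, &pb)
guard let buffer = pb else { fatalError("CVPixelBufferCreate failed") }

// Fill each pixel with a made-up distance in meters: row + column / 10.
CVPixelBufferLockBaseAddress(buffer, [])
let base = CVPixelBufferGetBaseAddress(buffer)!
let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
for row in 0..<4 {
    let rowPtr = (base + row * bytesPerRow).assumingMemoryBound(to: Float.self)
    for col in 0..<4 {
        rowPtr[col] = Float(row) + Float(col) / 10
    }
}
CVPixelBufferUnlockBaseAddress(buffer, [])

// Read one value back exactly as sample(location:) does: lock, offset by
// row * bytesPerRow, then bind the row to Float.
CVPixelBufferLockBaseAddress(buffer, .readOnly)
let depth = (CVPixelBufferGetBaseAddress(buffer)! + 2 * CVPixelBufferGetBytesPerRow(buffer))
    .assumingMemoryBound(to: Float.self)[3]
CVPixelBufferUnlockBaseAddress(buffer, .readOnly)
print(depth) // ~2.3 meters
```

Note that bytesPerRow is queried rather than computed from the width, because Core Video may pad rows for alignment.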


Thanks @darknoon for your help.

"Sorry, I'm not really following: each value in sceneDepth.depthMap already contains depth data (the distance in meters from the rear camera). Is that somehow different from what you describe as "depth data (distance)"?"
  • -> Thanks for your answer. I am new to Swift and iOS, so it took me a lot of time to read the values out of the CVPixelBuffer (via pointers).

  • -> I have two questions; could you help me with them?

1/ I tried to save the CVPixelBuffer as a 16-bit depth image. That means I have to convert the 32-bit values of the CVPixelBuffer to 16 bits, but I can't find any function for that (I had to use Python for the conversion).
I used CVPixelBuffer -> CIImage -> UIImage (1) -> Data (pngData) -> UIImage (2) -> UIImageWriteToSavedPhotosAlbum.
At (1), I don't know why I can't save the image, because I think (1) and (2) are the same type of data.
2/ The depth map is small: 192x256 (49,152) points. When I convert the depth to a point cloud it is very, very hard to see, but when I combine it with the RGB color (the camera image downscaled to 192x256, 8 bit), it looks OK. My question: is "depth" a combination of depth data and RGB data? Is that right?
Thank you very much.
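The 32-bit-to-16-bit step described in question 1 can be sketched in plain Swift by quantizing meters to UInt16 millimeters before writing the pixels into a 16-bit PNG. This is a minimal sketch: the millimeter scale factor and the helper name are assumptions (a common convention for 16-bit depth images), not an Apple API:

```swift
import Foundation

/// Quantize Float32 depth values (meters) to UInt16 millimeters, clamping
/// to the representable range. The 1000x scale is an assumed convention.
func depthToUInt16(_ depth: [Float], millimetersPerMeter: Float = 1000) -> [UInt16] {
    depth.map { meters in
        let mm = (meters * millimetersPerMeter).rounded()
        return UInt16(min(max(mm, 0), Float(UInt16.max)))
    }
}

let row: [Float] = [0.5, 1.234, 65.6]
let quantized = depthToUInt16(row)
print(quantized) // [500, 1234, 65535]
```

Depth beyond about 65.5 m saturates at UInt16.max with this scale, which is well past the LiDAR's range, so the precision loss is limited to sub-millimeter rounding.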