You're dealing with depth and color data frame by frame. If you are using the demo project from the link, you'll just need to process the data each time an image is captured. There are helper functions I've found recently here. Look specifically at UIImage.toByteArrayRGBA(). You'll need to store the results for each frame and then write the color data to a file. The depth data can be added to it as well.
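Something like this sketch (assuming toByteArrayRGBA() returns the raw RGBA8 bytes as a non-optional [UInt8]; handleCapturedImage and the frames store are placeholder names, adjust them to the demo project):

import UIKit

// Accumulate the RGBA bytes of each captured frame (placeholder storage).
var frames: [Data] = []

func handleCapturedImage(_ image: UIImage) {
    // toByteArrayRGBA() is the helper from the linked extensions; assuming
    // it returns [UInt8] (adjust if it returns an optional).
    let rgba: [UInt8] = image.toByteArrayRGBA()
    frames.append(Data(rgba))
}

// Concatenate all frames and write them to disk; the depth bytes for each
// frame could be appended alongside the color bytes here as well.
func writeFrames(to url: URL) throws {
    var blob = Data()
    for frame in frames { blob.append(frame) }
    try blob.write(to: url)
}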
I would recommend using the ArcGIS Runtime SDK for iOS. You can create a full-scale AR map to overlay on the camera feed (but it can be invisible). Then you can use real-world coordinates to pin objects to the 1-to-1-scale map. The way it works is that the user's real-world location is transferred to a digital-world location, and the runtime then moves the map, and everything pinned to it, so that the user's real-world movement is matched in the digital world's map. IMO, this is the best way to work around not being in an area supported by ARGeoTracking.
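A rough sketch of the setup, using ArcGISARView from the ArcGIS Runtime Toolkit for iOS (initializer and property names are from memory and may differ slightly between Runtime versions):

import UIKit
import ArcGIS
import ArcGISToolkit

class ARMapViewController: UIViewController {
    // renderVideoFeed: true shows the camera feed behind the (invisible) scene.
    let arView = ArcGISARView(renderVideoFeed: true)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Full scale: 1 m of real-world movement = 1 m of map movement.
        arView.translationFactor = 1

        let scene = AGSScene(basemap: .imagery())
        scene.baseSurface.opacity = 0 // keep the overlay map invisible
        arView.sceneView.scene = scene

        // Drive the scene camera from the device's real-world GPS location.
        arView.locationDataSource = AGSCLLocationDataSource()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Continuous mode keeps re-syncing the camera to location updates.
        arView.startTracking(.continuous, completion: nil)
    }
}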
Instead of a Timer in VideoReader.swift, try using CADisplayLink with a selector that points to this method:
@objc private func readBuffer(_ sender: CADisplayLink) {
    // Target the next vsync so the pixel buffer matches what is about to be displayed.
    let nextVSync = sender.timestamp + sender.duration
    let currentTime = playerItemVideoOutput.itemTime(forHostTime: nextVSync)
    if playerItemVideoOutput.hasNewPixelBuffer(forItemTime: currentTime),
       let pixelBuffer = playerItemVideoOutput.copyPixelBuffer(forItemTime: currentTime, itemTimeForDisplay: nil),
       let f = getClosestFrame(time: currentTime.seconds) {
        // Process pixelBuffer and f here
    }
}
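Create and schedule the display link like this (the .common run-loop mode keeps it firing during UI interaction):
let displayLink = CADisplayLink(target: self, selector: #selector(readBuffer(_:)))
displayLink.add(to: .main, forMode: .common)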
Hi! Thanks for your reply!
Indeed, I discovered TextureResource.DrawableQueue. This is very promising. Currently, my app flow is like this: initiate the ARView, register a setup callback with the Metal device to set up the library, and call a custom function to set up a custom MTLRenderPipelineState with my shader functions attached:
pointCloudRenderPipelineDescriptor?.vertexFunction = pointCloudVertexFunction
pointCloudRenderPipelineDescriptor?.fragmentFunction = pointCloudFragmentFunction
On ARView (RealityKit) init, I also attach the DrawableQueue to my TextureResource like this:
textureResource?.replace(withDrawables: self.colorDrawableQueue)
In the main draw function, I am passing a depth MTLTexture to the vertex shader function, and a color MTLTexture to the fragment shader function.
I am not sure how DrawableQueue enables dynamic rendering. If I'm updating the textureResource only once on init (i.e., textureResource?.replace(withDrawables: self.colorDrawableQueue)), how does the texture update dynamically? Don't I need to replace the texture every frame?
Any detailed or advanced advice on DrawableQueue and dynamic texture updating would be great! BTW, I am familiar with Metal APIs. I was stumped for a while because I needed a replacement for MTKView.currentDrawable. It seems like TextureResource.DrawableQueue is the appropriate replacement. I'm just looking for some low-level examples or documentation or advice. Thanks a mill!
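In case it clarifies the question, I assume the per-frame flow is supposed to look something like this (a sketch using my project's names; pointCloudRenderPipelineState and commandQueue are assumed to exist):

func draw() {
    // Ask the queue for the next writable texture for this frame.
    guard let drawable = try? colorDrawableQueue.nextDrawable(),
          let commandBuffer = commandQueue.makeCommandBuffer() else { return }

    // Render into the drawable's texture instead of MTKView.currentDrawable.
    let passDescriptor = MTLRenderPassDescriptor()
    passDescriptor.colorAttachments[0].texture = drawable.texture
    passDescriptor.colorAttachments[0].loadAction = .clear
    passDescriptor.colorAttachments[0].storeAction = .store

    guard let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: passDescriptor) else { return }
    encoder.setRenderPipelineState(pointCloudRenderPipelineState)
    // Encode the point cloud draw here (depth texture to the vertex stage,
    // color texture to the fragment stage).
    encoder.endEncoding()

    commandBuffer.commit()
    // Presenting hands the texture back to RealityKit for this frame, so
    // replace(withDrawables:) only needs to run once on init.
    drawable.present()
}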
Also, is it realistic to have more than one drawable queue in one render pass? Because I need to call TextureResource.Drawable.present() just after I call:
renderEncoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
drawable.present()
secondDrawable.present() // ???
thirdDrawable.present() // ???
Thanks! However, that is unfortunate news, because I need to mimic creating the MTLTextureDescriptor like this:
let depthTextureDescriptor = MTLTextureDescriptor()
depthTextureDescriptor.width = Int(drawableSize.width)
depthTextureDescriptor.height = Int(drawableSize.height)
depthTextureDescriptor.pixelFormat = .depth32Float
depthTextureDescriptor.usage = [.renderTarget, .shaderWrite]
and then create a depthTestTexture:
let depthTestTexture = renderer.device.makeTexture(descriptor: depthTextureDescriptor)
then set it on the renderPassDescriptor:
renderPassDescriptor.depthAttachment.texture = depthTestTexture
I've tried converting that to RealityKit; however, I get stuck on there not being any depth pixel formats available.
On another note: Do you happen to know if SceneKit and SCNShadable/SCNProgram allow one to set the depth pixel format?