Using the camera capture as a texture

In the WWDC Introducing ARKit video there's a demo showing how to use sceneView.snapshot() on a material.

I would like to be able to use the live video feed, before anything is rendered on top of it, as a texture. Is this going to be possible through SceneKit? Presumably I need to be looking somewhere different.

Replies

Has nobody got any suggestions? It seems possible with Metal, but is there simply no way to get at the camera data, before anything else is rendered on top of it, via SceneKit?

Yes, use capturedImage (sceneView.session.currentFrame?.capturedImage) instead of snapshot().
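
For example, a minimal sketch of grabbing that buffer, assuming an ARSCNView named sceneView with a running session:

```swift
import ARKit

// Minimal sketch: grab the most recent camera frame from the session.
// Note that capturedImage is a CVPixelBuffer in biplanar YCbCr, not RGB.
if let frame = sceneView.session.currentFrame {
    let pixelBuffer: CVPixelBuffer = frame.capturedImage
    print(CVPixelBufferGetWidth(pixelBuffer), CVPixelBufferGetHeight(pixelBuffer))
}
```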

Thank you. I'm new to SceneKit and am currently using pre-prepared scene files for everything, so it's about time I learned how to build my own geometry and assign textures to it. The only reference I've seen for this is the demo in the WWDC ARKit video, where sceneView.snapshot() is copied to textures. What I want to do is use the moving camera image from currentFrame.capturedImage as a texture, in the same way a Metal-based ARKit app does before rendering the rest of the content. Is this possible? I'm sure the approach of copying to another texture would work, but it seems like an unnecessary, potentially quite expensive overhead. Is it possible to set it up as a texture directly?
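
For context, this is roughly the level I'm at with building geometry; a sketch of the ground plane I have in mind, with a placeholder colour where the camera texture should eventually go (assuming an ARSCNView named sceneView):

```swift
import SceneKit
import UIKit

// Sketch: a hand-built ground plane; the grey diffuse colour is a
// placeholder for the camera texture I'm trying to wire up.
let plane = SCNPlane(width: 2.0, height: 2.0)
let material = SCNMaterial()
material.diffuse.contents = UIColor.gray
plane.materials = [material]

let node = SCNNode(geometry: plane)
node.eulerAngles.x = -.pi / 2   // lay the plane flat on the ground
sceneView.scene.rootNode.addChildNode(node)
```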


My goal here is to give the effect of the ground plane warping into a displaced heightfield. This means I'll also need to write a bit of special code to generate UV coordinates (I think I have the math worked out), which presumably means looking into how SceneKit handles vertex shaders.
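
From what I can tell, SceneKit's shader modifiers are the hook for this kind of vertex work. A rough sketch of a geometry modifier, where the sine wave is just a stand-in for a real heightfield:

```swift
// Sketch: a geometry shader modifier (Metal syntax) that displaces the
// plane's vertices vertically. The sine wave is a stand-in for a real
// heightfield; projected UVs would be written to _geometry.texcoords[0].
let geometryModifier = """
#pragma body
float height = 0.1 * sin(_geometry.position.x * 10.0);
_geometry.position.y += height;
"""
plane.shaderModifiers = [.geometry: geometryModifier]
```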

When I tried using capturedImage directly, with code very similar to the sceneView.snapshot() example in the WWDC presentation, all I got was a white texture. This is because currentFrame.capturedImage is not an RGB image but a CVPixelBuffer containing two planes in YCbCr ("YUV") format, one for luma and one for chroma.
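
From reading around, the standard way to get at those two planes seems to be a CVMetalTextureCache. A sketch (the function name is mine, and in real code the cache should be created once and reused rather than per frame):

```swift
import ARKit
import Metal
import CoreVideo

// Sketch: wrap each plane of the YCbCr buffer in an MTLTexture via a
// CVMetalTextureCache. `device` is whatever MTLDevice the renderer uses.
func makePlaneTextures(frame: ARFrame, device: MTLDevice) -> (MTLTexture?, MTLTexture?) {
    var cache: CVMetalTextureCache?
    guard CVMetalTextureCacheCreate(nil, nil, device, nil, &cache) == kCVReturnSuccess,
          let textureCache = cache else { return (nil, nil) }

    func texture(_ format: MTLPixelFormat, _ planeIndex: Int) -> MTLTexture? {
        let buffer = frame.capturedImage
        var cvTexture: CVMetalTexture?
        let status = CVMetalTextureCacheCreateTextureFromImage(
            nil, textureCache, buffer, nil, format,
            CVPixelBufferGetWidthOfPlane(buffer, planeIndex),
            CVPixelBufferGetHeightOfPlane(buffer, planeIndex),
            planeIndex, &cvTexture)
        guard status == kCVReturnSuccess, let cvTexture = cvTexture else { return nil }
        return CVMetalTextureGetTexture(cvTexture)
    }

    // Plane 0 is luma (Y, one channel); plane 1 is interleaved chroma (CbCr).
    return (texture(.r8Unorm, 0), texture(.rg8Unorm, 1))
}
```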


This led me to look into how a Metal-based ARKit app renders the background, which is via a custom shader that does the format conversion straight to the screen. This is well documented here: https://developer.apple.com/documentation/arkit/displaying_an_ar_experience_with_metal. Presumably I have to do the same conversion, but into an RGB texture rather than to the screen.
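
What I'm now imagining is doing that same conversion in a SceneKit surface shader modifier, sampling the two plane textures from the sketch above. Something like this, where the conversion values are the usual full-range BT.601 ones (capturedImage is a full-range biplanar buffer); the exact matrix is in Apple's Metal sample:

```swift
// Sketch: bind the two plane textures as custom material properties, then
// sample and convert them to RGB in a surface shader modifier (Metal syntax).
material.setValue(SCNMaterialProperty(contents: lumaTexture!), forKey: "lumaTexture")
material.setValue(SCNMaterialProperty(contents: chromaTexture!), forKey: "chromaTexture")

let surfaceModifier = """
#pragma arguments
texture2d<float> lumaTexture;
texture2d<float> chromaTexture;

#pragma body
constexpr sampler s(filter::linear);
float2 uv = _surface.diffuseTexcoord;
float  y    = lumaTexture.sample(s, uv).r;
float2 cbcr = chromaTexture.sample(s, uv).rg - 0.5;
_surface.diffuse = float4(y + 1.402 * cbcr.y,
                          y - 0.344 * cbcr.x - 0.714 * cbcr.y,
                          y + 1.772 * cbcr.x,
                          1.0);
"""
material.shaderModifiers = [.surface: surfaceModifier]
```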


It seems this will involve a mix of SceneKit and Metal, and I have several questions about it. Firstly, can I get access to the MTLDevice that is running under the hood of ARKit and SceneKit? And is that even the right question to be asking?
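
If I'm reading the SceneKit docs right, the view itself exposes the Metal device through SCNSceneRenderer, so something like this sketch might be enough:

```swift
// Sketch: SCNView (and therefore ARSCNView) conforms to SCNSceneRenderer,
// which exposes the MTLDevice it renders with.
if let device = sceneView.device {
    var cache: CVMetalTextureCache?
    _ = CVMetalTextureCacheCreate(nil, nil, device, nil, &cache)
}
```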


It also occurs to me that there might be a simpler way. Is there some function, possibly built into the AV APIs, that will do the conversion for me?
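
For what it's worth, one candidate seems to be Core Image, which accepts the YCbCr pixel buffer directly and does the colour conversion when rendering, though going via CGImage every frame may well be too slow. A sketch:

```swift
import CoreImage

// Sketch: Core Image accepts the YCbCr pixel buffer directly and performs
// the colour conversion on render. Creating a CGImage every frame is likely
// too slow for this use, so treat this as a correctness check only.
let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
let context = CIContext()  // create once and reuse in real code
if let cgImage = context.createCGImage(ciImage, from: ciImage.extent) {
    material.diffuse.contents = cgImage
}
```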