Render to texture or buffer while reading alpha from previous write to same buffer

I'm getting to know SceneKit and exploring its possibilities.


I'm good with OpenGL, but I'm still learning Metal.

I've written a fairly decent renderer using the lower-level Metal API, but I'd like to use SceneKit now that it's grown up in iOS 9.


I've got a good grasp of the scene graph concept, and I've loaded models from Collada and OBJ files, got good lighting, various nodes doing various animations, etc.


Now I'm exploring extending the renderer in SceneKit to more complex tasks. One thing I'm finding it hard to get my head around is how to implement a render pass that can render into a texture that can be applied to other objects in the same render loop.

Is there a recommended way to do this? I'm sure I've missed something blindingly obvious!


Is it only possible by creating an SCNTechnique, or are there other, simpler ways to render into a texture that will be used later in the render loop?

Replies

SCNTechnique is how SceneKit supports multi-pass rendering.

You specify a set of passes, either in code or in a plist file.


In some passes you can render the scene to buffers; in others you can take the contents of those buffers and combine them.
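A minimal sketch of what such a technique definition might look like, built as a Swift dictionary rather than a plist. This is an assumption-laden example: the shader program name ("combine") and target name ("sceneColor") are placeholders, and `scnView` is assumed to be an existing SCNView. The key names follow the SCNTechnique dictionary format.

```swift
import SceneKit

// Hypothetical two-pass technique: pass one draws the scene into an
// intermediate color target; pass two draws a full-screen quad that
// samples that target with a custom fragment shader.
let techniqueDef: [String: Any] = [
    "passes": [
        "renderScene": [
            "draw": "DRAW_SCENE",               // render the whole scene
            "outputs": ["color": "sceneColor"]  // into the named target
        ],
        "combine": [
            "draw": "DRAW_QUAD",                // full-screen quad
            "program": "combine",               // combine.vsh / combine.fsh (placeholder)
            "inputs": ["colorSampler": "sceneColor"],
            "outputs": ["color": "COLOR"]       // final framebuffer
        ]
    ],
    "sequence": ["renderScene", "combine"],
    "targets": [
        "sceneColor": ["type": "color"]
    ]
]

if let technique = SCNTechnique(dictionary: techniqueDef) {
    scnView.technique = technique  // scnView: an existing SCNView (assumed)
}
```

The same dictionary can live in a plist file and be loaded with `SCNTechnique(dictionary:)` after reading it with `NSDictionary(contentsOf:)`.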


There aren't many code examples out there, but some of them are close to what you want.

As far as I can tell, SCNTechnique is currently less troublesome with OpenGL.

You can render to an FBO (framebuffer object), then read it back as a texture.

It does work with OpenGL on the Mac.


I don't know whether it's easier or more troublesome with SCNTechnique. This article explains it in depth:

http://blog.simonrodriguez.fr/articles/26-08-2015_a_few_scntechnique_examples.html

The SCNRenderer class will also allow you to render a SceneKit scene to a MTLTexture. I made an example of this a little while back.
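A rough sketch of the SCNRenderer approach, under some assumptions: `scene` is an existing SCNScene, the 512×512 texture size is arbitrary, and error handling is omitted for brevity.

```swift
import SceneKit
import Metal

// Hedged sketch: render a SceneKit scene offscreen into an MTLTexture.
let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!

// Create a texture usable both as a render target and as a shader input.
let desc = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .bgra8Unorm, width: 512, height: 512, mipmapped: false)
desc.usage = [.renderTarget, .shaderRead]
let texture = device.makeTexture(descriptor: desc)!

let renderer = SCNRenderer(device: device, options: nil)
renderer.scene = scene  // scene: an existing SCNScene (assumed)

// Point a render pass at the texture.
let passDesc = MTLRenderPassDescriptor()
passDesc.colorAttachments[0].texture = texture
passDesc.colorAttachments[0].loadAction = .clear
passDesc.colorAttachments[0].storeAction = .store
passDesc.colorAttachments[0].clearColor = MTLClearColorMake(0, 0, 0, 1)

let commandBuffer = queue.makeCommandBuffer()!
renderer.render(atTime: 0,
                viewport: CGRect(x: 0, y: 0, width: 512, height: 512),
                commandBuffer: commandBuffer,
                passDescriptor: passDesc)
commandBuffer.commit()

// The texture can now be used elsewhere in the scene, e.g.:
// someMaterial.diffuse.contents = texture
```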


There are a few issues associated with this, however. One is a memory leak that I couldn't work around. The other was that it affected the physics of my scene: I'd do one render to the offscreen buffer (using SCNRenderer) and another via the SCNView, and this would speed up the physics by a factor of two. It seems SCNRenderer is meant as a replacement for the onscreen render, not a supplement.


The effect I was going for was a glow around certain objects. In the end I rendered those objects as point primitives using the Metal pipeline directly (no SceneKit), and used the resulting texture in my SceneKit scene. This only worked because the objects (bullets) could be represented as simple points.