I have created a 3D model of a local outdoor performance space, and I have an app that uses
Metal on macOS 10.14.2 to display the model. I want to create an animation
by flying the camera around the scene while recording each frame. I know how to do
the animated fly-around, and I know how to create a video frame-by-frame with
AVFoundation. The step for which I can find no information is how to capture each frame.
I have a completion handler, so I know when the GPU has finished each command buffer.
But what is the best way to get the rendered image at that point?
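For context, this is roughly how I hook into GPU completion each frame (a sketch; the readback step in the handler is the part I'm missing, and the names here are just placeholders from my own code):

```swift
import Metal

// Sketch: per-frame completion hookup. The handler fires once the GPU has
// finished all work in this command buffer, so the render target's contents
// are stable and safe to read back there.
func encodeFrame(commandBuffer: MTLCommandBuffer,
                 drawable: MTLDrawable,
                 offscreen: MTLTexture) {
    commandBuffer.addCompletedHandler { _ in
        // This is where I would capture the frame from `offscreen`
        // and hand it to my AVFoundation writer.
    }
    commandBuffer.present(drawable)
    commandBuffer.commit()
}
```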
I thought perhaps I could do this by attaching a second texture to colorAttachments[1],
but that has resulted in some odd behavior: the scene that used to fill my
MTKView window now occupies just the upper-left quadrant of the window.
What I was trying to do is write the same color to both colorAttachments[0] (my screen) and
colorAttachments[1]. In my shader I defined:
struct FragmentOut {
    float4 color0 [[ color(0) ]];
    float4 color1 [[ color(1) ]];
};
My fragment shader looks like:
fragment FragmentOut grove_fragment_function(VertexOut vIn [[ stage_in ]],
                                             constant Uniforms &uniforms [[ buffer(1) ]]) {
    FragmentOut out;
    ....
    float4 color = diffuseColor + ambientColor + specularColor;
    out.color0 = color;
    out.color1 = color;
    return out;
}
My hope was that I could then use something like:
offScreenTextureBuffer?.getBytes(buffer, bytesPerRow: 4 * w, from: MTLRegionMake2D(0, 0, w, h), mipmapLevel: 0)
to transfer the image data to a local buffer.
This doesn't seem to work, and I also get the unexpected display behavior noted above.
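For the readback itself, this is the buffer sizing I had in mind (a sketch; the getBytes call is commented out because it must wait until the completed handler fires):

```swift
import Foundation

// Sketch: sizing the CPU-side buffer for reading back a bgra8Unorm texture.
// Dimensions match my 1000×600 offscreen texture; BGRA is 4 bytes per pixel.
let w = 1000
let h = 600
let bytesPerRow = 4 * w
let byteCount = bytesPerRow * h          // 2,400,000 bytes per frame
let buffer = UnsafeMutableRawPointer.allocate(byteCount: byteCount,
                                              alignment: MemoryLayout<UInt32>.alignment)
// After the command buffer's completed handler fires, something like:
// offScreenTextureBuffer?.getBytes(buffer, bytesPerRow: bytesPerRow,
//                                  from: MTLRegionMake2D(0, 0, w, h),
//                                  mipmapLevel: 0)
print(byteCount)
buffer.deallocate()
```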
I am configuring the offScreenTextureBuffer thus:
let pixelFormat = MTLPixelFormat.bgra8Unorm_srgb
let offScreenBufferDescriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: pixelFormat,
                                                                         width: 1000,
                                                                         height: 600,
                                                                         mipmapped: false)
offScreenBufferDescriptor.usage = [.renderTarget, .shaderRead]
offScreenTextureBuffer = device.makeTexture(descriptor: offScreenBufferDescriptor)
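And this is roughly how I attach it to the render pass (a sketch; `view` is my MTKView, and I also set colorAttachments[1].pixelFormat on my render pipeline descriptor to match):

```swift
import MetalKit

// Sketch: wiring the offscreen texture in as the second color attachment
// of the same pass that draws to the screen.
func makePassDescriptor(view: MTKView, offscreen: MTLTexture) -> MTLRenderPassDescriptor? {
    guard let rpd = view.currentRenderPassDescriptor else { return nil }
    rpd.colorAttachments[1].texture = offscreen
    rpd.colorAttachments[1].loadAction = .clear
    rpd.colorAttachments[1].storeAction = .store   // keep contents after the pass
    return rpd
}
```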