Hi! I'm trying to implement hard shadows on a custom Metal rendering engine. Here's what I'm currently doing:
- First, an initial render pass that renders the model from the sun's point of view (using an orthographic projection) and writes its depth into a depth2D texture.
- Secondly, another render pass where, for each fragment, I compute the equivalent point in the sun's frame of reference, convert the coordinates to Metal's texture coordinate system, and sample the depth2D texture generated in the first pass using sample_compare to check whether that point is occluded from the sun. If it is occluded, I reduce the intensity of the output color, creating the shadow. (A sketch of this coordinate conversion is shown after this list.)
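For context, the conversion to Metal's texture coordinate system can be folded into a bias matrix applied on the CPU side. This is only a rough sketch of the idea, not my exact code; ndcToTextureCoords, sunProjectionMatrix, sunViewMatrix, and worldPosition are placeholder names. Metal's NDC puts x and y in [-1, 1] with y up and z already in [0, 1], while texture coordinates run over [0, 1] with y pointing down, so x and y get scaled and offset and y is flipped:

```swift
import simd

// Sketch: bias matrix mapping Metal NDC (x, y in [-1, 1], z in [0, 1]) to
// texture coordinates (x, y in [0, 1], with y flipped). Column-major.
let ndcToTextureCoords = simd_float4x4(columns: (
    SIMD4<Float>(0.5,  0.0, 0.0, 0.0),
    SIMD4<Float>(0.0, -0.5, 0.0, 0.0),
    SIMD4<Float>(0.0,  0.0, 1.0, 0.0),
    SIMD4<Float>(0.5,  0.5, 0.0, 1.0)
))

// With an orthographic sun projection (w == 1), the shadow coordinate passed to
// the fragment shader would then be roughly:
//   shadowCoord = ndcToTextureCoords * sunProjectionMatrix * sunViewMatrix * worldPosition
// where shadowCoord.xy is the sample position and shadowCoord.z the compare value.
```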
The problem: the texture does not seem to be sampled properly. I've tried outputting the result of the sample_compare operation directly as the fragment color, and it always returns 0, so the final image is pure black.
The weird part: if I debug it using Xcode's GPU frame capture, the image is NOT black, and the shadows are cast properly. Here's a test on a single sphere, showing white if the fragment is lit and black if it's unlit:
And another test rendering a real model with full lighting. Note how in the actual view rendered by the app, every fragment is treated as 'occluded' from the sun, so the whole model is in shadow.
Why is the result of sample_compare non-zero in the GPU frame capture but zero in the actual image displayed by the app?
Here's the relevant code of my second fragment shader (the one rendering the final image):
// shadowMap is the depth2d<float> texture written by the first (sun) pass and
// bound to the fragment stage; sphereShadowTextureCoord.xy is the shadow-map
// texture coordinate and .z the fragment's depth in the sun's clip space.
constexpr sampler shadowSampler(coord::normalized,
                                filter::linear,
                                mip_filter::none,
                                address::clamp_to_border,
                                border_color::opaque_white,
                                compare_func::less_equal);

float shadow_sample = shadowMap.sample_compare(shadowSampler,
                                               sphereShadowTextureCoord.xy,
                                               sphereShadowTextureCoord.z);

// According to the debugger, shadow_sample is greater than 0 for lit pixels and 0 otherwise.
// Not sure if this is expected (I thought it would return 1.0 if lit, 0.0 if unlit).
float is_sunlit = 0;
if (shadow_sample > 0) {
    is_sunlit = 1;
}

// Output color: darken the shaded color if the fragment is in shadow.
output.color = half4(shadedColor.r - 0.3 * (1 - is_sunlit),
                     shadedColor.g - 0.3 * (1 - is_sunlit),
                     shadedColor.b - 0.3 * (1 - is_sunlit),
                     1.0);
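On the CPU side, the shadow map produced by the first pass is bound to this shader's fragment texture slot. A minimal sketch, assuming the second pass's encoder is called renderEncoder and the shadow map sits at texture index 0 (both names/indices are placeholders and must match the [[texture(n)]] attribute in the shader):

```swift
// Bind the first pass's depth texture so the fragment shader can sample_compare it.
renderEncoder.setFragmentTexture(shadowMap, index: 0)
```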
Advice on what could be causing the problem (or how to debug it, since the debugging tools are not returning the expected results) would be appreciated. Thanks!
Fixed it! I was writing colorAttachments[1].storeAction = .store where I should have written depthAttachment.storeAction = .store, so the depth attachment was left in the default mode for depth attachments, .dontCare. It worked in the GPU frame capture because the capture stores all textures, but not in the real app, because the depth texture was being discarded before the next render pass could sample it.
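For completeness, the corrected shadow-pass setup looks roughly like this. It's a sketch, not my exact code; makeShadowPassDescriptor and shadowMap are placeholder names, with shadowMap being the depth2D texture created for the first pass:

```swift
import Metal

// Sketch: render pass descriptor for the shadow (sun) pass with the fix applied.
func makeShadowPassDescriptor(shadowMap: MTLTexture) -> MTLRenderPassDescriptor {
    let pass = MTLRenderPassDescriptor()
    pass.depthAttachment.texture = shadowMap
    pass.depthAttachment.loadAction = .clear
    pass.depthAttachment.clearDepth = 1.0
    // The important line: keep the depth texture after the pass ends.
    // The default for depth attachments (.dontCare) lets the driver discard it,
    // which is why every fragment read as occluded outside of a GPU frame capture.
    pass.depthAttachment.storeAction = .store
    return pass
}
```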