filter::linear in compute function / kernel broken on iOS (and half-broken on OS X)?

On OS X, in a compute function, (up)sampling a texture with coords::pixels and filter::linear produces an unexpected result: it looks like filter::nearest, but shifted down and to the right by half a (source) pixel. Using coords::normalized instead of coords::pixels produces the expected bilinearly interpolated result, though.


On iOS, filter::linear looks exactly like filter::nearest whether I use coords::pixels or coords::normalized, so it seems I have to use a fragment shader in order to sample a texture with a working filter::linear on iOS ...


I have filed #25274449, but I would like to know if anyone else has noticed this, since it seems strange that a bug like this wouldn't have been detected and fixed long ago.

Replies

Small Xcode project here to demonstrate the bug on OS X, should anyone want to have a look:

http://www.sloppyfocus.com/CmdLineMetalTest.zip


I would actually appreciate it if someone spotted a stupid mistake of mine and it turned out that Metal compute functions / kernels are not broken after all.


(

Tested on:

MacBook Pro, Retina, 15-inch, Late 2013, Intel Iris Pro 1536 MB, OS X 10.11.4

iPhone 6 Plus, iOS 9.3

)

Hi Jens! Just wondering, how did your bug report #25274449 turn out? Did you ever get a meaningful response?

Still broken. You don't need to use a fragment shader, though; you can just implement your own bilinear filtering logic.

Hey, I'm desperately trying to implement a 2x bilinear upscaling filter but seem to be failing due to the bugs you mentioned. Could you share the filtering logic that works for you?

Hi,


I have experienced the same effect on macOS with the Xcode 9 beta. With pixel-based coordinates and linear filtering, I get no filtering in the compute shader (no filtering meaning nearest filtering).

I have to revert my statement: I was using the sampler for an averaging 2x downscale, and I had offset the pixel coordinates by 0.5. Without this offset the result looks fine.