On OS X, in a compute function, (up)sampling a texture with coord::pixel and filter::linear produces an unexpected result: it looks like filter::nearest, but shifted down and to the right by half a (source) pixel. Using coord::normalized instead of coord::pixel produces the expected bilinearly interpolated result, though.
On iOS, filter::linear looks exactly like filter::nearest regardless of whether I use coord::pixel or coord::normalized, so it seems I have to fall back to a fragment shader in order to sample a texture with a working filter::linear on iOS ...
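For reference, this is roughly the setup I'm describing (a simplified sketch; the kernel and texture names are placeholders, not my actual code):

```metal
#include <metal_stdlib>
using namespace metal;

// Linear filtering with pixel-addressed coordinates -- this is the
// combination that misbehaves in a compute function as described above.
constexpr sampler linearPixel(coord::pixel, filter::linear);

kernel void upsample(texture2d<float, access::sample> src [[texture(0)]],
                     texture2d<float, access::write>  dst [[texture(1)]],
                     uint2 gid [[thread_position_in_grid]])
{
    // Map each destination pixel back into source pixel coordinates,
    // sampling at texel centers (+0.5).
    float2 scale = float2(src.get_width(), src.get_height()) /
                   float2(dst.get_width(), dst.get_height());
    float2 srcCoord = (float2(gid) + 0.5) * scale;
    dst.write(src.sample(linearPixel, srcCoord), gid);
}
```

Swapping the sampler for `constexpr sampler(coord::normalized, filter::linear)` (and dividing srcCoord by the source dimensions) is what gives the correct interpolation on OS X.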
I have filed #25274449, but I would like to know if anyone else has noticed this, since it seems strange that a bug like this wouldn't have been detected and fixed long ago.