Metal's MTLBlitCommandEncoder fails with supposedly valid MTLBuffer

I am attempting to use MTLBlitCommandEncoder to copy data from a MTLBuffer into a MTLTexture under macOS 10.12.5.


I am fully aware of the 16384-pixel-wide limit on MTLTexture objects. This limit is apparently extended to 32768 only for NVIDIA Pascal-architecture GPUs.


However, according to the docs, there is a limit on the sourceBytesPerRow of a MTLBuffer (i.e. not a MTLTexture), which "must be less than or equal to 32,767 multiplied by the destination texture’s pixel size"... so 32767*4 = 131068 bytes or less.


In my case, I am well within this limit. I am trying to copy buffer data that is 24576 pixels wide (in RGBA32 format) -- so 98304 bytes per row.


I get a console message:

ValidateCopyFromBuffer:sourceOffset:sourceBytesPerRow:sourceBytesPerImage:sourceSize:toTexture:destinationSlice:destinationLevel:destinationOrigin:options:]:504: failed assertion `sourceBytesPerRow(98304) must be <= (65536).'


What gives with that?

Replies

Hello


I must admit that I don't fully understand what you're trying to achieve there. I see two possibilities:

1) Are you trying to copy data 24576 pixels wide into a smaller texture? If so, I take it you want to copy just a part of it, i.e. with a sourceSize smaller than the texture size?

2) Are you trying to copy data 24576 pixels wide into a texture that is big enough (on Pascal)? Then I guess you're hitting some blit implementation limit, such as sourceBytesPerRow being stored in 16 bits?

Could you describe your texture dimensions and the parameters you pass to copyFromBuffer?


Regards

Michal

I have a raw image buffer that is 24576x64... it's actually a frame from an AVI file. There are 32 bits/pixel (RGBA).


I can't load that into a MTLTexture directly as it's over the 16384-wide limit on textures ("Resources->Maximum 2D texture width and height" from this table). I don't want to be dependent on an NVIDIA-Pascal GPU being present.


So, my intention is to load the image into a MTLBuffer and then use a MTLBlitCommandEncoder and a set of twelve copyFromBuffer commands to "stripe" the content into a 2048x768 MTLTexture consisting of compact top-to-bottom 2048x64 stripes, all in MTLPixelFormatRGBA8Unorm format. Once it's in this "packed" format, I can use it as a MTLTexture.
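For reference, here's roughly what my striping loop looks like (a simplified, untested sketch -- the real code fills frameBuffer with the AVI frame bytes first, and the names are illustrative):

```swift
import Metal

// Simplified sketch: twelve blit copies striping a 24576x64 RGBA8 frame into a 2048x768 texture.
let device = MTLCreateSystemDefaultDevice()!
let commandQueue = device.makeCommandQueue()!

let frameWidth = 24576, frameHeight = 64
let stripeWidth = 2048, stripeCount = frameWidth / stripeWidth   // 12 stripes
let bytesPerPixel = 4
let sourceBytesPerRow = frameWidth * bytesPerPixel               // 98304 bytes

// Placeholder: the real code copies the decoded AVI frame bytes into this buffer.
let frameBuffer = device.makeBuffer(length: sourceBytesPerRow * frameHeight,
                                    options: .storageModeManaged)!

let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba8Unorm,
                                                    width: stripeWidth,
                                                    height: frameHeight * stripeCount,  // 2048x768
                                                    mipmapped: false)
let texture = device.makeTexture(descriptor: desc)!

let commandBuffer = commandQueue.makeCommandBuffer()!
let blit = commandBuffer.makeBlitCommandEncoder()!
for stripe in 0..<stripeCount {
    blit.copy(from: frameBuffer,
              sourceOffset: stripe * stripeWidth * bytesPerPixel,   // start of this stripe in row 0
              sourceBytesPerRow: sourceBytesPerRow,                 // 98304 -- the value the assertion complains about
              sourceBytesPerImage: sourceBytesPerRow * frameHeight,
              sourceSize: MTLSize(width: stripeWidth, height: frameHeight, depth: 1),
              to: texture,
              destinationSlice: 0,
              destinationLevel: 0,
              destinationOrigin: MTLOrigin(x: 0, y: stripe * frameHeight, z: 0))
}
blit.endEncoding()
commandBuffer.commit()
```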


The rub is this... according to the docs, my source rows can be up to 32767*4 bytes/pixel = 131068 bytes in length. I'm only hitting 24576*4=98304 bytes, and it's failing.

OK, so this looks like a bug, and if I were you, I'd report it. In the meantime, you can get the effect you want by preparing your own "big image blitter", so to speak. This can be either a compute shader (that reads from the MTLBuffer and writes to the MTLTexture) or a pair of vertex/fragment shaders composed into a pipeline that renders into the texture you want your data in. A bit more work, but it should do; see the sketch below.
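Something along these lines is what I mean by a compute-shader blitter (an untested sketch, hard-coded to your 24576x64 -> 2048x768 case; the kernel and host names are illustrative):

```swift
import Metal

// Untested sketch of a compute "blitter": reads the flat 24576x64 RGBA8 buffer
// and writes it into a 2048x768 texture as twelve stacked 2048x64 stripes.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;

kernel void unpackStripes(device const uchar4 *src [[buffer(0)]],
                          texture2d<float, access::write> dst [[texture(0)]],
                          uint2 gid [[thread_position_in_grid]])
{
    const uint stripeWidth = 2048, frameWidth = 24576, frameHeight = 64;
    uint stripe = gid.y / frameHeight;            // which 2048-wide stripe this row belongs to
    uint srcX = stripe * stripeWidth + gid.x;     // column in the original frame
    uint srcY = gid.y % frameHeight;              // row in the original frame
    dst.write(float4(src[srcY * frameWidth + srcX]) / 255.0, gid);
}
"""

let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let library = try! device.makeLibrary(source: kernelSource, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "unpackStripes")!)

let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba8Unorm,
                                                    width: 2048, height: 768, mipmapped: false)
desc.usage = [.shaderWrite, .shaderRead]
let texture = device.makeTexture(descriptor: desc)!

// Placeholder: the real code copies the frame bytes into this buffer.
let frameBuffer = device.makeBuffer(length: 24576 * 64 * 4, options: .storageModeManaged)!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(frameBuffer, offset: 0, index: 0)
encoder.setTexture(texture, index: 0)
encoder.dispatchThreadgroups(MTLSize(width: 2048 / 16, height: 768 / 16, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: 16, height: 16, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
```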

Wouldn't it be possible to simply make a texture from that 24576x64 array but with size 2048x768? The order of the data in memory wouldn't change, but the texture would just interpret one row of 24576 pixels as 12 rows of 2048 pixels.
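For example (a rough, untested sketch -- frameBytes stands in for the raw frame data):

```swift
import Foundation
import Metal

// Rough sketch: upload the same linear bytes under a 2048x768 interpretation.
// `frameBytes` is assumed to hold the raw 24576x64 RGBA8 frame (24576*64*4 bytes).
let device = MTLCreateSystemDefaultDevice()!
let frameBytes = Data(count: 24576 * 64 * 4)   // placeholder for the real frame data

let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba8Unorm,
                                                    width: 2048, height: 768, mipmapped: false)
let texture = device.makeTexture(descriptor: desc)!

// Texture row k becomes pixels k*2048 ..< (k+1)*2048 of the flat frame,
// so the memory order is untouched -- only the interpretation changes.
frameBytes.withUnsafeBytes { raw in
    texture.replaceRegion(MTLRegionMake2D(0, 0, 2048, 768),
                          mipmapLevel: 0,
                          withBytes: raw.baseAddress!,
                          bytesPerRow: 2048 * 4)
}
```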

"This limit is apparently extended to 32768 only for NVIDIA Pascal-architecture GPU's."


Where did you see that? There aren't any Macs that have an nVidia Pascal GPU. nVidia sells a Titan eGPU that is not supported by Apple. If you're using that, you should ask nVidia.

Yes, I was referring to the beta nVidia Pascal drivers released back in April 2017 for legacy MacPro5,1 and earlier machines. Apparently, use of 32768-wide textures is possible with those, though not officially supported by Apple.


I hope that clears up my ancillary comment -- which is in no way related to either an error in the Apple docs or a bug in the Metal API. Bug report #33128631.