I came to the forum because I have been having the exact same problem. My code is in Objective-C, but I have been trying to use the equivalent method on MTKTextureLoader. I went down the same path as you with . . . odd results.
In my code, I have an array of texture file names that correspond to the texture names in Xcode's image set. I loop through the array, passing each name to [textureLoader newTextureWithName:imageName scaleFactor:1.0 bundle:nil options:loaderOptions error:&error];.
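For reference, my loop looks roughly like this (the texture names and the contents of loaderOptions are placeholders, not the actual values from my project):

```objc
#import <MetalKit/MetalKit.h>

NSArray<NSString *> *textureNames = @[@"grass", @"stone", @"water"]; // placeholder names
NSMutableArray<id<MTLTexture>> *textures = [NSMutableArray array];
NSDictionary *loaderOptions = @{ MTKTextureLoaderOptionSRGB : @NO }; // example options

for (NSString *imageName in textureNames) {
    NSError *error = nil;
    id<MTLTexture> texture = [textureLoader newTextureWithName:imageName
                                                   scaleFactor:1.0
                                                        bundle:nil
                                                       options:loaderOptions
                                                         error:&error];
    if (texture) {
        [textures addObject:texture];
    } else {
        // This is where the "Image decoding failed" error shows up for some textures.
        NSLog(@"Failed to load %@: %@", imageName, error);
    }
}
```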
For some of the textures, the code works as expected. For others, I receive the "Image decoding failed" error in my log. I thought there might be an error in the texture name conventions, but I set up a test for that to ensure everything matches. I was also able to load the images that do not load with the MTKTextureLoader using a different framework.
I suspect Xcode's asset catalog is having trouble interfacing with Metal. I was hoping to find a workaround, but for the time being, I am loading the textures outside of the asset catalog like so:
NSError *error = nil;
NSURL *img = [[NSBundle mainBundle] URLForResource:imageName withExtension:@"png"];
id<MTLTexture> texture = [textureLoader newTextureWithContentsOfURL:img options:nil error:&error];
This approach is working fine for now, but I would love to use the asset catalog instead.
By any chance, is your project written in Swift? I was receiving the same error code when trying to use frame capture, followed immediately by a crash. I could usually reopen Xcode and see some of the frame capture data, but mostly nothing useful.
I switched the code to Objective-C and frame capture started working again so ..... yeah :\
Which API are you trying to use? Metal? There's a great WWDC session on Metal basics that will show you how to render a triangle to the screen, if that's all you are trying to do. You can put your particular coordinates into a Metal buffer. You can do it without an index buffer for a simple shape like a triangle.
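The "coordinates in a Metal buffer" part looks roughly like this (device is your id<MTLDevice>, renderEncoder is your render command encoder, and the positions are made-up clip-space coordinates):

```objc
#import <Metal/Metal.h>
#import <simd/simd.h>

// Three 2D vertex positions in clip space (made-up coordinates).
static const simd_float2 triangleVertices[] = {
    {  0.0,  0.5 },
    { -0.5, -0.5 },
    {  0.5, -0.5 },
};

// Copy the vertices into a Metal buffer; no index buffer needed for one triangle.
id<MTLBuffer> vertexBuffer = [device newBufferWithBytes:triangleVertices
                                                 length:sizeof(triangleVertices)
                                                options:MTLResourceStorageModeShared];

// Then, inside the render pass: bind the buffer and draw three vertices.
// [renderEncoder setVertexBuffer:vertexBuffer offset:0 atIndex:0];
// [renderEncoder drawPrimitives:MTLPrimitiveTypeTriangle vertexStart:0 vertexCount:3];
```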
After further testing, it seems the commands are being encoded. I can get expected results running simple calculations and reading them back on the CPU. The issue appears to be that the Metal frame capture tool doesn't play well with indirect command buffers, leading me to believe the buffer was empty when in fact it was not.
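As an example of the kind of CPU read-back check I mean (using a blit fill here instead of my actual compute kernel, just to keep the sketch self-contained):

```objc
#import <Metal/Metal.h>

id<MTLDevice> device = MTLCreateSystemDefaultDevice();
id<MTLCommandQueue> queue = [device newCommandQueue];

// A small shared buffer the GPU writes and the CPU can read directly.
id<MTLBuffer> buffer = [device newBufferWithLength:16
                                           options:MTLResourceStorageModeShared];

id<MTLCommandBuffer> commandBuffer = [queue commandBuffer];
id<MTLBlitCommandEncoder> blit = [commandBuffer blitCommandEncoder];
[blit fillBuffer:buffer range:NSMakeRange(0, 16) value:0xFF];
[blit endEncoding];
[commandBuffer commit];
[commandBuffer waitUntilCompleted];

// Read back on the CPU; if the bytes are 0xFF, the work really was encoded and run,
// regardless of what frame capture claims about the buffer.
const uint8_t *bytes = buffer.contents;
NSLog(@"first byte: 0x%02X", bytes[0]);
```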
Thanks for the suggestion. I will do so this weekend.
Thanks for the help. Would you mind clarifying what you mean by "dynamic" indexing or pointing me to a resource that might help me understand that distinction better?
Bumping this question now that Apple has announced an M1 chip in the 2021 iPad Pro. I cannot find any documentation that says the new iPad Pro can run macOS apps such as Xcode, but the device should easily have the power to do so. And with up to 2TB of storage, the iPad Pro might be more powerful than the new iMac.
Can anyone comment on this now? If the iPad Pro can run macOS apps the same way the MacBook Pro can run iOS apps, the new iPad is a total game changer.