Hello,
I'm exporting a 3D model with a shader created in Reality Composer Pro's Shader Graph to a USDZ file for viewing in QuickLook.
Once exported, the USDZ file in QuickLook appears without the shader material.
When I import the file back into Xcode, the material renders properly.
Is it possible to publish MaterialX shaders to be viewed in QuickLook?
Hopefully it is, since the push toward MaterialX and USDZ seems to be aimed at universal support :)
Any guidance is appreciated!
Hi,
I'm trying to build an app demo with Metal framework on visionOS.
I've found that when I enable the immersive space, it opens in fully immersive mode and Metal renders the background in black. I've inspected the demo code, and the clear color is (0, 0, 0, 0).
I wonder whether it's possible to render the way RealityKit does in mixed reality, so that 3D models appear in the real world without a black background. Any ideas or explanations?
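A minimal sketch of the setup I have in mind, assuming a visionOS version that supports Metal rendering in the mixed immersion style (the app structure and identifiers here are illustrative, not tested):

import SwiftUI
import CompositorServices

@main
struct MetalMixedApp: App {
    @State private var style: ImmersionStyle = .mixed

    var body: some Scene {
        ImmersiveSpace(id: "MetalSpace") {
            CompositorLayer { layerRenderer in
                // Start the Metal render loop here; keeping the clear
                // color at (0, 0, 0, 0) leaves the background transparent
                // so passthrough can show through.
            }
        }
        .immersionStyle(selection: $style, in: .mixed, .full)
    }
}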
I'm using an anaconda environment
Tensorflow-macos 2.15
Keras 2.15
Python 3.11.5
macOS 14.1 (M2)
I guess the problem is with PyCharm, because the code runs; the error is: Cannot find reference 'keras' in 'imported module tensorflow | __init__.py'.
Previously I built a model on simple MNIST data and it works, but I have the same problem.
I have tried different references and versions of Python, and I've changed environments at least 3 times, but it doesn't work.
Hi,
when compiling shaders, the metal command-line tool has more options than MTLDevice::newLibraryWithSource().
For instance, "man metal" mentions 10 optimization levels (-O0, -O1, -O2, -O3, -Ofast, -Os, ...), while the MTLCompileOptions documentation only shows two (Default, Size).
Is there a way to pass -O2 as optimization level to MTLDevice::newLibraryWithSource()?
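For reference, a sketch of the two routes I'm aware of; as far as I can tell, the optimizationLevel property is the only runtime knob, while the offline path accepts the full set of -O flags (device and shaderSource are assumed to exist already):

import Metal
import Foundation

// Route 1: runtime compilation; MTLCompileOptions exposes only
// the .default and .size optimization levels.
let options = MTLCompileOptions()
options.optimizationLevel = .size
let library = try device.makeLibrary(source: shaderSource, options: options)

// Route 2: precompile offline, where -O2 is accepted:
//   xcrun -sdk macosx metal -O2 Shaders.metal -o Shaders.metallib
// then load the precompiled library at runtime:
let precompiled = try device.makeLibrary(URL: URL(fileURLWithPath: "Shaders.metallib"))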
Thanks
Good afternoon. I had a developer account, and for years I developed several games for Apple platforms.
A few years ago I checked my old developer email and found a confirmation of a payment owed to me by Apple for the earnings from my apps. Could you check whether Apple left any payments pending on my old developer account?
I would appreciate an answer, because even my old games were removed from the App Store.
Thank you!
I've ended up here following the instructions from the installer for the community extender of AGPT, hoping to provide it with the DMG for the official Apple GPTK, but when I followed the link in the installer, it wasn't there. I searched for "GPTK" and "Game Porting Toolkit" to no avail. I'm running Sonoma on a 2023 M2 Pro MBP with 32 GB of RAM, plenty powerful and up to date enough for me to have a genuine reason to use this tool. What gives? Was it taken offline? If so, why?
USDZ is not getting the job done for AR on iOS. Are there plans to support WebXR in future versions of iOS for iPhones and iPads, so that developers might leverage all the capabilities that the GLB file format provides? By embracing WebXR, Android is providing a much better environment for building AR experiences.
As content creators, we would like to support all of our users with a common code stack and workflow. Thanks for any insights.
I am trying to pass array data in a uniform from Swift to Metal's fragment shader. I can pass ordinary Float values (not arrays) with no problem. The structure is as follows:
struct Uniforms {
var test: [Float]
}
The value is as follows:
let floatArray: [Float] = [0.5]
I write and pass it as follows (as mentioned above, ordinary Float values can be passed this way without any problem):
commandEncoder.setFragmentBytes(&uniforms, length: MemoryLayout<Uniforms>.stride, index: 0)
The shader side should be as follows
// uniform
struct Uniforms {
float test[1];
};
Fragment Shader
// in fragment shader
float testColor = 1.0;
// for statement
for (int i = 0; i < 1; i++) {
testColor *= uniforms.test[i];
}
float a = 1.0 - testColor;
return float4(1.0,0.0,0.0,a);
I thought the 0.5 in the array would be passed, but no value comes through.
I think I'm writing something wrong; how should I write it?
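For what it's worth, here is a sketch of two workarounds, based on my understanding that a Swift [Float] inside a struct stores a reference to heap storage, so setFragmentBytes copies the array's pointer rather than its element values (untested sketch; commandEncoder is assumed to exist):

import Metal

// Workaround 1: pass the element buffer directly instead of a
// struct that wraps a Swift Array.
let floatArray: [Float] = [0.5]
floatArray.withUnsafeBytes { raw in
    commandEncoder.setFragmentBytes(raw.baseAddress!, length: raw.count, index: 0)
}

// Workaround 2: keep a struct, but store the value inline so the
// Swift memory layout matches `float test[1]` in the shader.
struct InlineUniforms {
    var test: Float  // one element; use a tuple (Float, Float, ...) for more
}
var inlineUniforms = InlineUniforms(test: 0.5)
commandEncoder.setFragmentBytes(&inlineUniforms,
                                length: MemoryLayout<InlineUniforms>.stride,
                                index: 0)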
The error is:
"Error Domain=MTLLibraryErrorDomain Code=3 ":1:10: fatal error: cannot open file './metal_types': Operation not permitted
#include "metal_types""
On my Mac mini (Intel chip), I run a Flutter application in the VS Code LLDB debugger and get this error; the Flutter application cannot draw its UI and shows a blank white window.
My Xcode version is the latest, 15.2.
The Flutter application runs normally on an M1 Mac mini in the VS Code LLDB debugger, and runs normally without the debugger on the Intel Mac mini.
In the Metal framework and Core Graphics framework locations, there is no file named "metal_types".
This didn't happen before; I could run normally in the VS Code LLDB debugger on both the Intel and M1 Mac minis.
If anyone knows anything, please comment.
Thank you!
Hello All -
I'm receiving .usdz files from a client. When previewing a .usdz file in Reality Converter, the materials show up as expected. But when I load the .usdz into Reality Composer Pro, all the materials show up as grey. I've attached an image of the errors I get inside Reality Converter to help troubleshoot.
What steps can I take to get these materials working in my Reality Composer Pro project? Thanks!
I have two pictures: one should render only to the left eye's display and one only to the right eye's display. How can I do this on Vision Pro?
device.supportsRaytracing can be used to check whether a device supports ray tracing, and it seems most devices do. However, only the M3 and iPhone 15 Pro support hardware ray tracing. How do I check programmatically whether a device supports hardware ray tracing? Thank you.
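The closest check I can think of is a GPU-family test; a sketch, on the assumption that the Apple9 family (M3 / A17 Pro) is the first with hardware ray tracing:

import Metal

// Assumption: hardware ray tracing first shipped with the Apple9
// GPU family, so the family check serves as a proxy where no
// direct "hardware ray tracing" query exists.
func supportsHardwareRaytracing(_ device: MTLDevice) -> Bool {
    device.supportsRaytracing && device.supportsFamily(.apple9)
}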
Hi,
I have a MacBook Pro with an Intel Iris Plus Graphics 640, which supports at most an OpenGL 4.1 context. Thus, I would like to use Mesa as an alternative to get a higher context, e.g. 4.5 or 4.6. I don't care about performance at this stage.
I have built Mesa with LLVM (llvm-config… etc…), but even when placing the built libGL.dylib in the same folder as the executable, or setting DYLD_LIBRARY_PATH or DYLD_INSERT_LIBRARIES, I am not able to get the Mesa renderer.
Does Apple support this kind of overriding of libGL with a custom one?
Any help/guidance would be really appreciated.
Thanks
Hi,
I am required to bring my CFD simulation results onto the new Vision Pro.
The simulation should appear as a VR/AR object in the room.
I am very new to the developer world. Could someone give me a hint about which IDE, tools, etc. to use for this task?
SwiftUI, Swift, visionOS, Xcode, ...?
Once I know which IDE/tool/language to use, I will start taking courses on it.
Thanks a lot!!
As I understand it, Metal can only be used in "VR" (fully immersive) mode. But is there any way to do custom post-processing in mixed reality mode?
I have built the plugins and added them to my project, but when I run the game I get these errors:
https://github.com/apple/unityplugins
EntryPointNotFoundException: AppleCore_GetRuntimeEnvironment assembly: type: member:(null)
Apple.Core.Availability.OnApplicationStart () (at Library/PackageCache/com.apple.unityplugin.core@d07a9d20344c/Runtime/Availability.cs:40)
EntryPointNotFoundException: GKLocalPlayer_GetLocal assembly: type: member:(null)
I'm really sorry if this is not the proper place for this. I'm developing a game with an online mode that works just fine over Wi-Fi but not over the mobile network. If someone with two Apple devices could test whether multiplayer mode works over the mobile network, I would appreciate it A LOT: https://apps.apple.com/es/app/ufo-snowboard/id6474542185
I captured my office using a 3D scanner and got a USDZ file.
The file contains a 3D model and a physically based material.
I can view the file correctly, with its texture, in Xcode and Reality Composer Pro.
But when I use RealityView to present the model in an immersive space, the model renders completely black.
My guess is that my material doesn't have a shader graph?
Has anyone run into a similar issue? How can I solve it?
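One thing worth trying, sketched below on the assumption that the model is simply unlit in the immersive space; the resource names "OfficeScan" and "Sunlight" are hypothetical:

import RealityKit
import RealityKitContent

// Attach an image-based light so the PBR material has something
// to reflect; without lighting, a PBR model can render black.
RealityView { content in
    guard let model = try? await Entity(named: "OfficeScan", in: realityKitContentBundle) else { return }
    if let ibl = try? await EnvironmentResource(named: "Sunlight") {
        model.components.set(ImageBasedLightComponent(source: .single(ibl)))
        model.components.set(ImageBasedLightReceiverComponent(imageBasedLight: model))
    }
    content.add(model)
}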
Hi,
I have a MBP 2023 M3 Max 64GB with 16 Core CPU ( 4 efficiency, 12 Performance) and 40C GPU.
I've got a game (Cities: Skylines 2) successfully working using Whisky. However, only 4 cores are reported to the game, which leads to a situation where many calculations are batched up while my CPU's performance cores sit almost idle and the efficiency cores are well utilized.
I suspect this is because the game only sees 4 cores and has logic that batches the calculations differently depending on how many cores are available.
Is there a way to override how many cores the game sees, e.g. via an environment variable?
Thanks,
Dominik
Hi there, I have some existing Metal rendering / shader views that I would like to use to present stereoscopic content on the Vision Pro. Is there a Metal shader function or variable that tells me which eye we're currently rendering to inside my shader, something like Unity's unity_StereoEyeIndex? I know RealityKit has GeometrySwitchCameraIndex, so I want something similar (but outside of a RealityKit context).
Many thanks,
Rich
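P.S. A sketch of the closest route I've found, on the assumption that CompositorServices stereo rendering uses vertex amplification with one amplification per eye; the shader-side counterpart would be a [[amplification_id]] vertex attribute:

import Metal

func encodeStereo(encoder: MTLRenderCommandEncoder) {
    let mappings = [
        MTLVertexAmplificationViewMapping(viewportArrayIndexOffset: 0,
                                          renderTargetArrayIndexOffset: 0), // left eye
        MTLVertexAmplificationViewMapping(viewportArrayIndexOffset: 1,
                                          renderTargetArrayIndexOffset: 1)  // right eye
    ]
    encoder.setVertexAmplificationCount(2, viewMappings: mappings)
    // In the vertex shader, declare `ushort eye [[amplification_id]]`
    // and branch on it, much like unity_StereoEyeIndex in Unity.
}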