Hi,
I have a CUDA program that I want to convert to Metal Compute so that we can support Apple hardware.
When I wrote the CUDA version, I was able to write efficient code because I first learned about the CUDA core architecture. How the cores access memory, for instance, is very important information for writing code that accesses memory efficiently.
Now I want to do the same for the Metal compute version, but I cannot find any information about the low-level architecture, especially the things you need to know to write efficient code.
Am I missing something?
Is there some guide giving hints on the most efficient way to access memory, for instance?
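For what it's worth, Metal's rough analog of the CUDA warp is the SIMD-group, whose width a compute pipeline exposes as threadExecutionWidth, and threadgroup memory plays the role of CUDA shared memory; as in CUDA, consecutive threads should touch consecutive addresses so that loads coalesce. Below is a minimal host-side sketch of sizing threadgroups from the SIMD-group width; it is only an illustration, and kernelSource, add_arrays, and elementCount are placeholder names (buffer binding is omitted):

import Metal

// Minimal sketch: size threadgroups from the pipeline's SIMD-group width,
// Metal's rough analog of a CUDA warp. Buffer binding is omitted.
func dispatchExample(device: MTLDevice, kernelSource: String, elementCount: Int) throws {
    let library = try device.makeLibrary(source: kernelSource, options: nil)
    let pipeline = try device.makeComputePipelineState(
        function: library.makeFunction(name: "add_arrays")!) // placeholder kernel name

    let width = pipeline.threadExecutionWidth // SIMD-group width, typically 32
    let threadsPerGroup = MTLSize(width: width, height: 1, depth: 1)

    let queue = device.makeCommandQueue()!
    let commandBuffer = queue.makeCommandBuffer()!
    let encoder = commandBuffer.makeComputeCommandEncoder()!
    encoder.setComputePipelineState(pipeline)
    // dispatchThreads handles grids that are not a multiple of the threadgroup
    // size (nonuniform threadgroups, supported on recent Apple GPUs).
    encoder.dispatchThreads(MTLSize(width: elementCount, height: 1, depth: 1),
                            threadsPerThreadgroup: threadsPerGroup)
    encoder.endEncoding()
    commandBuffer.commit()
}

The Metal Shading Language Specification and Apple's "Metal Best Practices Guide" are probably the closest equivalents to the hardware chapters of the CUDA programming guide.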
I am working on an application where we plan to use Metal to render custom content directly. When the user looks at something in the rendered image, I want to get the position or ray of the cursor (the point the user is currently looking at) so I can render something else, like a crosshair. Is it possible to get this cursor position information on visionOS? How can I know whether something is being hovered over by the eyes?
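As far as I understand it, visionOS deliberately does not expose gaze data to apps: the system draws the hover highlight itself on entities that opt in, and eye position only reaches app code indirectly (for example as the location of a spatial tap when the user pinches). That opt-in works for RealityKit entities, not for content drawn directly with Metal. A minimal sketch of the opt-in, with an illustrative collision shape:

import RealityKit

// Hedged sketch: the system highlights this entity when the user looks at it,
// without ever reporting gaze coordinates to the app.
func makeHoverable(_ entity: ModelEntity) {
    entity.components.set(InputTargetComponent())
    entity.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.1)]))
    entity.components.set(HoverEffectComponent())
}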
Hello,
Documentation says CGDisplayCreateImage() is deprecated.
Is there any equivalent that can be used instead of CGDisplayCreateImage() (any function which implements the same functionality)?
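Not an official answer, but the deprecation notices generally point toward ScreenCaptureKit. A minimal sketch of a one-shot display capture with SCScreenshotManager (macOS 14 and later; the error handling here is just illustrative):

import Foundation
import ScreenCaptureKit

// Hedged sketch: capture the first display to a CGImage, roughly what
// CGDisplayCreateImage() did for the main display.
func captureMainDisplay() async throws -> CGImage {
    let content = try await SCShareableContent.excludingDesktopWindows(
        false, onScreenWindowsOnly: true)
    guard let display = content.displays.first else {
        throw CocoaError(.fileNoSuchFile) // placeholder error
    }
    let filter = SCContentFilter(display: display, excludingWindows: [])
    let config = SCStreamConfiguration()
    config.width = display.width
    config.height = display.height
    return try await SCScreenshotManager.captureImage(
        contentFilter: filter, configuration: config)
}

On macOS 12 and 13, an SCStream with a frame handler can serve the same purpose.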
Thank you for the help,
Pavel
Platform 'METAL' is experimental and not all JAX functionality may be correctly supported!
2024-03-23 22:04:38.947506: W pjrt_plugin/src/mps_client.cc:563] WARNING: JAX Apple GPU support is experimental and not all JAX functionality is correctly supported!
Metal device set to: Apple M1 Pro
systemMemory: 16.00 GB
maxCacheSize: 5.33 GB
loc("-":0:0): error: current mps dialect version is 1.0.0, can't parse version 1.1.0
/AppleInternal/Library/BuildRoots/495c257e-668e-11ee-93ce-926038f30c31/Library/Caches/com.apple.xbs/Sources/MetalPerformanceShadersGraph/mpsgraph/MetalPerformanceShadersGraph/Core/Files/MPSGraphExecutable.mm:1097: failed assertion `Error importing MLIR bytecode.
'
zsh: abort python -c 'import jax; print(jax.numpy.arange(10))'
I tried to render to two layers using vertex amplification in my mesh-shader program, but on Vision Pro only the left eye has content, and it contains the images for both eyes. When I change mapping0.renderTargetArrayIndexOffset in the encoder, the image does not move to the right eye. Can vertex amplification be used to render both eyes?
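For reference, here is a minimal, hedged sketch of the encoder-side setup I would expect for two-eye amplification into a two-slice texture array; it assumes the pipeline was built with maxVertexAmplificationCount >= 2, the pass descriptor sets renderTargetArrayLength = 2, and the color attachment is a texture array (names are illustrative):

import Metal

// Hedged sketch: map amplification 0 -> slice 0 (left eye) and
// amplification 1 -> slice 1 (right eye).
func encodeBothEyes(encoder: MTLRenderCommandEncoder) {
    let mappings = [
        MTLVertexAmplificationViewMapping(viewportArrayIndexOffset: 0,
                                          renderTargetArrayIndexOffset: 0),
        MTLVertexAmplificationViewMapping(viewportArrayIndexOffset: 0,
                                          renderTargetArrayIndexOffset: 1)
    ]
    encoder.setVertexAmplificationCount(2, viewMappings: mappings)
    // ... set the pipeline, bind buffers, and draw as usual ...
}

If both eyes end up in slice 0, it may be worth checking that the render target really is an array texture and that renderTargetArrayLength is set, since the mapping offsets are added to the render_target_array_index the shader outputs.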
I am in Unity 2022.3.21f1 using the Apple plugins for Unity with the following versions:
Apple.Core - 3.1.0
Apple.Accessibility - 1.1.0
Apple.GameController - 1.2.0
Apple.GameKit - 2.2.0
I am on macOS 14.4 (Apple Silicon) and Xcode 15.3.
I started working in a file that I haven't worked on in a while, and found that I was getting errors in Unity with the Apple.Accessibility plugin, so I updated the Apple plugins and stopped getting the errors. However, when I went to build my project (which is just for iOS), I now get the following error for each of the four plugins I have installed:
Please ensure that the build invocation (build.py, xcodebuild, or Xcode) compiled cleanly and that the build was configured to support Release on iOS.
UnityEngine.Debug:LogError (object)
Apple.Core.AppleNativeLibraryUtility:ProcessWrapperLibrary (string,UnityEditor.BuildTarget,string,UnityEditor.iOS.Xcode.PBXProject) (at ./Library/PackageCache/com.apple.unityplugin.core@ba71bdbec187/Editor/ApplePlugInEnvironment.cs:604)
Apple.GameController.Editor.AppleGameControllerBuildStep:OnProcessFrameworks (Apple.Core.AppleBuildProfile,UnityEditor.BuildTarget,string,UnityEditor.iOS.Xcode.PBXProject) (at ./Library/PackageCache/com.apple.unityplugin.gamecontroller@4ec66225948e/Editor/AppleGameControllerBuildStep.cs:61)
Apple.Core.AppleBuild:OnPostProcessBuild (UnityEditor.BuildTarget,string) (at ./Library/PackageCache/com.apple.unityplugin.core@ba71bdbec187/Editor/AppleBuild.cs:195)
UnityEditor.EditorApplication:Internal_CallGlobalEventHandler () (at /Users/bokken/build/output/unity/unity/Editor/Mono/EditorApplication.cs:493)
I downloaded these plugins from Github and built with the build.py script, and had no errors in doing so. I've tried rebuilding multiple times and even specifying the platform as Release (although the default is all so it should have built Release anyways). I've tried rolling back to previous versions of the plugins as well with no luck so far. I don't remember which exact versions I used to be on but have had no luck with the approximate ones.
Does anyone know how I can point Unity to the NativeRelease folders? I've checked that the frameworks for my libraries are there (e.g. at ../Library/PackageCache/com.apple.unityplugin.core@287366a1eaa5/NativeLibraries~/Release/iOS/AppleCoreNative.framework).
We have built the game on Unreal Engine 4 and optimised it to run on tvOS devices newer than 2017 (viz. Apple TV 4K and above). We could not bring it down to support the Apple TV HD (2015) due to its visual and memory requirements. Is there a way to exclude the Apple TV HD from the supported-device list? We couldn't find any required device capability to add to Info.plist (e.g. iphone-ipad-minimum-performance-a12; we tried it, but it does not work for tvOS builds).
I've been trying to find a C/C++ framework for apps on macOS. I couldn't find good docs on Metal. Is there a way to write C++ apps without any other library?
I've got a couple of 2D PNG assets that I want to add to a scene made of a couple of other usdz files in RCP (picture adding a couple of 2D video-game characters to a simple 3D diorama).
When I try to drag the PNGs to the workspace or the file tree…nothing happens.
I found a walkthrough on Medium (called "Importing and Exporting Personalized Objects for Augmented Reality: Reality Composer and SwiftUI" for those curious as I can't link to Medium posts here) that makes it look like users could do this with simple drag-and-drop. The Medium post is from June 2023, and in the screenshots RCP visually looks a lot more like Reality Composer on iPad, so I'm assuming it's changed a lot since then?
Is there still a way to do this? I've tried adding the 2D elements to a scene with Blender's "import images as planes," but I'm getting weird halos around them and was hoping RCP could make the process a bit easier/cleaner.
apple/apple/game-porting-toolkit 1.1 did not build
Logs:
/Users/jorge/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
/Users/jorge/Library/Logs/Homebrew/game-porting-toolkit/01.configure
/Users/jorge/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
/Users/jorge/Library/Logs/Homebrew/game-porting-toolkit/02.make
/Users/jorge/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
Hi,
I encountered an issue while building game-porting-toolkit 1.1.
Below is the output:
===================================================
2 warnings and 7 errors generated.
make: *** [dlls/crypt32/unixlib.o] Error 1
make: *** Waiting for unfinished jobs....
==> Formula
Tap: apple/apple
Path: /usr/local/Homebrew/Library/Taps/apple/homebrew-apple/Formula/game-porting-toolkit.rb
==> Configuration
HOMEBREW_VERSION: 4.2.12
ORIGIN: https://github.com/Homebrew/brew
HEAD: 780fbbc65e90fbe09629aba180a1839e9e7dbaf2
Last commit: 6 days ago
Core tap JSON: 17 Mar 14:17 UTC
Core cask tap JSON: 17 Mar 10:58 UTC
HOMEBREW_PREFIX: /usr/local
HOMEBREW_CASK_OPTS: []
HOMEBREW_MAKE_JOBS: 10
Homebrew Ruby: 3.1.4 => /usr/local/Homebrew/Library/Homebrew/vendor/portable-ruby/3.1.4/bin/ruby
CPU: 10-core 64-bit westmere
Clang: 15.0.0 build 1500
Git: 2.39.3 => /Applications/Xcode.app/Contents/Developer/usr/bin/git
Curl: 8.4.0 => /usr/bin/curl
macOS: 14.4-x86_64
CLT: 15.3.0.0.1.1708646388
Xcode: 15.3 => /Applications/XCode.app/Contents/Developer
Rosetta 2: true
==> ENV
HOMEBREW_CC: clang
HOMEBREW_CXX: clang++
Error: apple/apple/game-porting-toolkit 1.1 did not build
Logs:
/Users/mfhyy/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
/Users/mfhyy/Library/Logs/Homebrew/game-porting-toolkit/01.configure
/Users/mfhyy/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
/Users/mfhyy/Library/Logs/Homebrew/game-porting-toolkit/02.make
/Users/mfhyy/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
If reporting this issue please do so to (not Homebrew/brew or Homebrew/homebrew-core):
apple/apple
Currently, jax-metal (0.0.5) doesn't support matrix decompositions such as Cholesky, LU, or eigendecomposition on Metal. I have a 64 GB M1 Max on Sonoma 14.2.1 (23C71). Eigendecomposition raises a NotImplementedError, but Cholesky and jax.linalg.inv error out. When using the CPU by setting
export JAX_PLATFORM_NAME=CPU
then the decompositions work. The error I am getting:
XlaRuntimeError                           Traceback (most recent call last)
Cell In[33], line 1
----> 1 jsl.cho_factor(Sigma_0_inv)

File ~/miniconda3/envs/jaxmetal/lib/python3.10/site-packages/jax/_src/scipy/linalg.py:61, in cho_factor(failed resolving arguments)
     56 @_wraps(scipy.linalg.cho_factor,
     57         lax_description=_no_overwrite_and_chkfinite_doc, skip_params=('overwrite_a', 'check_finite'))
     58 def cho_factor(a: ArrayLike, lower: bool = False, overwrite_a: bool = False,
     59                check_finite: bool = True) -> tuple[Array, bool]:
     60     del overwrite_a, check_finite  # Unused
---> 61     return (cholesky(a, lower=lower), lower)

File ~/miniconda3/envs/jaxmetal/lib/python3.10/site-packages/jax/_src/scipy/linalg.py:54, in cholesky(failed resolving arguments)
     49 @_wraps(scipy.linalg.cholesky,
     50         lax_description=_no_overwrite_and_chkfinite_doc, skip_params=('overwrite_a', 'check_finite'))
     51 def cholesky(a: ArrayLike, lower: bool = False, overwrite_a: bool = False,
     52              check_finite: bool = True) -> Array:
     53     del overwrite_a, check_finite  # Unused
---> 54     return _cholesky(a, lower)

File ~/miniconda3/envs/jaxmetal/lib/python3.10/site-packages/jax/_src/compiler.py:255, in backend_compile(backend, module, options, host_callbacks)
    250     return backend.compile(built_c, compile_options=options,
    251                            host_callbacks=host_callbacks)
    252 # Some backends don't have host_callbacks option yet
    253 # TODO(sharadmv): remove this fallback when all backends allow compile
    254 # to take in host_callbacks
--> 255 return backend.compile(built_c, compile_options=options)

XlaRuntimeError: UNKNOWN: /var/folders/9d/0035yr7j3bx84h3ghpp_86pc0000gn/T/ipykernel_40684/3046650730.py:1:0: error: failed to legalize operation 'mhlo.cholesky'
/var/folders/9d/0035yr7j3bx84h3ghpp_86pc0000gn/T/ipykernel_40684/3046650730.py:1:0: note: see current operation: %5 = "mhlo.cholesky"(%4) {lower = true} : (tensor<200x200xf32>) -> tensor<200x200xf32>
In visionOS 1.1, when using com.apple.unityplugin.core 3.1.0 and com.apple.unityplugin.gamekit 2.2.0 to sign in to Apple Game Center:
var player = await GKLocalPlayer.Authenticate();
Debug.Log($"GKLocalPlayer Player: {player.DisplayName}");
Debug.Log($"GKLocalPlayer Player Alias: {player.Alias}");
it returns
GKLocalPlayer Player:
GKLocalPlayer Player Alias: Unknown
All other fields are fine, except that DisplayName is blank and Alias returns "Unknown".
However, it works fine on iOS.
As per the new App Review Guidelines, are HTML5 games provided within apps required to be embedded in the binary? I'm asking because section 4.7 of the App Review Guidelines has been updated.
I want to draw a pixel buffer directly to the screen with the Metal API.
In OpenGL I could use glDrawPixels.
How do I do that in Metal?
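There is no direct glDrawPixels equivalent, as far as I know; the usual pattern is to copy the CPU pixels into an MTLTexture and then either blit it into the layer's drawable or draw a textured quad. A hedged sketch of the blit variant, assuming a pre-created staging texture whose size and pixel format match the CAMetalLayer (BGRA8, tightly packed, width * height * 4 bytes), and assuming layer.framebufferOnly has been set to false so the drawable can be a blit destination:

import Metal
import QuartzCore

// Hedged sketch: upload CPU pixels into a staging texture, then blit the
// whole texture into the next drawable. Draw a textured quad instead if
// scaling or format conversion is needed.
func presentPixels(_ pixels: UnsafeRawPointer, width: Int, height: Int,
                   staging: MTLTexture, layer: CAMetalLayer, queue: MTLCommandQueue) {
    staging.replace(region: MTLRegionMake2D(0, 0, width, height),
                    mipmapLevel: 0,
                    withBytes: pixels,
                    bytesPerRow: width * 4)
    guard let drawable = layer.nextDrawable(),
          let commandBuffer = queue.makeCommandBuffer(),
          let blit = commandBuffer.makeBlitCommandEncoder() else { return }
    blit.copy(from: staging, to: drawable.texture) // sizes and formats must match
    blit.endEncoding()
    commandBuffer.present(drawable)
    commandBuffer.commit()
}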
Hello,
I've been trying to render these models in a visionOS app using RealityKit's Model3D API. The heart seems to appear dark all the time. Any thoughts on why this would happen?
Color.clear
    .overlay {
        Model3D(named: modelName, bundle: realityKitContentBundle) { model in
            model.resizable()
                .scaledToFit()
                .rotation3DEffect(
                    Rotation3D(
                        eulerAngles: .init(angles: orientation, order: .xyz)
                    )
                )
                .frame(depth: modelDepth)
                .offset(z: -modelDepth / 2)
                .accessibilitySortPriority(1)
        } placeholder: {
            ProgressView()
                .offset(z: -modelDepth * 0.75)
        }
    }
    .dragRotation(yawLimit: .degrees(120), pitchLimit: .degrees(20))
    .offset(z: modelDepth)
I want to render a dense point cloud in Mixed Reality view using RealityKit. How could I achieve this, if this is possible? It seems to only support rendering mesh geometries with triangle faces.
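As far as I can tell that is correct: RealityKit only renders triangle meshes, with no point or line primitive. One workaround is to expand each point into a small triangle (or quad) in a generated mesh; the sketch below does that with MeshDescriptor, with the triangle size and the lack of billboarding purely illustrative. For very large clouds, a custom Metal renderer would scale better.

import RealityKit
import simd

// Hedged sketch: build one tiny triangle per point. All triangles face +Z,
// so they thin out when viewed edge-on; a real renderer would billboard them.
func pointCloudMesh(points: [SIMD3<Float>], size: Float = 0.002) throws -> MeshResource {
    var positions: [SIMD3<Float>] = []
    var indices: [UInt32] = []
    for (i, p) in points.enumerated() {
        positions.append(p + SIMD3(-size, -size, 0))
        positions.append(p + SIMD3(size, -size, 0))
        positions.append(p + SIMD3(0, size, 0))
        let base = UInt32(i * 3)
        indices.append(contentsOf: [base, base + 1, base + 2])
    }
    var descriptor = MeshDescriptor(name: "pointCloud")
    descriptor.positions = MeshBuffer(positions)
    descriptor.primitives = .triangles(indices)
    return try MeshResource.generate(from: [descriptor])
}

An entity can then be created with something like ModelEntity(mesh: try pointCloudMesh(points: points), materials: [UnlitMaterial(color: .white)]).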
I'm using DrawableQueue to create textures that I apply to my ShaderGraphMaterial texture. My Metal renderer uses a range of alpha values as a test.
The objects displayed with the DrawableQueue texture are working as expected, but the alpha component is not.
Is this an issue with my DrawableQueue descriptor? My ShaderGraphMaterial? A missing setting on my scene objects? Or some limitation in visionOS?
DrawableQueue descriptor
let descriptor = await TextureResource.DrawableQueue.Descriptor(
    pixelFormat: .rgba8Unorm,
    width: textureResource!.width,
    height: textureResource!.height,
    usage: [.renderTarget, .shaderRead, .shaderWrite], // should match how the texture is used
    // (also tried usage: [.renderTarget])
    mipmapsMode: .none // assuming no mipmaps are needed for the text texture
)
let queue = try await TextureResource.DrawableQueue(descriptor)
queue.allowsNextDrawableTimeout = true
await textureResource!.replace(withDrawables: queue)
Draw frame:
func drawNextFrame() { // hypothetical name for the method this snippet came from
    guard
        let drawable = try? drawableQueue!.nextDrawable(),
        let commandBuffer = commandQueue?.makeCommandBuffer()
    else {
        return
    }
    let renderPassDescriptor = MTLRenderPassDescriptor()
    renderPassDescriptor.colorAttachments[0].texture = drawable.texture
    renderPassDescriptor.colorAttachments[0].loadAction = .clear
    renderPassDescriptor.colorAttachments[0].storeAction = .store
    renderPassDescriptor.colorAttachments[0].clearColor = clearColor
    // Also tried an explicit alpha:
    // renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(
    //     red: clearColor.red, green: clearColor.green, blue: clearColor.blue, alpha: 0.5)
    renderPassDescriptor.renderTargetHeight = drawable.texture.height
    renderPassDescriptor.renderTargetWidth = drawable.texture.width
    guard let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor) else {
        return
    }
    renderEncoder.pushDebugGroup("DrawNextFrameWithColor")
    // No pipeline state or draw calls are needed: the .clear load action alone
    // fills the drawable with the clear color (including its alpha).
    renderEncoder.popDebugGroup()
    renderEncoder.endEncoding()
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
    drawable.present()
}
On macOS 14.2.1 Sonoma, when our app is displayed in full screen, operations performed on the local Mac are not reflected at all on the remote machine.
This issue does not occur on macOS 13 Ventura.
If we leave full screen and shrink the window, the remote machine can be operated to some extent.
Were there any specification or technical changes around screen handling between Ventura and Sonoma?
Setup: Mac at hand* - relay server - Windows machine to connect to
*The issue does not occur when connecting from Windows, Android, or iOS.
I am running the RoomPlan demo app and keep getting the above error, and when I try to find somewhere to obtain the archive in the Metal libraries, my searches come up blank. No files show up in a search that contain such identifiers. A number of messages about "deprecated" interfaces are also displayed. Is it normal to ship demo apps that are hobbled in this way?