Render advanced 3D graphics and perform data-parallel computations on graphics processors using Metal.

Posts under Metal tag

200 Posts

copyFromBuffer offset and size working even when not multiple of 4
Hi, The copyFromBuffer documentation states that on macOS, sourceOffset, destinationOffset, and size "needs to be a multiple of 4, but can be any value in iOS and tvOS". However, I have noticed that, at least on my M2 Max, this limitation does not seem to exist: there are no warnings, and the copy works correctly regardless of the offset value. I'm curious to know whether this is something that should still be avoided. Is the multiple-of-4 limitation reserved for non-Apple-silicon devices, so that the note can be ignored on Apple silicon? I ask because I am a contributor to Metal.jl, and I recently noticed that our tests pass even when copying with copyFromBuffer using offsets and sizes that are not multiples of 4. If that could cause issues or correctness problems, we would need to fix it. Thank you. Christian
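For context, a minimal sketch (in Swift) of the kind of blit copy in question; the buffer names and byte values are illustrative, and the offsets and size are kept multiples of 4 per the documented macOS requirement:

```swift
import Metal

// Minimal sketch: copy a byte range between two MTLBuffers with a blit encoder.
// The offsets and size are multiples of 4, matching the documented macOS rule.
func copyAlignedRange(queue: MTLCommandQueue, source: MTLBuffer, destination: MTLBuffer) {
    guard let commandBuffer = queue.makeCommandBuffer(),
          let blit = commandBuffer.makeBlitCommandEncoder() else { return }

    blit.copy(from: source, sourceOffset: 16,         // multiple of 4
              to: destination, destinationOffset: 32, // multiple of 4
              size: 256)                              // multiple of 4
    blit.endEncoding()

    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
}
```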
Replies: 0 · Boosts: 1 · Views: 518 · Activity: Jun ’24
Metal and Swift Concurrency
Hi, Introducing Swift Concurrency to my Metal app has been a bit challenging because Swift Concurrency is limited by the cooperative thread pool. GPU work is obviously not CPU-bound and can block forward progress, especially when using waitUntilCompleted on the command buffer. For concurrent render work this has the potential of underutilizing the CPU and even creating deadlocks. My question is: what is the Metal team's general recommendation when it comes to concurrency? It seems to me that Dispatch or OperationQueues are still the preferred way to run Metal-bound tasks for maximum performance. To integrate with Swift Concurrency, my idea is to use continuations that kick off render jobs via Dispatch or operation queues. Would this be the best way to bridge async tasks with Metal work? Thanks!
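A minimal sketch of the continuation-based bridging described above, assuming the actual encoding happens in a caller-supplied closure; the function and error names are illustrative, and addCompletedHandler is used so that no cooperative-pool thread blocks on waitUntilCompleted:

```swift
import Metal

enum RenderError: Error { case commandBufferCreationFailed }

// Hypothetical bridge: suspend the calling task until the GPU finishes,
// without blocking a cooperative-pool thread on waitUntilCompleted.
func render(using queue: MTLCommandQueue,
            encode: (MTLCommandBuffer) -> Void) async throws {
    guard let commandBuffer = queue.makeCommandBuffer() else {
        throw RenderError.commandBufferCreationFailed
    }
    encode(commandBuffer)

    await withCheckedContinuation { continuation in
        // The completion handler runs on a Metal-owned thread after GPU execution,
        // so the async caller resumes without ever blocking.
        commandBuffer.addCompletedHandler { _ in
            continuation.resume()
        }
        commandBuffer.commit()
    }
}
```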
Replies: 4 · Boosts: 0 · Views: 645 · Activity: Jun ’24
Bug? Xcode 16 macOS 15 SDK on macOS 14.5 causes Metal Shader Colors to be Wrong
I've been upgrading Xcode consistently for years and have never seen Metal shaders behave differently from one version to another until now. On macOS 14.5 with the Xcode 16 beta, several color outputs suddenly turn out completely black where there should be color. All validation is on and nothing seems to be wrong (and hasn't been since maybe Xcode version 11). I've attached two screenshots; the settings are exactly the same in both. The first shows the normal color scheme. The second, from Xcode 16, is buggy, with black + transparent colors (so it seems like the colors are either overflowing or are all 0s). Before I file a bug report or code-level support request, may I have some thoughts on how to debug this? The only clue I have is that I'm using bindless to multiply color texture samples with color values from my vertex struct. But it still fails even if I use hard-coded values for the texture samples, meaning somehow the color values are not being sent to the shader correctly? This is the most stable part of my rendering pipeline, so I'm surprised if the issue is there. Thank you.
Replies: 1 · Boosts: 0 · Views: 818 · Activity: Jun ’24
[Metal Passthrough] upperLimbVisibility not respected in sample code
Hello everyone, Super exciting stuff released this year! I was playing around with the Metal passthrough sample code (see: https://developer.apple.com/documentation/compositorservices/interacting_with_virtual_content_blended_with_passthrough) and noticed that upperLimbVisibility set to .automatic does not seem to be respected: my hand is always rendered on top.
How to reproduce:
1. Draw something.
2. Position your hand behind the brush stroke.
3. Notice that your hands are always rendered on top.
Taking a GPU frame capture reveals that the depth is correctly written.
Xcode: Version 16.0 beta (16A5171c)
visionOS: visionOS 2.0 (22N5252n)
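For reference, a minimal sketch of where that setting typically lives in a SwiftUI entry point; the app name, ContentStageConfiguration, and startRenderLoop are stand-ins for the sample project's own types and functions, not code from the original post:

```swift
import SwiftUI
import CompositorServices

@main
struct PassthroughPaintApp: App {        // hypothetical app name
    var body: some Scene {
        ImmersiveSpace(id: "MetalPassthrough") {
            CompositorLayer(configuration: ContentStageConfiguration()) { layerRenderer in
                // Hand off to the app's Metal render loop (not shown here).
                startRenderLoop(layerRenderer)
            }
        }
        .immersionStyle(selection: .constant(.mixed), in: .mixed)
        // The setting in question: .automatic lets the system decide whether
        // hands appear in front of or behind the rendered content.
        .upperLimbVisibility(.automatic)
    }
}
```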
Replies: 1 · Boosts: 0 · Views: 536 · Activity: Jul ’24
Sample Project for WWDC24 10092 Metal with Passthrough?
It’s great that we’ll be able to use Metal custom renderers in passthrough mode on visionOS. https://developer.apple.com/wwdc24/10092 This is a lot of complicated set-up, however. It’s also unclear how occlusion and custom algorithms / raytracing will work in tandem with scene understanding. May we have a project template and/or sample? Preferably with the C API and not just Swift. This would be much appreciated and helpful to everyone who wants this set-up. I’d like to see the whole process. Thank you for introducing this feature!
Replies: 3 · Boosts: 1 · Views: 738 · Activity: 1w
Metal 3.2 device memory coherency
I am seeking clarification regarding the new device-coherent memory (buffers and textures) in Metal 3.2. Do I understand the documentation correctly that this feature allows threads from different threadgroups to update data in device memory cooperatively? The documentation mentions, "[results of operations] are visible to other threads across thread groups if you synchronize them properly." How does one do proper synchronization? From what I understand, Metal has no device-scoped barriers.
Replies: 1 · Boosts: 0 · Views: 836 · Activity: Jun ’24
Wrong hitTest results in iOS 17.2
We’re experiencing an issue with wrong SceneKit hit-testing results in iOS 17.2 compared with iOS 16.1, when using either the Metal or OpenGLES2 engine. Tapping on a 3D model to place an SCNNode:

// pointInScene: tapped point
let hitResults = sceneView.hitTest(pointInScene, options: nil)
return hitResults.first { $0.node.name?.compare("node_name") == .orderedSame }
Replies: 3 · Boosts: 0 · Views: 659 · Activity: Sep ’24
Does anyone know if HDR video is supported in a RealityView?
I have attempted to use VideoMaterial with an HDR HLS stream, and also a TextureResource.DrawableQueue with rgba16Float in a ShaderGraphMaterial. I'm capturing to 64RGBAHalf with AVPlayerItemVideoOutput and converting that to rgba16Float. I don't believe it's displaying HDR properly or behaving like a raw AVPlayer. Since we can't configure any EDR metadata or color space for a RealityView, how do we display HDR video? Is using rgba16Float supposed to be enough? Is expecting the 64RGBAHalf capture to handle HDR properly a mistake, and should I capture YUV and do the conversion myself? Thank you
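For context, a minimal sketch of the capture setup being described, requesting 64RGBAHalf frames from AVPlayerItemVideoOutput; the attribute choices and helper function are illustrative, not a confirmed answer to the HDR question:

```swift
import AVFoundation
import CoreVideo

// Sketch: ask AVPlayerItemVideoOutput for half-float RGBA frames and pull the
// most recent one. Whether this alone is enough for true HDR/EDR display is
// exactly the open question in the post.
func makeHalfFloatVideoOutput() -> AVPlayerItemVideoOutput {
    let attributes: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_64RGBAHalf,
        kCVPixelBufferMetalCompatibilityKey as String: true
    ]
    return AVPlayerItemVideoOutput(pixelBufferAttributes: attributes)
}

func latestPixelBuffer(from output: AVPlayerItemVideoOutput,
                       hostTime: CFTimeInterval) -> CVPixelBuffer? {
    let itemTime = output.itemTime(forHostTime: hostTime)
    guard output.hasNewPixelBuffer(forItemTime: itemTime) else { return nil }
    return output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil)
}
```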
Replies: 7 · Boosts: 0 · Views: 930 · Activity: Oct ’24
SwiftUI full screen animation uses less energy than Metal Game template
I've got a full-screen animation of a bunch of circles filled with gradients, with plenty of (careless) overdraw, plus real-time audio processing driving the animation, plus the overhead of SwiftUI's dependency analysis, and that app uses less energy (on iPhone 13) than the Xcode "Metal Game" template, which renders a rotating textured cube (a trivial GPU workload). Why is that? How can I investigate further? Does Core Animation have access to a compositor fast path that a Metal app cannot access? Maybe another data point: when I do the same circles animation using SwiftUI's Canvas, the energy use is "Very High" and GPU utilization is also quite high. Eventually the phone's thermal state reaches "Serious" and I get a message on the device that "Charging will resume when iPhone returns to normal temperature".
Replies: 0 · Boosts: 5 · Views: 797 · Activity: May ’24
[CAMetalLayer nextDrawable] returning nil because allocation failed.
Why do I get this error almost immediately on starting my rendering pass?

2024-05-29 20:02:22.744035-0500 RoomPlanExampleApp[491:10341] [] <<<< AVPointCloudData >>>> Fig assert: "_dataBuffer" at bail (AVPointCloudData.m:217) - (err=0)
2024-05-29 20:02:22.744455-0500 RoomPlanExampleApp[491:10341] [] <<<< AVPointCloudData >>>> Fig assert: "_dataBuffer" at bail (AVPointCloudData.m:217) - (err=0)
2024-05-29 20:05:54.079981-0500 RoomPlanExampleApp[491:10025] [CAMetalLayer nextDrawable] returning nil because allocation failed.
2024-05-29 20:05:54.080144-0500 RoomPlanExampleApp[491:10341] [] <<<< AVPointCloudData >>>> Fig assert: "_dataBuffer" at bail (AVPointCloudData.m:217) - (err=0)
Replies: 7 · Boosts: 1 · Views: 1.4k · Activity: Aug ’24
Using the same texture for both input & output of a fragment shader
Hello, This exact question was already asked in this forum (8 years ago), but I can't find a definitive answer: does Metal allow using the same color texture as both an input and an output (color attachment) of a fragment shader? Is the behavior defined somewhere? I believe this results in undefined behavior under both DirectX and OpenGL, so I'd assume the same for Metal, but then why doesn't Metal warn me about this as it does about so many other "misconfigurations"? It also seems to work correctly in my case, as I found out by accident. Would love to get a clarification! Thanks ahead!
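To make the scenario concrete, a minimal sketch of the setup being asked about, with the same MTLTexture bound both as color attachment 0 and as a fragment texture argument; the names are illustrative, and nothing here claims this aliasing is defined behavior:

```swift
import Metal

// Sketch of the configuration in question: `sharedTexture` is simultaneously
// the render target (color attachment 0) and a sampled fragment input.
func encodePass(commandBuffer: MTLCommandBuffer,
                pipeline: MTLRenderPipelineState,
                sharedTexture: MTLTexture) {
    let passDescriptor = MTLRenderPassDescriptor()
    passDescriptor.colorAttachments[0].texture = sharedTexture     // output
    passDescriptor.colorAttachments[0].loadAction = .load
    passDescriptor.colorAttachments[0].storeAction = .store

    guard let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: passDescriptor) else { return }
    encoder.setRenderPipelineState(pipeline)
    encoder.setFragmentTexture(sharedTexture, index: 0)            // input (same texture)
    encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 3)
    encoder.endEncoding()
}
```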
Replies: 8 · Boosts: 1 · Views: 741 · Activity: 2w
Changing CIKernel sampler coord causes chaos
I feel like I'm missing something really simple. I've got the simplest possible CIKernel; it looks like this:

extern "C" float4 Simple(coreimage::sampler s) {
    float2 current = s.coord();
    float2 anotherCoord = float2(current.x + 1.0, current.y);
    float4 sample = s.sample(anotherCoord); // s.sample(current) works fine
    return sample;
}

It's (in my mind) incrementing the x position of the sampler by 1 and sampling the neighboring pixel. What I get in practice is a bunch of banded garbage (pictured below). The sampler seems to be pretty much undocumented, so I have no idea whether I'm incrementing by the right amount to advance one pixel. The weird banding is still present if I clamp anotherCoord to s.extent(), but it behaves normally if I sample s.coord() unchanged. I'm trying to write a box blur that samples and averages neighboring pixels and am completely blocked by this. What am I missing?
Replies: 2 · Boosts: 0 · Views: 596 · Activity: May ’24
Bluetooth keyboard events in fully immersive Vision Pro app?
I'm writing a Vision Pro app that's fully immersive and rendered using Metal. Occasionally, some users of this app would benefit from being able to use a physical keyboard (or another accessory like a game controller). It seems very straightforward to capture and handle spatial gesture events, but I cannot find an interface that allows the detection, capture, or handling of keyboard events in any of the objects associated with fully immersive Metal rendering: CompositorServices, LayerRenderer, and its associated .frame, .drawable, and .drawable.view don't seem to have any accessory awareness. Can you help me handle a keyboard event?
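One avenue worth sketching is the GameController framework, which surfaces keyboards independently of any view hierarchy; whether it actually delivers events inside a fully immersive CompositorServices session is an assumption here, not something the post confirms:

```swift
import GameController

// Sketch: observe keyboard connection and key changes via GameController.
// Availability inside a fully immersive Metal session is not verified here.
func startObservingKeyboard() {
    NotificationCenter.default.addObserver(
        forName: .GCKeyboardDidConnect, object: nil, queue: .main
    ) { notification in
        guard let keyboard = notification.object as? GCKeyboard,
              let input = keyboard.keyboardInput else { return }

        // Called whenever any key changes state (pressed or released).
        input.keyChangedHandler = { _, _, keyCode, pressed in
            print("key \(keyCode) pressed: \(pressed)")
        }
    }
}
```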
Replies: 2 · Boosts: 0 · Views: 780 · Activity: May ’24
IOSurface objects aren't released in ScreenCaptureKit
I use the ScreenCaptureKit, CoreVideo, CoreImage, and CoreMedia frameworks to capture screenshots on macOS 14.0 and higher. Example of creating a CGImageRef:

CVImageBufferRef cvImageBufferRef = ..;
CIImage* temporaryImage = [CIImage imageWithCVPixelBuffer:cvImageBufferRef];
CIContext* temporaryContext = [CIContext context];
CGImageRef imageRef = [temporaryContext createCGImage:temporaryImage fromRect:CGRectMake(0, 0, CVPixelBufferGetWidth(cvImageBufferRef), CVPixelBufferGetHeight(cvImageBufferRef))];

Profiling with Xcode Instruments (Memory Leaks & Allocations) shows constantly increasing memory usage, but no memory leaks are detected, and there are many calls that create IOSurface objects which are never released. Most of the memory is All Anonymous VM - VM: IOSurface. The heaviest stack trace is:

[RPIOSurfaceObject initWithCoder:]
[IOSurface initWithMachPort:]
IOSurfaceClientLookupFromMachPort

I don't create any IOSurface objects myself; these are low-level calls. In the Allocation List I can see many allocations of IOSurface objects, but no information about them being released. Given this, how can I release them to avoid permanently increasing memory consumption?
Replies: 2 · Boosts: 0 · Views: 866 · Activity: May ’24
Failed to perform GPU performance counter
Hi all, I am trying to measure the performance of my video game on the iOS platform. After launching the debugger and capturing a GPU workload (with "frame" selected as the scope), I cannot do performance profiling. The error message is: "Failed to enable shader profiler. (516)". And if I export the GPU trace and reopen it, Xcode cannot find any compatible devices connected, even though the same device used for capturing is connected. Device information:
Mac device: MacBook Air M2, 2022
macOS version: Sonoma 14.4
Xcode version: 15.2 (15C500b)
Mobile device: iPhone 13 Pro Max
iOS version: 15.0
GPU performance counters can be captured with the same mobile device on my colleague's Mac, so I think there might be something wrong with my Xcode.
Replies: 2 · Boosts: 0 · Views: 792 · Activity: May ’24
WebGPU bug report: problem with uniform buffer
Hi, in this WebGPU example: https://skal65535.github.io/curl/index_bug_safari.html the lighting is wrong compared to Chrome's reference version. I narrowed the problem down to the uniform value 'params.specular' at line 515 not being equal to the expected value 1.2f. The value is set in the uniform buffer at line 1078.
Platform: MacBook M1 Pro, Sonoma 14.4.1 (23E224)
Safari Technology Preview: Release 194 (Safari 17.4, WebKit 19619.1.11.111.2)
Works OK with Chrome 124.0.6367.156 (Official Build) (arm64).
Replies: 6 · Boosts: 0 · Views: 519 · Activity: May ’24
Timestamped Xcode log message: how do I parse this to find the source of an error?
With the latest Xcode running on the macOS 14.5 developer beta, log messages include a time and date, plus some other fields of indeterminate origin/type:

"2024-05-06 15:37:32.383996-0500 RoomPlanExampleApp[24190:1708576] [CAMetalLayerDrawable texture] should not be called after already presenting this drawable. Get a nextDrawable instead."

Specifically, I need to know how the string [24190:1708576] relates to a location in my application so I can act on the message. I certainly can't find the text "[CAMetalLayerDrawable texture]" anywhere in the user documentation or the developer documentation. For a diagnostic message to be actionable and remedied by a user, it must identify the module and source line of the initiating code, and there must be accessible documentation explaining potential remedies. This interface fails to supply enough information to diagnose the problem. The label [CAMetalLayerDrawable texture] cannot even be found in a search of the package information attached to the Xcode release paired with the iOS and macOS system releases.
Replies: 2 · Boosts: 0 · Views: 670 · Activity: May ’24