Post · Replies · Boosts · Views · Activity

How does rendering to a higher resolution RenderTarget and then downsampling to a Drawable cause image distortion?
Rendering the scene onto a RenderTarget with twice the resolution of the Drawable, and then downsampling to the Drawable, causes the image to appear distorted.

Modifications were made on the Xcode visionOS template. Foveation should be enabled by default:

```swift
struct ContentStageConfiguration: CompositorLayerConfiguration {
    func makeConfiguration(capabilities: LayerRenderer.Capabilities, configuration: inout LayerRenderer.Configuration) {
        configuration.depthFormat = .depth32Float
        configuration.colorFormat = .bgra8Unorm_srgb

        let foveationEnabled = capabilities.supportsFoveation
        configuration.isFoveationEnabled = foveationEnabled

        let options: LayerRenderer.Capabilities.SupportedLayoutsOptions = foveationEnabled ? [.foveationEnabled] : []
        let supportedLayouts = capabilities.supportedLayouts(options: options)

        configuration.layout = supportedLayouts.contains(.layered) ? .layered : .dedicated
    }
}
```

To avoid errors, rasterizationRateMap is not set:

```swift
var renderPassDescriptor = MTLRenderPassDescriptor()
renderPassDescriptor.colorAttachments[0].texture = self.renderTarget.currentFrameColor
renderPassDescriptor.renderTargetWidth = self.renderTarget.currentFrameColor.width
renderPassDescriptor.renderTargetHeight = self.renderTarget.currentFrameColor.height
renderPassDescriptor.colorAttachments[0].loadAction = .clear
renderPassDescriptor.colorAttachments[0].storeAction = .store
renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(red: 0.0, green: 0.0, blue: 0.0, alpha: 0.0)
renderPassDescriptor.depthAttachment.texture = self.renderTarget.currentFrameDepth
renderPassDescriptor.depthAttachment.loadAction = .clear
renderPassDescriptor.depthAttachment.storeAction = .store
renderPassDescriptor.depthAttachment.clearDepth = 0.0
//renderPassDescriptor.rasterizationRateMap = drawable.rasterizationRateMaps.first
if layerRenderer.configuration.layout == .layered {
    renderPassDescriptor.renderTargetArrayLength = drawable.views.count
}
```

The rendering process is as follows:
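The downsampling step described above (2x render target to drawable) could be sketched with MPSImageBilinearScale. This is a minimal sketch, not the poster's actual code: `commandBuffer`, `sourceTexture`, `drawableTexture`, and `device` are assumed to exist, and it assumes a plain 2D color texture (the `.dedicated` layout). Under `.layered`, the drawable's color texture is a texture array, and each slice would need its own pass, since MPS image kernels operate on 2D textures.

```swift
import Metal
import MetalPerformanceShaders

// Hypothetical downsample pass: scale the 2x-resolution offscreen color
// texture down to the drawable's color texture with a bilinear filter.
func downsample(commandBuffer: MTLCommandBuffer,
                sourceTexture: MTLTexture,
                drawableTexture: MTLTexture,
                device: MTLDevice) {
    let scale = MPSImageBilinearScale(device: device)
    // Map the full source extent onto the full destination extent.
    var transform = MPSScaleTransform(
        scaleX: Double(drawableTexture.width) / Double(sourceTexture.width),
        scaleY: Double(drawableTexture.height) / Double(sourceTexture.height),
        translateX: 0, translateY: 0)
    withUnsafePointer(to: &transform) { ptr in
        scale.scaleTransform = ptr
        scale.encode(commandBuffer: commandBuffer,
                     sourceTexture: sourceTexture,
                     destinationTexture: drawableTexture)
    }
}
```

Note that this resamples uniformly across the image, which interacts with the commented-out rasterizationRateMap: a drawable configured for foveation is not necessarily sampled uniformly.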
Replies: 2 · Boosts: 0 · Views: 326 · Activity: Apr ’24
When using CompositorLayer to render immersive spaces through the Metal API, the screen resolution cannot reach 4K
When using CompositorLayer to render immersive spaces through the Metal API, regardless of whether foveation is enabled, the resolution of the Drawable is always 1920x1824, which cannot produce 4K output. The setup code is as follows:

```swift
struct ContentStageConfiguration: CompositorLayerConfiguration {
    func makeConfiguration(capabilities: LayerRenderer.Capabilities, configuration: inout LayerRenderer.Configuration) {
        configuration.depthFormat = .depth32Float
        configuration.colorFormat = .bgra8Unorm_srgb

        let foveationEnabled = capabilities.supportsFoveation
        configuration.isFoveationEnabled = foveationEnabled // or false

        let options: LayerRenderer.Capabilities.SupportedLayoutsOptions = foveationEnabled ? [.foveationEnabled] : []
        let supportedLayouts = capabilities.supportedLayouts(options: options)

        configuration.layout = supportedLayouts.contains(.layered) ? .layered : .dedicated
    }
}
```

In the render loop, the resolution of the Drawable's color texture is always 1920x1824.
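The render-loop check described above might look like the following sketch. It is an assumption about how the poster measured the size, not their actual code; `frame` is presumed to be a `LayerRenderer.Frame` obtained from the layer renderer each frame.

```swift
import CompositorServices
import Metal

// Hypothetical per-frame check: log the extent of the drawable's first
// color texture to confirm the resolution the compositor hands back.
func logDrawableSize(frame: LayerRenderer.Frame) {
    guard let drawable = frame.queryDrawable() else { return }
    let color = drawable.colorTextures[0]
    // Reportedly always 1920 x 1824 here, with or without foveation.
    print("drawable color texture: \(color.width) x \(color.height)")
}
```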
Replies: 0 · Boosts: 0 · Views: 170 · Activity: Apr ’24
Safari Developer Tools Crashes When Recording or Stopping Recording on WebGL Web Applications on Mac
Hello everyone, I'm currently having an issue with Safari's developer tools on my Mac. I'm working on a WebGL application with a large codebase. When I try to analyze the timeline of the code, the developer tools crash every time I start or stop recording.

Here is some more specific information about my setup: MacBook Pro 13-inch, 2020; 2 GHz Intel Core i5; Intel Iris Plus Graphics 1536 MB; 32 GB 3733 MHz LPDDR4X; Ventura 13.1; Safari 16.2 (18614.3.7.1.5).

Here are the specific steps I've been taking:

1. Connect my mobile device to my Mac.
2. Open the page on my mobile device in Safari.
3. Select the webpage I need to analyze from the Develop menu in Safari.
4. Click Record in the Timeline.

I would really appreciate any help or suggestions on what could be causing this issue and how to resolve it. Thank you in advance!
Replies: 1 · Boosts: 2 · Views: 1k · Activity: Jun ’23
Why is the GPU performance of the iPhone 12 Pro worse than the iPhone 11 Pro?
The same graphics program runs in a webview, and the iPhone 12 Pro shows 15+% more GPU usage than the iPhone 11 Pro. In theory, the A14 should be stronger than the A13. Is it possible that an extra 45x96 pixels alone increases GPU overhead by 15+%? For the fragment profiling in Xcode Instruments, please see the screenshot.
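The "45x96" above matches the per-axis difference between the two native display resolutions (iPhone 11 Pro: 1125x2436, iPhone 12 Pro: 1170x2532). A quick calculation shows how much extra fill-rate work that implies, assuming both devices render at native scale:

```swift
// Native display resolutions in pixels.
let iPhone11Pro = (w: 1125, h: 2436)   // 2,740,500 px
let iPhone12Pro = (w: 1170, h: 2532)   // 2,962,440 px

let extraW = iPhone12Pro.w - iPhone11Pro.w   // 45
let extraH = iPhone12Pro.h - iPhone11Pro.h   // 96

// Ratio of total pixel counts: about 1.08, i.e. roughly 8% more pixels.
let ratio = Double(iPhone12Pro.w * iPhone12Pro.h)
          / Double(iPhone11Pro.w * iPhone11Pro.h)
print(extraW, extraH, ratio)
```

So the larger panel accounts for roughly 8% more fragment work on its own, which would not by itself explain a 15+% gap; the remainder would have to come from something else (scale factor, browser compositing, thermal state, etc.).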
Replies: 0 · Boosts: 0 · Views: 933 · Activity: Apr ’22