Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Crash after using Apple Unity plugin (GameKit-3.0.0)
Hi, I tried to save game progress using the official Apple plugin for Unity, but the app crashes when I activate "iCloud Documents" under Capabilities, and when I deactivate it this error message appears instead:

Code=27 Domain=GKErrorDomain Description=The requested operation could not be completed because you are not signed in to iCloud or have not enabled iCloud Drive. (UbiquityContainerUnavailable)

Authentication with Game Center works fine using the Core plugin, but nothing works correctly when I use the GameKit plugin. Note: I already activated iCloud for the app identifier.

Technical information:
Unity version: 2022.3.47f1 LTS
Xcode 16
Swift 6
GameKit-3.0.0 (Apple Unity plugin)
Core-3.1.5 (Apple Unity plugin)
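For reference, a minimal native Swift sketch of the same saved-games flow (an assumption on my part: it uses the plain GameKit API rather than the Unity plugin wrappers, and the save name "progress" is made up). It checks whether the iCloud (ubiquity) container is reachable before saving, which is exactly what UbiquityContainerUnavailable complains about:

import GameKit

// Minimal sketch: authenticate, check that the iCloud (ubiquity) container is
// reachable, then save a small blob with the GameKit saved-games API.
func saveProgress(_ data: Data) {
    GKLocalPlayer.local.authenticateHandler = { _, error in
        if let error { print("Game Center auth failed: \(error)"); return }
        DispatchQueue.global(qos: .utility).async {
            // nil here is what UbiquityContainerUnavailable reports: the user is not
            // signed in to iCloud or iCloud Drive is disabled for this app.
            guard FileManager.default.url(forUbiquityContainerIdentifier: nil) != nil else {
                print("No ubiquity container - check iCloud sign-in / iCloud Drive")
                return
            }
            GKLocalPlayer.local.saveGameData(data, withName: "progress") { saved, error in
                print("saved:", saved?.name ?? "nil", "error:", String(describing: error))
            }
        }
    }
}

If the container check fails even with iCloud Documents enabled, the problem is more likely the iCloud entitlement/container configuration than the plugin itself.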
Replies: 2 · Boosts: 0 · Views: 246 · Activity: 2w
build issue with game-porting-toolkit
I am trying to install the game-porting-toolkit using

brew -v install apple/apple/game-porting-toolkit

but this fails each time because of a dependency on a deprecated OpenSSL version:

Fetching dependencies for apple/apple/game-porting-toolkit: cmake, ninja, apple/apple/game-porting-toolkit-compiler, openssl@1.1
...
Error: openssl@1.1 has been disabled because it is not supported upstream! It was disabled on 2024-10-24.

Is there a way to override this dependency or use a newer version of OpenSSL for the check?
Replies: 2 · Boosts: 2 · Views: 645 · Activity: 2w
SceneKit app randomly crashes with EXC_BAD_ACCESS in jet_context::set_fragment_texture
Every now and then my SceneKit game app crashes and I have no idea why. The SCNView has an overlaySKScene, so it might also be SpriteKit's fault. The stack trace is:

#0 0x0000000241c1470c in jet_context::set_fragment_texture(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, jet_texture*) ()
#27 0x000000010572fd40 in _pthread_wqthread ()

Does anyone have an idea where I could start debugging this, without being able to consistently reproduce it?
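Not from the thread, just a hedged debugging sketch: crashes inside SceneKit's texture binding are sometimes caused by mutating the overlaySKScene (adding/removing nodes, swapping textures) from a thread other than SceneKit's render loop. The sketch below (the type and its names are made up) funnels overlay changes through the SCNSceneRendererDelegate callback so they are synchronized with rendering:

import SceneKit
import SpriteKit

// Sketch: queue overlay mutations and apply them inside SceneKit's render callback,
// so SpriteKit content is never touched while a frame is being encoded.
final class OverlayCoordinator: NSObject, SCNSceneRendererDelegate {
    private var pendingWork: [() -> Void] = []
    private let lock = NSLock()

    func enqueue(_ work: @escaping () -> Void) {
        lock.lock(); pendingWork.append(work); lock.unlock()
    }

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        lock.lock(); let work = pendingWork; pendingWork.removeAll(); lock.unlock()
        work.forEach { $0() }   // e.g. overlayScene.addChild(...), texture swaps, etc.
    }
}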
Replies: 11 · Boosts: 0 · Views: 640 · Activity: Sep ’24
SKAction.removeFromParent() causes crash when run in SCNView.overlaySKScene on iOS
Even when the action is run on the main thread, the following code causes a crash on iOS, but not on macOS. The game launches with a simple yellow rectangle, and when it finishes fading out and should be removed from the overlay scene, the app crashes. The code can be pasted into the file GameController.swift of Xcode's default project template for a Multiplatform macOS and iOS game.

import SceneKit
import SpriteKit

@MainActor
class GameController: NSObject {
    let scene: SCNScene
    let sceneRenderer: SCNSceneRenderer

    init(sceneRenderer renderer: SCNSceneRenderer) {
        sceneRenderer = renderer
        scene = SCNScene(named: "Art.scnassets/ship.scn")!
        super.init()
        sceneRenderer.scene = scene
        renderer.overlaySKScene = SKScene(size: CGSize(width: 500, height: 500))
        DispatchQueue.main.async {
            let node = SKShapeNode(rect: CGRect(x: 100, y: 100, width: 100, height: 100))
            node.fillColor = .yellow
            node.run(.sequence([
                .fadeOut(withDuration: 1),
                .removeFromParent()
            ]))
            renderer.overlaySKScene!.addChild(node)
        }
    }
}

The Xcode console shows this stack trace:

*** Assertion failure in -[UIApplication _performAfterCATransactionCommitsWithLegacyRunloopObserverBasedTiming:block:], UIApplication.m:3246
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Call must be made on main thread'
*** First throw call stack:
(
0   CoreFoundation          0x00000001804ae0f8 __exceptionPreprocess + 172
1   libobjc.A.dylib         0x0000000180087db4 objc_exception_throw + 56
2   Foundation              0x0000000180d17058 _userInfoForFileAndLine + 0
3   UIKitCore               0x00000001853cf678 -[UIApplication _performAfterCATransactionCommitsWithLegacyRunloopObserverBasedTiming:block:] + 376
4   UIKitCore               0x000000018553f7a0 -[_UIFocusUpdateThrottle scheduleProgrammaticFocusUpdate] + 300
5   UIKitCore               0x0000000184e2e22c -[UIFocusSystem _requestFocusUpdate:] + 548
6   UIKitCore               0x0000000184e2dfa4 -[UIFocusSystem requestFocusUpdateToEnvironment:] + 76
7   UIKitCore               0x0000000184e2e864 -[UIFocusSystem _focusEnvironmentWillDisappear:] + 408
8   SpriteKit               0x00000001a3d472f4 _ZL12_removeChildP6SKNodeS0_P7SKScene + 240
9   SpriteKit               0x00000001a3d473b0 -[SKNode removeChild:] + 80
10  SpriteKit               0x00000001a3d466b8 -[SKNode removeFromParent] + 128
11  SpriteKit               0x00000001a3d1678c -[SKRemove updateWithTarget:forTime:] + 64
12  SpriteKit               0x00000001a3d1b740 _ZN11SKCSequence27cpp_updateWithTargetForTimeEP7SKCNoded + 84
13  SpriteKit               0x00000001a3d20e3c _ZN7SKCNode6updateEdf + 156
14  SpriteKit               0x00000001a3d20f20 _ZN7SKCNode6updateEdf + 384
15  SpriteKit               0x00000001a3d26fb8 -[SKScene _update:] + 464
16  SpriteKit               0x00000001a3cf3168 -[SKSCNRenderer _update:] + 80
17  SceneKit                0x000000019c932bf0 -[SCNMTLRenderContext renderSKSceneWithRenderer:overlay:atTime:] + 60
18  SceneKit                0x000000019c9ebd98 -[SCNRenderer _drawOverlaySceneAtTime:] + 204
19  SceneKit                0x000000019cb1a1c0 _ZN3C3D11OverlayPass7executeERKNS_10RenderArgsE + 60
20  SceneKit                0x000000019c8e05ec _ZN3C3D13__renderSliceEPNS_11RenderGraphEPNS_10RenderPassERtRKNS0_9GraphNodeERPNS0_5StageENS_10RenderArgsEbRPU27objcproto16MTLCommandBuffer11objc_object + 2660
21  SceneKit                0x000000019c8e18ac _ZN3C3D11RenderGraph7executeEv + 3808
22  SceneKit                0x000000019c9ed26c -[SCNRenderer _renderSceneWithEngineContext:sceneTime:] + 756
23  SceneKit                0x000000019c9ed544 -[SCNRenderer _drawSceneWithNewRenderer:] + 208
24  SceneKit                0x000000019c9ed9fc -[SCNRenderer _drawScene:] + 40
25  SceneKit                0x000000019c9edce4 -[SCNRenderer _drawAtTime:] + 500
26  SceneKit                0x000000019ca87950 -[SCNView _drawAtTime:] + 368
27  SceneKit                0x000000019c943b74 __83-[NSObject(SCN_DisplayLinkExtensions) SCN_setupDisplayLinkWithQueue:screen:policy:]_block_invoke + 44
28  SceneKit                0x000000019ca50600 -[SCNDisplayLink _displayLinkCallbackReturningImmediately] + 132
29  libdispatch.dylib       0x000000010239173c _dispatch_client_callout + 16
30  libdispatch.dylib       0x0000000102394c14 _dispatch_continuation_pop + 756
31  libdispatch.dylib       0x00000001023aa4e0 _dispatch_source_invoke + 1736
32  libdispatch.dylib       0x00000001023997f0 _dispatch_lane_serial_drain + 340
33  libdispatch.dylib       0x000000010239a774 _dispatch_lane_invoke + 420
34  libdispatch.dylib       0x00000001023a71a8 _dispatch_root_queue_drain_deferred_wlh + 324
35  libdispatch.dylib       0x00000001023a6604 _dispatch_workloop_worker_thread + 488
36  libsystem_pthread.dylib 0x000000010242bb74 _pthread_wqthread + 284
37  libsystem_pthread.dylib 0x000000010242a934 start_wqthread + 8
)
libc++abi: terminating due to uncaught exception of type NSException

Am I doing something wrong?
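A hedged workaround sketch rather than a confirmed fix: since the stack shows SpriteKit's removal triggering UIKit focus work from SceneKit's render queue, one thing to try is replacing the action sequence in the snippet above with a fade followed by a removal hopped onto the main queue:

// Sketch: fade out, then hop to the main queue for the removal so UIKit's
// focus-system work runs on the main thread rather than SceneKit's render queue.
node.run(.fadeOut(withDuration: 1)) {
    DispatchQueue.main.async { [weak node] in
        node?.removeFromParent()
    }
}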
Replies: 6 · Boosts: 0 · Views: 317 · Activity: Oct ’24
SceneKit app seriously hangs when run in fullscreen
I've been running my SceneKit game for many weeks in Xcode without performance issues. The game itself is finished, so I thought I could go on with publishing it on the App Store, but when archiving it in Xcode and running the archived app, I noticed that it seriously hangs. The hangs only seem to happen when I run the game in fullscreen mode. I tried disabling Game Mode, but the hangs still happen; only when I run the game in windowed mode does it run smoothly. Instruments confirms that there are many serious hangs, but it also reports that CPU usage is quite low during those hangs, on average about 15%. From what I know, hangs happen when the main thread is busy, but how can that be when CPU usage is so low, and why does it only happen in fullscreen mode for release builds?
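Purely as a debugging sketch (it assumes an SCNSceneRendererDelegate is already wired up, and the subsystem string is made up): signpost intervals around the per-frame update make it easier to see in Instruments whether the reported hangs line up with your own frame work or with something outside it:

import os
import SceneKit

// Sketch: mark each SceneKit frame update with a signpost interval so the
// Time Profiler / Hangs tracks can be correlated with per-frame game logic.
final class FrameProfiler: NSObject, SCNSceneRendererDelegate {
    private let signposter = OSSignposter(subsystem: "com.example.game", category: "frame")

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        let state = signposter.beginInterval("update")
        // ... per-frame game logic goes here ...
        signposter.endInterval("update", state)
    }
}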
Replies: 2 · Boosts: 0 · Views: 296 · Activity: Oct ’24
OS choosing performance state poorly for GPU use case
I am building a macOS desktop app (https://anukari.com) that uses Metal compute to do real-time audio/DSP processing, as I have a problem that is highly parallelizable and too computationally expensive for the CPU. However, it seems that with the way I am using the GPU, the OS never increases the power/performance state, even when my app is fully compute-limited. Because this is a real-time audio synthesis application, not being able to take advantage of the full clock speeds the GPU is capable of is a huge problem: the app can't keep up with real time.

I discovered this issue while profiling the app using Instruments' Metal tracing (and Game tracing) modes. In the profiling configuration under "Metal Application" there is a drop-down to select the "Performance State." If I run the application under Instruments with Performance State set to Maximum, it runs amazingly well and all my problems go away. For comparison, when I run the app on its own, outside of Instruments, the expensive GPU computation takes around 2x as long to complete, meaning the app performs half as well.

I've done a ton of work to micro-optimize my Metal compute code, based on every scrap of information from the WWDC videos, etc. A problem I'm running into is that the more efficient I make my code, the less it seems to signal to the OS that I want high GPU clock speeds. I think part of why the OS is confused is that in most use cases my computation can be done with only a small number of Metal threadgroups. I'm guessing the OS heuristics see that only a small fraction of the GPU is saturated and fail to scale up the power/clock state.

I'm not sure what to do here; I'm in a bit of a bind. One possibility is to intentionally schedule busy work, spinning threadgroups just to waste energy and signal to the OS that I need higher clock speeds. This is obviously a really bad idea, but it might work. Is there any other (better) way for my app to signal to the OS that it is doing real-time, latency-sensitive computation on the GPU and needs the clock speeds scaled up?

Note that Game Mode is not really an option, as my app also runs as an AU plugin inside hosts like GarageBand, so it can't be made fullscreen, etc.
Replies: 3 · Boosts: 0 · Views: 265 · Activity: 2w
Can the simulator be used to test a GameKit app?
I'm sure this question has been asked many times before, but I cannot find a good answer; in my case it just doesn't work. Here's what I did:

1. Created a couple of test user accounts in App Store Connect under the sandbox section.
2. Launched two different simulators via Xcode.
3. On each simulator, logged in to the test iCloud account and Game Center accordingly.
4. Deployed and launched my code via Xcode.

What happens is that the GameKit code fails to find any peers no matter what I do. Sending an invite doesn't work either, because the simulator displays an error message saying "you need to log in to your iCloud account first" although it's already logged in. I tried the same code on two physical devices with two real iCloud accounts and it works as expected, but that's not a viable path for developing and debugging an app. I'm using the latest Xcode 16.1 running on 15.1. Does anybody have any clue how to solve this?
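For reference, a minimal hedged authentication sketch (UIKit assumed; presenter is whatever view controller is on screen), since the "log in to iCloud" message can also appear when the sign-in view controller GameKit hands back is never presented:

import GameKit
import UIKit

// Sketch: present Game Center's own sign-in UI when GameKit asks for it,
// and only start matchmaking once isAuthenticated is true.
func authenticate(from presenter: UIViewController) {
    GKLocalPlayer.local.authenticateHandler = { signInController, error in
        if let signInController {
            presenter.present(signInController, animated: true)   // let GameKit handle sign-in
        } else if GKLocalPlayer.local.isAuthenticated {
            print("Authenticated as \(GKLocalPlayer.local.displayName)")
            // safe to create a GKMatchRequest / GKMatchmakerViewController here
        } else {
            print("Game Center unavailable:", error?.localizedDescription ?? "unknown error")
        }
    }
}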
Replies: 1 · Boosts: 0 · Views: 202 · Activity: 3w
SceneKit custom physics fields using wrong position?
In the simplest case I can come up with, I create a scene (either fully or partially in code) with a single dynamic body, located slightly away from the origin. I give the body a charge as well as adding an electric field to the node. The body does nothing (as is to be expected, since it's the source of the field). However, if I replace that field with a custom field (which does nothing except report back the passed-in position value), the position shown is the location of the body in the local space of its parent (in this case, the root node) rather than that of the node the field is attached to (i.e. itself). I've attached the code customising the SwiftUI app template. Hopefully someone can tell me what I'm doing wrong?

ContentView customisation…

struct ContentView: View {
    var body: some View {
        SceneView(scene: ElectricScene(), options: [.allowsCameraControl, .autoenablesDefaultLighting])
    }
}

And the code to create the scene…

import Foundation
import SceneKit

class ElectricScene: SCNScene {
    override init() {
        super.init()
        physicsWorld.gravity = SCNVector3(0, 0, 0)

        let cameraNode = SCNNode()
        cameraNode.camera = SCNCamera()
        cameraNode.position = SCNVector3(0, 0, 10)
        rootNode.addChildNode(cameraNode)

        let ballNode = SCNNode(geometry: SCNSphere(radius: 0.5))
        ballNode.position = SCNVector3(2, 0, 0)
        ballNode.physicsBody = SCNPhysicsBody(type: .dynamic, shape: nil)
        ballNode.physicsBody?.charge = -1
        rootNode.addChildNode(ballNode)

//        ballNode.physicsField = SCNPhysicsField.electric()
        ballNode.physicsField = SCNPhysicsField
            .customField { position, _, _, _, _ in
                print(position)
                return SCNVector3Zero
            }
    }

    @available(*, unavailable)
    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}

This (repeatedly) prints out the following…

SCNVector3(x: 2.0, y: 0.0, z: 0.0)

…which is the position of the node relative to the root node, rather than relative to the source of the field (itself).
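A hedged sketch of one possible workaround, assuming (as the printed output suggests) that the evaluator receives the position in root/world space: convert it into the field node's own space inside the block. The weak capture avoids a retain cycle between the node and its own field:

// Sketch: express the evaluator's position relative to the field's node.
ballNode.physicsField = SCNPhysicsField.customField { [weak ballNode] position, _, _, _, _ in
    guard let ballNode else { return SCNVector3Zero }
    // Treat the incoming position as world/root-space and convert it into the
    // node's local space (use the presentation node for physics-driven motion).
    let local = ballNode.presentation.convertPosition(position, from: nil)
    print("field-local position:", local)
    return SCNVector3Zero
}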
Replies: 1 · Boosts: 0 · Views: 267 · Activity: Oct ’24
What does CAMetalLayerWantsCompositingDependencies in Info.plist do?
I've noticed that a major third-party app has the following flag set to 1/true in its Info.plist: CAMetalLayerWantsCompositingDependencies. Does anyone know if it's recognized by Core Animation / Metal, and what it's supposed to do? It might, of course, have zero relationship to the OS and just be defined by that app, for that app... but since it looks very much like an unofficial/undocumented setting, it would be great to know what problem it solves. I happen to have issues related to compositing other CALayers over a CAMetalLayer in my app, so this definitely stood out as interesting. Thank you!
Replies: 0 · Boosts: 0 · Views: 173 · Activity: 2w
Metal Inline Functions
Hi! How do I define and call an inline function in Metal, or a simple function that returns some value? Case:

inline uint index4D(constant _4D& shape,
                    constant uint& n,
                    constant uint& c,
                    constant uint& h,
                    constant uint& w) {
    return n * shape.C * shape.H * shape.W + c * shape.H * shape.W + h * shape.W + w;
}

When I call it in my kernel function I get a "No matching function for call" error. Thanks in advance.
Replies: 2 · Boosts: 0 · Views: 202 · Activity: 3w
RealityView Attachments on iOS 18 & Visually Appealing AR Labeling Alternatives
I want to use SwiftUI views as RealityKit entities to display AR labels within a RealityKit scene, and the labels could be more complicated than just text and a window, as they might include images, dynamic text, animations, WebViews, etc. visionOS enables this through RealityView attachments, and RealityView is supported on iOS 18, so I tried running the visionOS RealityView attachments code samples on iOS 18. However, the code below gives errors on iOS 18:

import SwiftUI
import RealityKit

struct PassportRealityView: View {
    let qrCodeCenter: SIMD3<Float>
    let assetID: String

    var body: some View {
        RealityView { content, attachments in
            // Setup your AR content, such as markers or 3D models
            if let qrAnchor = try? await Entity(named: "QRAnchor") {
                qrAnchor.position = qrCodeCenter
                content.add(qrAnchor)
            }
        } attachments: {
            Attachment(id: "passportTextAttachment") {
                Text(assetID)
                    .font(.title3)
                    .foregroundColor(.white)
                    .background(Color.black.opacity(0.7))
                    .padding(5)
                    .cornerRadius(5)
            }
        }
        .frame(width: 300, height: 400)
    }
}

When I remove the "attachments" keyword and its block, the errors mostly go away, but that doesn't help me, since I want to attach SwiftUI views to anchor entities in RealityKit. As I understand it, RealityView attachments are not supported on iOS 18. I wonder if there is any way to show SwiftUI views as entities on iOS 18 at this point, or am I forced to build the UI from text meshes and 3D planes? I checked out the RealityUI plugin, but it's too simple for my use case of building complex AR labels. Any advice would be appreciated. Thanks!
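A hedged sketch of one fallback on iOS (not something the post confirms, and the function and parameter names are made up): rasterize the SwiftUI label with ImageRenderer and show it on an unlit plane entity. This only covers static labels, since the rendered image has no interactivity or live animation:

import SwiftUI
import RealityKit

// Sketch: rasterize a SwiftUI view and apply it as a texture on a plane entity.
// Runs on the main actor because ImageRenderer is main-actor-isolated.
@MainActor
func makeLabelEntity<Content: View>(from view: Content, width: Float = 0.2) throws -> ModelEntity {
    let renderer = ImageRenderer(content: view)
    renderer.scale = 3   // render at higher resolution for legibility
    guard let cgImage = renderer.cgImage else {
        throw NSError(domain: "LabelEntity", code: 1)   // hypothetical error domain
    }
    let texture = try TextureResource.generate(from: cgImage, options: .init(semantic: .color))
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    let aspect = Float(cgImage.height) / Float(cgImage.width)
    let mesh = MeshResource.generatePlane(width: width, height: width * aspect)
    return ModelEntity(mesh: mesh, materials: [material])
}

Re-rendering the view and regenerating the texture whenever the label's content changes is left out of the sketch.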
Replies: 0 · Boosts: 0 · Views: 210 · Activity: 3w
How to receive keyboard/mouse on VisionOS?
I tried using the GameController APIs for this, but they didn't seem to work. Is that the recommended API for handling keyboard/mouse? The notifications for mouse and keyboard connect/disconnect don't seem to be defined for visionOS. visionOS 2.0 touts keyboard and mouse support, and the simulator can even forward keyboard/mouse input to the app, but there doesn't seem to be any sample code showing how to programmatically receive either of these. The game controller works fine (on device, not in the Simulator).
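For what it's worth, a hedged sketch of the GameController keyboard/mouse observer pattern, assuming it behaves on visionOS the way it does on iOS/macOS (which is exactly what the question is asking):

import Foundation
import GameController

// Sketch: observe keyboard/mouse connections and read input via GCKeyboard/GCMouse.
// Keep the returned observer tokens alive for as long as the callbacks are needed.
var observers: [NSObjectProtocol] = []

func startKeyboardAndMouseObservation() {
    observers.append(NotificationCenter.default.addObserver(
        forName: .GCKeyboardDidConnect, object: nil, queue: .main) { note in
        guard let keyboard = note.object as? GCKeyboard else { return }
        keyboard.keyboardInput?.keyChangedHandler = { _, _, keyCode, pressed in
            print("key \(keyCode) pressed: \(pressed)")
        }
    })
    observers.append(NotificationCenter.default.addObserver(
        forName: .GCMouseDidConnect, object: nil, queue: .main) { note in
        guard let mouse = note.object as? GCMouse else { return }
        mouse.mouseInput?.mouseMovedHandler = { _, deltaX, deltaY in
            print("mouse moved: \(deltaX), \(deltaY)")
        }
    })
    // Devices already connected before the observers were added won't re-post:
    if let keyboard = GCKeyboard.coalesced {
        print("keyboard already connected:", keyboard)
    }
}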
Replies: 1 · Boosts: 0 · Views: 189 · Activity: 3w
Metal Compute Overhead
Hello, we are experimenting with Metal to accelerate some peculiar numerical computation. Our workloads are relatively small, so the ability to avoid moving data to and from the GPU's memory is very appealing. However, we are observing higher overhead compared to CUDA, which negates the benefit of avoiding data transfer. In our tests using an empty kernel, CUDA completes in 0.001 ms (Intel i7 10700K, RTX 3080), while Metal's waitUntilCompleted takes 0.12 ms (M2 Max). As we do not have prior experience with Metal, we are wondering whether we are using the APIs correctly and this timing is expected, or if there is a way to reduce it. Thank you in advance for any comments!

test-metal.cpp
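Not an answer, just a hedged measurement sketch in Swift (the thread's own test is C++): the command buffer's GPU timestamps isolate kernel execution from the CPU-side submit/wait overhead that a blocking waitUntilCompleted measurement includes:

import Metal
import QuartzCore

// Sketch: time an empty command buffer to separate GPU execution time from
// the submit/wait overhead that waitUntilCompleted measures end to end.
func measureEmptyCommandBuffer() {
    guard let device = MTLCreateSystemDefaultDevice(),
          let queue = device.makeCommandQueue(),
          let commandBuffer = queue.makeCommandBuffer() else { return }

    let cpuStart = CACurrentMediaTime()
    commandBuffer.addCompletedHandler { cb in
        // gpuStartTime/gpuEndTime bracket only the GPU execution itself.
        let gpuMs = (cb.gpuEndTime - cb.gpuStartTime) * 1000
        print("GPU execution: \(gpuMs) ms")
    }
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
    let totalMs = (CACurrentMediaTime() - cpuStart) * 1000
    print("commit + waitUntilCompleted round trip: \(totalMs) ms")
}

If most of the 0.12 ms turns out to be the wait path rather than gpuEndTime - gpuStartTime, one common approach is to chain work with completion handlers or MTLSharedEvent instead of blocking per dispatch.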
Replies: 0 · Boosts: 0 · Views: 215 · Activity: 3w
GCControllerDidConnect notification not received in VisionOS 2.0
I am unable to get visionOS 2.0 (simulator) to receive the GCControllerDidConnect notification, and thus am unable to set up support for a gamepad. However, it works in visionOS 1.2.

For visionOS 2.0 I've tried adding:

the .handlesGameControllerEvents(matching: .gamepad) attribute to the view
Supports Controller User Interaction to Info.plist
Supported game controller types -> Extended Gamepad to Info.plist

...but the notification still doesn't fire. It does fire when the code is run in the visionOS 1.2 simulator; both simulators have the Send Game Controller To Device option enabled.

Here is the example code. It's based on the Xcode project template; the only files updated were ImmersiveView.swift and Info.plist, as detailed above:

import SwiftUI
import GameController
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
            }
            NotificationCenter.default.addObserver(
                forName: NSNotification.Name.GCControllerDidConnect,
                object: nil, queue: nil) { _ in
                print("Handling GCControllerDidConnect notification")
            }
        }
        .modify {
            if #available(visionOS 2.0, *) {
                $0.handlesGameControllerEvents(matching: .gamepad)
            } else {
                $0
            }
        }
    }
}

extension View {
    func modify<T: View>(@ViewBuilder _ modifier: (Self) -> T) -> some View {
        return modifier(self)
    }
}
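A hedged extra check rather than a known fix: the connect notification only fires for connection events, so a controller that was already paired before the observer was registered can be picked up by querying GCController.controllers() directly:

import GameController

// Sketch: handle controllers that were connected before the observer was registered.
func attachToExistingControllers() {
    for controller in GCController.controllers() {
        if let gamepad = controller.extendedGamepad {
            gamepad.valueChangedHandler = { _, element in
                print("input from \(controller.vendorName ?? "controller"): \(element)")
            }
        }
    }
}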
Replies: 1 · Boosts: 0 · Views: 358 · Activity: Sep ’24
PhotogrammetrySession on non-Pro iPhone
Hello, I'm creating an app that uses the PhotogrammetrySession class to build 3D objects from photographs (https://developer.apple.com/documentation/realitykit/creating-3d-objects-from-photographs). I'm wondering why this class works only on Pro iPhones (12 Pro, 13 Pro, 14 Pro, 15 Pro and 16 Pro) and on no non-Pro iPhone. My app does not use LiDAR, so that's not the problem. I thought it could be power-related, but the A18 SoC in the iPhone 16 is more powerful than the A14 Bionic in the iPhone 12 Pro (I could also mention the iPhone 13 Pro and iPhone 14, which both have the A15 Bionic, whereas only the first one is compatible). Did I miss something that could explain these restrictions? Is there any plan to make this class usable on every iPhone powerful enough to run it? Thanks in advance for answering me.
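For completeness, a small hedged sketch of the runtime capability check (assuming the isSupported class property RealityKit exposes on recent OS releases), which lets an app gate the feature instead of hard-coding a device-model list:

import RealityKit

// Sketch: gate object reconstruction on the runtime capability check rather than
// on a list of device models; isSupported is false on devices that can't run it.
func canReconstructObjects() -> Bool {
    guard PhotogrammetrySession.isSupported else {
        print("PhotogrammetrySession is not supported on this device")
        return false
    }
    return true
}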
Replies: 0 · Boosts: 0 · Views: 135 · Activity: 3w
SKTexture used for SceneKit object is rendered too bright
I would like to preload and use some images for both SpriteKit and SceneKit models (my game uses SceneKit with a SpriteKit overlay), and as far as I can see the only efficient way is to create and preload SKTexture objects, which can be supplied to both SKSpriteNode(texture:) and SCNMaterial.diffuse.contents. The problem is that SKTextures are rendered too bright in SceneKit, for some unknown reason. Here is a comparison between rendering an image (from URL) and an SKTexture, and the code that produces it:

let url = Bundle.main.url(forResource: "art.scnassets/texture.png", withExtension: nil)!

let plane1 = SCNPlane(width: 10, height: 10)
plane1.firstMaterial!.diffuse.contents = url.path
let node1 = SCNNode(geometry: plane1)
node1.position.x = -5
scene.rootNode.addChildNode(node1)

let plane2 = SCNPlane(width: 10, height: 10)
plane2.firstMaterial!.diffuse.contents = SKTexture(image: NSImage(byReferencing: url))
let node2 = SCNNode(geometry: plane2)
node2.position.x = 5
scene.rootNode.addChildNode(node2)

This issue was already mentioned in this other post, but since I wasn't notified of the reply from Quinn asking about the feedback number I created at the time, it didn't make any progress.
Replies: 5 · Boosts: 0 · Views: 273 · Activity: 3w