Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and sharing resources for game developers.

Post · Replies · Boosts · Views · Activity

Swift Package with Metal
Hi, I've got a Swift framework with a bunch of Metal files. Currently, to use the Swift package, users have to manually include a Metal library in their bundle, provided separately. First question: is there a way to make a Metal library target in a Swift package and just include the .metal files (without a binary asset)? Second question: if not, Swift 5.3 has resource support, so how would you recommend bundling a Metal library in a Swift package?
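For the second question, here is a minimal sketch of shipping a prebuilt .metallib as a Swift 5.3 package resource and loading it at runtime. The package name, target name, and file paths are illustrative assumptions, and the library itself is still built separately (for example with xcrun metal / metallib); SwiftPM does not compile .metal sources into a metallib target on its own.

    // Package.swift (swift-tools-version:5.3) — hypothetical package layout
    import PackageDescription

    let package = Package(
        name: "MyMetalKit",
        products: [
            .library(name: "MyMetalKit", targets: ["MyMetalKit"])
        ],
        targets: [
            .target(
                name: "MyMetalKit",
                resources: [
                    // Copy the prebuilt library verbatim into the package's resource bundle.
                    .copy("Resources/default.metallib")
                ]
            )
        ]
    )

    // Loading the bundled library from package code:
    import Foundation
    import Metal

    func makePackageLibrary(device: MTLDevice) throws -> MTLLibrary {
        guard let url = Bundle.module.url(forResource: "default", withExtension: "metallib") else {
            throw NSError(domain: "MyMetalKit", code: 1)
        }
        return try device.makeLibrary(URL: url)
    }

An alternative, if shipping a binary asset is undesirable, is to bundle the .metal sources as resources and compile them at runtime with makeLibrary(source:options:), at the cost of a compilation hit on first launch.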
10
1
7.3k
Jun ’20
How to play Vorbis/OGG files with swift?
Does anyone have a working example of how to play OGG files with Swift? I've been trying for over a year now.

I was able to wrap the C Vorbis library in Swift, and I then used it to parse an OGG file successfully. Then I was required to use Obj-C++ to fill the PCM, because this method seems to only be available in C++. That part hangs my app for a good 40 seconds to several minutes depending on the audio file; it then plays for about 2 seconds and then crashes.

I can't get the examples on the Vorbis site to work in Objective-C, and I tried every example on GitHub I could find (most of which are for iOS; I want to play the files on Mac). I also tried using the Cricket Audio framework below. https://github.com/sjmerel/ck It has a Swift example and it can play their proprietary soundbank format, but it is also supposed to play OGG and it just doesn't do anything when trying to play OGG, as you can see in the posted issue https://github.com/sjmerel/ck/issues/3

Right now I believe every player that can play OGGs on Mac is written in Objective-C or C++. Anyway, any help/advice is appreciated. The OGG format is very prevalent in the gaming community. I could use Unity, which I believe plays OGGs through the Mono framework, but I really, really want to stay in Swift.
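On the part that hangs the app while filling PCM: a minimal sketch of streaming already-decoded samples into AVAudioEngine off the main thread follows. The decodeNextVorbisChunk closure is a hypothetical stand-in for whatever pulls Float32 samples from the wrapped Vorbis decoder; the sample rate and channel count are illustrative assumptions.

    import AVFoundation

    // Sketch: schedule decoded PCM chunks into AVAudioEngine on a background queue
    // so decoding never blocks the main thread.
    final class PCMStreamer {
        private let engine = AVAudioEngine()
        private let player = AVAudioPlayerNode()
        private let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!
        private let decodeQueue = DispatchQueue(label: "ogg.decode")

        // decodeNextVorbisChunk returns one array of samples per channel, or nil at end of stream.
        func start(decodeNextVorbisChunk: @escaping () -> [[Float]]?) throws {
            engine.attach(player)
            engine.connect(player, to: engine.mainMixerNode, format: format)
            try engine.start()
            player.play()
            scheduleMore(decodeNextVorbisChunk)
        }

        private func scheduleMore(_ decode: @escaping () -> [[Float]]?) {
            decodeQueue.async { [self] in
                guard let channels = decode(), !channels.isEmpty,
                      let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                                    frameCapacity: AVAudioFrameCount(channels[0].count))
                else { return }
                buffer.frameLength = buffer.frameCapacity
                for (channelIndex, samples) in channels.enumerated() where channelIndex < Int(format.channelCount) {
                    for i in 0..<samples.count {
                        buffer.floatChannelData![channelIndex][i] = samples[i]
                    }
                }
                // Hand the chunk to the player, then decode the next one once this is consumed.
                player.scheduleBuffer(buffer) { self.scheduleMore(decode) }
            }
        }
    }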
1
0
3.3k
Dec ’20
GameCenter scores are not being posted to the leaderboard
Hello! Bear with me here, as there is a lot to explain!

I am working on implementing a Game Center high score leaderboard into my game. I have looked around for examples of how to properly implement this code, but have come up short on finding much material. Therefore, I have tried implementing it myself based on information I found in Apple's documentation. Long story short, I am getting success printed when I update my score, but no scores are actually being posted (or at least no scores are showing up on the Game Center leaderboard when opened).

Before I show the code, one thing I have questioned is the fact that this game is still in development. In App Store Connect, the status of the leaderboard is "Not Live". Does this affect scores being posted?

Onto the code. I have created a GameCenter class which handles getting the leaderboards and posting scores to a specific leaderboard. I will post the code in whole, and will discuss below what is happening. Please view the attached text to see the GameCenter class: GameCenter class - https://developer.apple.com/forums/content/attachment/0dd6dca8-8131-44c8-b928-77b3578bd970

In a different GameScene, once the game is over, I request to post a new high score to Game Center with this line of code:

    GameCenter.shared.submitScore(id: GameCenterLeaderboards.HighScore.rawValue)

Now onto the logic of my code. For the longest time I struggled to figure out how to submit a score. I figured out that in Xcode 12, they deprecated a lot of functions that previously worked for me. Now it seems that we have to load all leaderboards (or the ones we want). That is the purpose behind the private leaderboards variable in the GameCenter class. At startup of the app, I call authenticate player. Once this callback is reached, I call loadLeaderboards, which will load the leaderboards for each string id in an enum that I have elsewhere. Each of these leaderboards will be created as a Leaderboard object and saved in the private leaderboard array. This is so I have access to these leaderboards later when I want to submit a score.

Once the game is over, I call submitScore with the leaderboard id I want to post to. Right now, I only have a high score, but in the future I may add a parameter to this with the value so it works for other leaderboards as well. Therefore, no value is passed in, since I am pulling from local storage, which holds the high score. submitScore will get the leaderboard from the private leaderboard array that has the same id as the one passed in. Once I get the correct leaderboard, I submit a score to that leaderboard. Once the callback is hit, I receive the output "Successfully submitted score to leaderboard". This looks promising, except for the fact that no score is actually posted. At startup, I am calling updatePlayerHighScore, which is not complete, but for the purpose of my point it retrieves the high score of the player from the leaderboard and prints it out to the console. It is printing out (0), meaning that no score was posted.

The last thing I have questions about is the context when submitting a score. According to the documentation, this seems to just be metadata that Game Center does not care about, but rather something the developer can use. Therefore, I think I can cross this off as causing the problem.

I believe I implemented this correctly, but for some reason, nothing is posting to the leaderboard. This was a lot, but I wanted to make sure I got all my thoughts down. Any help on why this is NOT posting would be awesome!
Thanks so much! Mark
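For reference, here is a minimal sketch of the iOS 14 / Xcode 12 submission flow with GKLeaderboard that roughly matches the flow described above. The class name, leaderboard ID handling, and score source are illustrative; the poster's actual GameCenter class is in the linked attachment.

    import GameKit

    // Sketch: authenticate, load leaderboards by ID, then submit a score to one of them.
    final class LeaderboardSketch {
        private var leaderboards: [GKLeaderboard] = []

        func authenticateAndLoad(ids: [String]) {
            GKLocalPlayer.local.authenticateHandler = { _, error in
                guard error == nil, GKLocalPlayer.local.isAuthenticated else { return }
                // Load only the leaderboards we care about and keep them for later submissions.
                GKLeaderboard.loadLeaderboards(IDs: ids) { loaded, _ in
                    self.leaderboards = loaded ?? []
                }
            }
        }

        func submit(score: Int, to id: String) {
            guard let leaderboard = leaderboards.first(where: { $0.baseLeaderboardID == id }) else { return }
            leaderboard.submitScore(score, context: 0, player: GKLocalPlayer.local) { error in
                print(error.map { "Submit failed: \($0)" } ?? "Successfully submitted score to leaderboard")
            }
        }
    }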
7
1
4.1k
Dec ’20
How to use GCVirtualController directly with SKScene?
GCVirtualController isn't displaying when used with an SKScene class. The virtual controllers appear, but then it seems that they are obscured by the SKScene itself!? The documentation says that calling connect() will display the virtual controllers, but I seem to be missing how to add the controllers to the SKScene?

    class GameScene: SKScene {
        private var _virtualController: Any?

        @available(iOS 15.0, *)
        public var virtualController: GCVirtualController? {
            get { return self._virtualController as? GCVirtualController }
            set { self._virtualController = newValue }
        }

        override func didMove(to view: SKView) {
            let background = SKSpriteNode(imageNamed: ".jpg")
            background.zPosition = -1
            addChild(background)

            let virtualConfig = GCVirtualController.Configuration()
            virtualConfig.elements = [GCInputLeftThumbstick, GCInputRightThumbstick, GCInputButtonA, GCInputButtonB]
            virtualController = GCVirtualController(configuration: virtualConfig)
            virtualController?.connect()
        }
    }

I've also tried adding the virtual controllers in the UIViewController, but this doesn't work either.
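Not an answer to the layering question, but for completeness: once connect() has succeeded, the virtual controller's input is read through its underlying GCController. A minimal sketch, assuming the controller exposes an extended gamepad profile:

    // Sketch: read thumbstick input from the connected virtual controller.
    if #available(iOS 15.0, *) {
        if let gamepad = virtualController?.controller?.extendedGamepad {
            gamepad.leftThumbstick.valueChangedHandler = { _, x, y in
                // Drive the player node from the stick values (-1...1).
                print("left stick: \(x), \(y)")
            }
        }
    }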
5
0
1.9k
Oct ’21
Display USDZ 3D model on website without AR
I am trying to build a website where I would like to render a USDZ 3D model in the browser without the AR feature. The user should be able to interact with the 3D model using a pointing device (mouse). If the user wants to see the 3D model in AR, they can do so by loading the page on a compatible device where the 3D model can be projected in AR. I am looking for an answer on how to display a USDZ 3D model in the browser without the AR feature.
2
1
3.4k
Jan ’22
Metal Ray Tracing, RealityKit, SwiftUI Problems
First of all, I apologize for such a general question, but my code is far too long to post. However, I have narrowed down my problems and am seeking advice for any solutions/workarounds that may help me.

I recently updated my physics simulation code from using Metal Performance Shaders for ray tracing to using the (fairly) new Metal ray tracing routines directly. A few notes on my program:

- I perform the ray tracing entirely in a separate thread using compute kernels; there is no rendering based on the ray tracing results.
- The compute kernels are called repeatedly, and each ends with a waitUntilCompleted() command.
- The main thread is running a SwiftUI interface with some renderings that use RealityKit to display the scene (i.e., the vertices) and the rays that traverse the scene using Metal ray tracing.
- This is purely a Mac program and has no iOS support.

So, the problem is that there seems to be some conflict between RealityKit rendering and the ray-tracing compute kernels, where I will get a "GPU Soft Fault" when I run the "intersect" command in Metal. After this soft-fault error, my ray tracing results are completely bogus. I have figured out a solution to this, which is to refit my acceleration structures semi-regularly. However, this solution is inelegant and probably not sustainable. This problem gets worse the more I render in my RealityKit display UI (rendered as a SwiftUI view), so I am now confident that the problem is some "collision" between the GPU resources needed by my program and RealityKit.

I have not been able to find any information on what a "GPU Soft Fault" actually is, although I suspect it is a memory violation. I suspect that I need to use fences to cordon off my ray-tracing compute kernel from other things that use Metal (i.e., RealityKit); however, I am just not sure if this is the case or how to accomplish this.

Again, I apologize for the vague question, but I am really stuck. I have confirmed that every Metal buffer I pass to my compute kernel is correct. I did this confirmation by making my object a simple cube and having only one instance of this cube. Something happens to either corrupt the acceleration structure data or make it inaccessible during certain times when RealityKit needs to use the GPU.

Any advice would be appreciated. I have not submitted a bug report, since I am still not sure if this is just my lack of advanced knowledge of multiple actors requiring GPU use or if there is something more serious here.

Thanks in advance, -Matt
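On the fencing question, one common pattern (a sketch only, not a confirmed fix for the soft fault) is to order app-controlled GPU work with an MTLSharedEvent so dependent passes wait for the ray-tracing pass to finish. The queue names and event values here are illustrative, and this cannot fence RealityKit's own internal command buffers, only work the app encodes itself.

    import Metal

    // Sketch: serialize two pieces of app-encoded GPU work with an MTLSharedEvent.
    let device = MTLCreateSystemDefaultDevice()!
    let rayTracingQueue = device.makeCommandQueue()!   // illustrative queue names
    let otherWorkQueue = device.makeCommandQueue()!
    let event = device.makeSharedEvent()!
    var signalValue: UInt64 = 0

    func encodeFrame() {
        signalValue += 1

        // Ray-tracing pass: encode compute work, then signal the event when done.
        let rtBuffer = rayTracingQueue.makeCommandBuffer()!
        // ... encode intersect/compute kernels here ...
        rtBuffer.encodeSignalEvent(event, value: signalValue)
        rtBuffer.commit()

        // Dependent pass on another queue: wait until the ray-tracing pass has signaled.
        let nextBuffer = otherWorkQueue.makeCommandBuffer()!
        nextBuffer.encodeWaitForEvent(event, value: signalValue)
        // ... encode work that reads the ray-tracing results ...
        nextBuffer.commit()
    }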
2
0
1.2k
Feb ’22
RealityKit MeshResource generated from SwiftUI shape
On SceneKit, using SCNShape we can create SCN geometry from SwiftUI 2D shapes/Béziers: https://developer.apple.com/documentation/scenekit/scnshape Is there an equivalent in RealityKit? Could we use generate(from:) for that? https://developer.apple.com/documentation/realitykit/meshresource/3768520-generate
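As far as I know there is no direct SCNShape equivalent in RealityKit 2, but MeshResource.generate(from:) does accept MeshDescriptor values, so one approach is to build a descriptor from points sampled along the shape. A minimal sketch for a flat, convex outline; the fan triangulation is a simplifying assumption and does not handle concave shapes or holes:

    import RealityKit
    import simd

    // Sketch: build a flat MeshResource from a convex 2D outline
    // (e.g. points sampled from a SwiftUI Path).
    func makeFlatMesh(from outline: [SIMD2<Float>]) throws -> MeshResource {
        var descriptor = MeshDescriptor(name: "flatShape")
        descriptor.positions = MeshBuffer(outline.map { SIMD3<Float>($0.x, $0.y, 0) })

        var indices: [UInt32] = []
        for i in 1 ..< outline.count - 1 {
            indices += [0, UInt32(i), UInt32(i + 1)]   // triangle fan around vertex 0
        }
        descriptor.primitives = .triangles(indices)

        return try MeshResource.generate(from: [descriptor])
    }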
2
0
1.2k
Mar ’22
Cannot get Nintendo Switch Joy-Con accelerometer motion data in iOS 16
I am now using iOS 16 beta 6. I can get buttonA/buttonB pressed events, but I cannot get accelerometer motion data.

    // works
    buttonA?.valueChangedHandler = { (_ button: GCControllerButtonInput, _ value: Float, _ pressed: Bool) -> Void in
        print(">>> ButtonA tapped")
    }

    // works
    buttonB?.valueChangedHandler = { (_ button: GCControllerButtonInput, _ value: Float, _ pressed: Bool) -> Void in
        print(">>> ButtonB tapped")
    }

    // this does not work
    gameController.motion?.valueChangedHandler = { (motion: GCMotion) -> () in
        print(">>>> motion data \(motion.acceleration.x) \(motion.acceleration.y) \(motion.acceleration.z)")
        if let delegate = self.motionDelegate {
            delegate.motionUpdate(motion: motion)
        }
    }

Is there any plan to support this?
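One thing worth checking (a hedged suggestion, not a confirmed statement about Joy-Con support): on controllers whose motion sensors are not always on, GCMotion reports sensorsRequireManualActivation, and the sensors must be switched on before valueChangedHandler fires.

    // Sketch: activate the motion sensors if the controller requires it, then observe.
    if let motion = gameController.motion {
        if motion.sensorsRequireManualActivation {
            motion.sensorsActive = true      // sensors stay off until explicitly enabled
        }
        motion.valueChangedHandler = { motion in
            print("acceleration:", motion.acceleration.x, motion.acceleration.y, motion.acceleration.z)
        }
    }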
4
3
2.7k
Aug ’22
CIFilter + SpriteKit broken behavior on iOS 16
Hello there 👋 I've noticed a different behavior between iOS 15 and iOS 16 using CIFilter and SpriteKit. Here is a sample code where I want to display a text and apply a blurry effect on the same text in the back of it.

Here is the expected behavior (iOS 15): And the broken behavior on iOS 16: It looks like the text is rotated around the x-axis and way too deep.

Here is the sample code:

    import UIKit
    import SpriteKit

    class ViewController: UIViewController {
        var skView: SKView?
        var scene: SKScene?

        override func viewDidLoad() {
            super.viewDidLoad()
            skView = SKView(frame: view.frame)
            scene = SKScene(size: skView?.bounds.size ?? .zero)
            scene?.backgroundColor = UIColor.red
            view.addSubview(skView!)
            skView!.presentScene(scene)

            let neonNode = SKNode()
            let glowNode = SKEffectNode()
            glowNode.shouldEnableEffects = true
            glowNode.shouldRasterize = true
            let blurFilter = CIFilter(name: "CIGaussianBlur")
            blurFilter?.setValue(20, forKey: kCIInputRadiusKey)
            glowNode.filter = blurFilter
            glowNode.blendMode = .alpha

            let labelNode = SKLabelNode(text: "MOJO")
            labelNode.fontName = "HelveticaNeue-Medium"
            labelNode.fontSize = 60
            let labelNodeCopy = labelNode.copy() as! SKLabelNode

            glowNode.addChild(labelNode)
            neonNode.addChild(glowNode)
            neonNode.addChild(labelNodeCopy)
            neonNode.position = CGPoint(x: 200, y: 200)
            scene?.addChild(neonNode)
        }
    }
5
4
2.9k
Sep ’22
GKLocalPlayer.Local.FetchItems() Task Error on Unity
We are using the Apple Unity plugin (GameKit) to authorize the player using a Game Center account. To get the player info, we run a task from the plugin:

    var fetchItemsResponse = await GKLocalPlayer.Local.FetchItems();

But when this code runs, there is an error in the Xcode application on the Mac. The error is the following:

    Thread 1: EXC_BAD_ACCESS (code=257, address=0x2)
15
0
3.9k
Dec ’22
How do you incorporate WKInterfaceSKScene into a watchOS Starter Template App?
I'm trying to create an Apple Watch game using Xcode 14.2 and watchOS 9. Getting started creating a watchOS app seems pretty straightforward, and getting started creating a game project via the starter template seems easy enough. Trying to put these two together, though, doesn't seem to work (or is not straightforward). The documentation specifies limitations on what libraries can be used with watchOS, noting WKInterfaceSKScene, but doesn't give any specific examples of how to start a watchOS project using this. Additionally, nearly every online tutorial that I'm able to find uses storyboards to create a watchOS game, which does not seem to be supported in the latest version of watchOS or Xcode.

Can anyone provide example starter code, using the watchOS App project starter, that loads with a small colored square on the screen that moves from left to right using the WKInterfaceSKScene library? I've tried the Apple documentation, asking ChatGPT for a sample or reference links, and various tutorials on YouTube and elsewhere.
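Not WKInterfaceSKScene specifically, but a hedged alternative sketch for the SwiftUI-based watchOS App template: SpriteView (watchOS 7+) can host an SKScene directly, and the square is moved with an SKAction. App, scene, and size names are illustrative, and I have not verified this on a device.

    import SwiftUI
    import SpriteKit

    // A tiny SKScene with a colored square that slides left to right and back.
    final class SquareScene: SKScene {
        override func sceneDidLoad() {
            backgroundColor = .black
            let square = SKSpriteNode(color: .green, size: CGSize(width: 20, height: 20))
            square.position = CGPoint(x: 10, y: size.height / 2)
            addChild(square)
            square.run(.repeatForever(.sequence([
                .moveTo(x: size.width - 10, duration: 2),
                .moveTo(x: 10, duration: 2)
            ])))
        }
    }

    @main
    struct SquareGameApp: App {   // illustrative app name
        var body: some Scene {
            WindowGroup {
                SpriteView(scene: {
                    let scene = SquareScene(size: CGSize(width: 180, height: 180))
                    scene.scaleMode = .resizeFill
                    return scene
                }())
            }
        }
    }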
2
1
1.3k
Feb ’23
OpenGL on future iPhones and Macs?
Hello everyone! After some time thinking about how to proceed with a graphics API, I figured OpenGL will be my first, since I'm completely new to graphics programming. As you may find in my last post, I was speaking about MoltenVK and might just use Metal instead, along with the demos I found using Metal. So for now, and I know this is said MANY TIMES, Apple deprecated OpenGL, but I wish to use it because I'm new to graphics programming and want to develop an app (a rendering engine, really) for the iPhone 14 Pro Max and macOS Ventura 13.2 (I think this is the latest). So what do you guys think? Can I still use OpenGL ES on the 14 Pro Max, along with OpenGL 4+ on the latest macOS, even though it is deprecated?
16
0
6.6k
Feb ’23
Accessibility and Screen Recording Permissions for Helper and Main Bundles
Guys, In my main application bundle, I have included a helper bundle in its Resources. When the helper requests Accessibility permission, the system modal window displays what the helper is requesting permission for. However, when the helper requests permission for Screen Recording, the system modal window displays that the main application bundle is requesting permission, which includes the helper. This issue seems to be specific to Ventura, as both requests are displayed on behalf of the helper in Monterey. I'm wondering if this is a known issue or limitation or if there is a way to make the permission request specifically from the helper.
1
1
676
Feb ’23
I will quit Apple development: unfortunately there is no profit from my developer account, while with the other companies I work with I do have profit!
Hello community, this post is to show my complete dissatisfaction with Apple, especially regarding developing games for Apple's platforms. There is a lack of support for it, for example for some new gaming technologies, and still there is no profit or worth from all the work and money invested to develop for it. I will close the journey with Apple very unsatisfied. I'm going to give the opportunities of my business to other platforms that are really worth it and that support all the new technologies in gaming. And yes, Apple hurt other gaming makers with new services like Arcade, and it seems there is no future for gaming on Apple's platforms. I quit. Goodbye and good luck to everyone.
3
0
1.7k
Mar ’23
Could not locate file 'default-binaryarchive.metallib' in bundle.
I am running the RoomPlan demo app and keep getting the above error, and when I try to find someplace to get the archive in the Metal libraries, my searches come up blank. There are no files that show up in a search that contain such identifiers. A number of messages about "deprecated" interfaces are also displayed. Is it normal to send out demo apps that are hobbled in this way?
5
0
2.2k
Apr ’23
How To Resize An Image and Retain Wide Color Gamut
I'm trying to resize NSImages on macOS. I'm doing so with an extension like this:

    extension NSImage {
        // MARK: Resizing

        /// Resize the image to the given size.
        ///
        /// - Parameter size: The size to resize the image to.
        /// - Returns: The resized image.
        func resized(toSize targetSize: NSSize) -> NSImage? {
            let frame = NSRect(x: 0, y: 0, width: targetSize.width, height: targetSize.height)
            guard let representation = self.bestRepresentation(for: frame, context: nil, hints: nil) else {
                return nil
            }
            let image = NSImage(size: targetSize, flipped: false, drawingHandler: { (_) -> Bool in
                return representation.draw(in: frame)
            })
            return image
        }
    }

The problem is, as far as I can tell, the image that comes out of the drawing handler has lost the original color profile of the original image rep. I'm testing it with a wide color gamut image, attached. This becomes pure red when examining the image result. If this was UIKit, I guess I'd use UIGraphicsImageRenderer and select the right UIGraphicsImageRendererFormat.Range, so I'm suspecting I need to use NSGraphicsContext here to do the rendering, but I can't see what on that I would set to make it use wide color, or how I'd use it?
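One possible direction (a sketch only; whether it preserves the profile for every source format is an assumption worth testing) is to bypass the drawing-handler path and redraw the backing CGImage into a bitmap context created with the source image's own color space:

    import AppKit

    // Sketch: resize by drawing the source CGImage into a CGContext that reuses
    // the source's color space, so wide-gamut data is not clamped to sRGB.
    func resizedPreservingColorSpace(_ image: NSImage, to targetSize: NSSize) -> NSImage? {
        var rect = NSRect(origin: .zero, size: image.size)
        guard let cgImage = image.cgImage(forProposedRect: &rect, context: nil, hints: nil),
              let colorSpace = cgImage.colorSpace,
              let context = CGContext(data: nil,
                                      width: Int(targetSize.width),
                                      height: Int(targetSize.height),
                                      bitsPerComponent: cgImage.bitsPerComponent,
                                      bytesPerRow: 0,
                                      space: colorSpace,
                                      bitmapInfo: cgImage.bitmapInfo.rawValue)
        else { return nil }

        context.interpolationQuality = .high
        context.draw(cgImage, in: CGRect(origin: .zero, size: targetSize))

        guard let scaled = context.makeImage() else { return nil }
        return NSImage(cgImage: scaled, size: targetSize)
    }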
2
0
1.1k
Apr ’23
Transferring Apps with iCloud KVS
Hi All! I'm being asked to migrate an app which utilizes iCloud KVS (Key Value Storage). This ability is a new-ish feature, and the documentation about this is sparse [1]. Honestly, the entire documentation about the new iCloud transfer functionality seems to be missing. Same with Game Center / GameKit. While the docs say that it should work, I'd like to understand the process in more detail.

Has anyone migrated an iCloud KVS app? What happens after the transfer goes through, but before the first release? Do I need to do anything special? I see that the Entitlements file has the TeamID in the Key Value store - is that fine?

    <key>com.apple.developer.ubiquity-kvstore-identifier</key>
    <string>$(TeamIdentifierPrefix)$(CFBundleIdentifier)</string>

Can someone please share their experience? Thank you!

[1] https://developer.apple.com/help/app-store-connect/transfer-an-app/overview-of-app-transfer
3
0
1.6k
May ’23
Code Signing an app including a binary Metallib
Hi! I am currently trying to upload my iOS app to App Store Connect. Unfortunately, code signing fails with the following error: "Code object is not signed at all.", referencing a binary Metallib (created with metal-tt and an mtlp-json script). I am using Xcode's automatically managed signing and the binary metallib is located inside the "Resources" directory of a framework that I am including with "Embed and sign" in the app. Could anyone give some guidance on what I need to change to make code signing work? Thank you.
4
0
892
May ’23