I'm trying to write a game for macOS, but I can't find any useful information about how to include the game assets. The game is divided into scenarios, each containing 20 levels.
Each level has one of 6 colour themes, so I need a set of textures for each theme, plus an additional set that is the same in every level. To keep things simple for now, I decided to let Xcode create an atlas for each of those sets and have the game use 2 atlases per level. Originally, each coloured set had the same leafnames, but I inferred from the official SpriteKit docs that every texture might need a unique name, so I added a prefix for each set ("b-" for blue, "r-" for red, etc.). The main problem I have now with the textures is that I don't know whether they should be added to the top level of the bundle or to Assets.xcassets. What's the difference between bundled files and assets?
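To show what I mean, this is roughly how a level picks up its textures from its two atlases (the atlas and texture names below are just illustrative, not my real asset names):

import SpriteKit

// One of the 6 colour-themed sets, plus the set shared by every level.
let themeAtlas = SKTextureAtlas(named: "BlueTheme")
let sharedAtlas = SKTextureAtlas(named: "Shared")

// Prefixed leafname ("b-" for blue), and an unprefixed shared texture.
let wallTexture = themeAtlas.textureNamed("b-wall")
let playerTexture = sharedAtlas.textureNamed("player")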
That isn't my main problem though. The data for the levels is in text files structured like this:
Scenarios
|_ Jungle
| |_ 01.txt
| |_ ...
| |_ 20.txt
| |_ ...
|_ Labyrinth
| |_ 01.txt
| |_ ...
| |_ 20.txt
| |_ ...
If I add them like this, the project won't build because Xcode seems to want to flatten the directory structure, and the duplicate leafnames clash. Is there a way to keep the directory structure with duplicate leafnames, and if so, how would I refer to the files in code?
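For reference, this is how I'd expect to refer to one of those files in code, assuming the directory structure survives into the built bundle (the scenario and level number here are just examples):

import Foundation

// Hoped-for way of loading a level file, assuming Scenarios/Jungle
// is preserved as a subdirectory inside the app bundle.
if let url = Bundle.main.url(forResource: "07",
                             withExtension: "txt",
                             subdirectory: "Scenarios/Jungle") {
    let levelData = try? String(contentsOf: url, encoding: .utf8)
    // parse levelData ...
}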
I'm trying to use SpriteKit in a Mac app, but I'm having a problem loading textures from PNGs in the bundle with SKTexture(imageNamed:). Two of the textures load correctly; they are sized 384x256 and 256x128. But the third, which is sized 832x512, gets scaled down to 199.68x122.88. Why is it doing that?
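This is essentially all I'm doing (the names below are placeholders for my real texture names):

import SpriteKit

// Load the three PNGs and check the sizes SpriteKit reports.
for name in ["smallTexture", "mediumTexture", "largeTexture"] {
    let texture = SKTexture(imageNamed: name)
    print(name, texture.size())   // the 832x512 PNG reports (199.68, 122.88)
}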
I'm trying to convert a Mac game Xcode project to a Swift package. I can't get the SKTexture(imageNamed:) init method to work in the package though. I have some other resources, and they load successfully via Bundle.module. All the resources are listed in my Package.swift with:
resources: [ .process("Resources/pathname..."), ... ]
I tried leaving the PNGs out of the resources list in case SPM has a special way of handling them, but it complained about them, and it didn't fix the problem, so that can't be the reason.
Is it just not possible to use this convenience initializer in a package? It's not a show stopper, because I can load an NSImage with Bundle.module.image(forResource:) and initialize my texture with that, but it would be nice if the convenience method worked.
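In case it helps, this is the workaround I'm using for now (the resource name is just an example):

import SpriteKit

// Load the NSImage from the package bundle explicitly, then wrap it in a texture.
func moduleTexture(named name: String) -> SKTexture? {
    guard let image = Bundle.module.image(forResource: name) else { return nil }
    return SKTexture(image: image)
}

let wallTexture = moduleTexture(named: "wall")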
I'm fairly new to Swift and struggling to go beyond the basics with SwiftUI's state management. My app uses Firebase to authenticate itself with a server, so it goes through a series of state changes when logging in that the UI needs to respond to.
Because logging in goes through these multiple stages, I want to use async/await to make my code more readable than a series of completion callbacks. The trouble is, I can't update my state variables from an async function if SwiftUI is observing them via @ObservedObject or similar.
One way to fix that is to never update state directly from an async function, but use DispatchQueue.main.async instead. The trouble is, this adds noise, and Apple discourage it. Instead of changing how you set the state, you're supposed to change how you listen to it, by using .receive(on:).
But I think the point where I would need to do that is buried somewhere in what @ObservedObject does behind the scenes. Why doesn't it already use .receive(on:)? Can I get or write a version that does, without having to spend ages learning the intricate details of SwiftUI or Combine?
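Here's a stripped-down sketch of the kind of thing I mean (the type and property names are placeholders, and the Firebase calls are omitted):

import Foundation
import Combine

enum AuthState { case loggedOut, authenticating, loggedIn }

final class LoginModel: ObservableObject {
    @Published var state: AuthState = .loggedOut

    func logIn() async {
        state = .authenticating   // updating directly from async code is what SwiftUI objects to
        // ... await the Firebase sign-in steps here ...

        // The workaround mentioned above: hop back to the main queue explicitly.
        DispatchQueue.main.async {
            self.state = .loggedIn
        }
    }
}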
My app needs to be able to scan QR codes, but my implementation is far too inefficient and makes the UI unresponsive. I based it on two different tutorials, mainly one on raywenderlich.com about how to display a camera preview in SwiftUI without using UIViewRepresentable. I would post a link, but apparently that domain is banned here (why?). It uses AVCaptureVideoDataOutput and converts captured frames to CGImage via CIImage for use in a SwiftUI Image view. Is that one of the things that's making it slow? Would it be much more efficient to use UIViewRepresentable after all?
My code for processing QR codes is based on another tutorial whose link I can't post, using AVCaptureMetadataOutput. Is there a way to make the metadata queue drop frames when it's busy, independently of the preview's queue? Or is reducing the video frame rate the only way?
I will look into choosing a video source format/resolution that takes less processing, and a lower frame rate, but it would be nice if I could still show a preview that's relatively smooth compared to the rate at which it's trying to decode QR codes.
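For reference, this is roughly what my capture setup looks like (class names, queue labels, and error/permission handling are simplified placeholders):

import AVFoundation
import CoreMedia

final class PreviewFrameDelegate: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Convert the frame to a CGImage via CIImage for the SwiftUI Image view.
    }
}

final class QRCodeDelegate: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        // Decode and handle the QR payloads here.
    }
}

func makeSession(previewDelegate: PreviewFrameDelegate,
                 qrDelegate: QRCodeDelegate) throws -> AVCaptureSession {
    let session = AVCaptureSession()
    guard let device = AVCaptureDevice.default(for: .video) else { return session }
    session.addInput(try AVCaptureDeviceInput(device: device))

    // Frames for the SwiftUI preview, delivered on their own queue.
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.alwaysDiscardsLateVideoFrames = true
    videoOutput.setSampleBufferDelegate(previewDelegate,
                                        queue: DispatchQueue(label: "preview.frames"))
    session.addOutput(videoOutput)

    // Metadata output for QR detection, on a separate queue.
    let metadataOutput = AVCaptureMetadataOutput()
    session.addOutput(metadataOutput)
    metadataOutput.metadataObjectTypes = [.qr]
    metadataOutput.setMetadataObjectsDelegate(qrDelegate,
                                              queue: DispatchQueue(label: "qr.metadata"))
    return session
}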
I've designed my app so that the server can use Firebase Cloud Messaging to request that a user's phone send it a location update (over gRPC) even while the app is in the background. I had already implemented this using a generic push notification handler before discovering that there is a dedicated mechanism for this: setting certain APNS fields in the messages and adding a Location Push Service extension to my app.
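For context, here's roughly how the current (pre-extension) version works; the payload key and the gRPC call are placeholders:

import UIKit
import CoreLocation

class AppDelegate: UIResponder, UIApplicationDelegate, CLLocationManagerDelegate {
    let locationManager = CLLocationManager()
    var pendingCompletion: ((UIBackgroundFetchResult) -> Void)?

    func application(_ application: UIApplication,
                     didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                     fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        // Silent FCM message asking for a location update ("locationRequest" is a made-up key).
        guard userInfo["locationRequest"] != nil else {
            completionHandler(.noData)
            return
        }
        pendingCompletion = completionHandler
        locationManager.delegate = self
        locationManager.requestLocation()   // assumes location authorization was already granted
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // Send locations.last to the server over gRPC (omitted), then finish.
        pendingCompletion?(.newData)
        pendingCompletion = nil
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        pendingCompletion?(.failed)
        pendingCompletion = nil
    }
}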
My implementation seems to be working as required, but admittedly with very limited testing. So what is the advantage of using this extension? It would add some complexity, and I would have to apply for an entitlement. What do I get in return?