Post · Replies · Boosts · Views · Activity

Xcode Error in Scene Editor and Entity/Component Architecture
Hello, I'm writing to report an issue (or a documentation error). I am using the entity/component architecture provided by the GameplayKit framework, and I also want to take advantage of the user interface provided by the Scene Editor; this is essential for me if I want to involve more people in the project. The issue occurs when linking the Scene Editor data with GameplayKit's GKScene.

The first issue arises when adding a component through the interface, as shown in the image. At that moment:

```swift
if let scene = GKScene(fileNamed: "GameScene") {
    // Get the SKScene from the loaded GKScene
    if let sceneNode = scene.rootNode as? GameScene {
```

`scene.rootNode` is nil, and the scene is not presented. However, I can work around this issue by initializing the scene separately:

```swift
if let scene = GKScene(fileNamed: "GameScene") {
    // Load the SKScene separately instead of using scene.rootNode
    if let sceneNode = SKScene(fileNamed: "GameScene") as? GameScene {
```

But from here, two issues arise (image: the node contains a component, and the scene has been loaded separately):

1. When trying to access a specific entity through its SKSpriteNode:

```swift
self.node?.entity // Is nil
```

It becomes very difficult to access a specific entity. When a component is added in the editor, an entity is automatically created (image: the node contains a component, and the scene has been loaded separately). The only way I have to access this entity is through the scene's entity list, and since there is only one, that is easy:

```swift
sceneNode.entities[0]
```

2. Even so, it's not very useful, because when I try to access that entity's components, it turns out they don't exist.

I just wanted to mention this because it would be very helpful for me if this issue could be resolved. Thank you very much in advance.
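One workaround for the broken node/entity links is to re-attach the GKScene's entities to the separately loaded SKScene by hand. The sketch below is a hypothetical helper (the name `relinkEntities` and the match-by-node-name strategy are my assumptions, not part of the original post); it relies on GKSKNodeComponent, which is what the Scene Editor uses to tie an entity to its node:

```swift
import SpriteKit
import GameplayKit

// Hypothetical helper: after loading the SKScene separately, the entities
// in the GKScene still point at nodes from the *original* (unpresented)
// scene graph, so we re-attach them to the new scene's nodes by name.
// Assumes nodes were given unique names in the Scene Editor.
func relinkEntities(from gkScene: GKScene, to skScene: SKScene) {
    for entity in gkScene.entities {
        // GKSKNodeComponent holds the node the editor attached the entity to.
        guard let nodeComponent = entity.component(ofType: GKSKNodeComponent.self),
              let name = nodeComponent.node.name,
              // "//name" searches the whole node tree recursively.
              let node = skScene.childNode(withName: "//\(name)") else {
            continue
        }
        nodeComponent.node = node   // point the component at the live node
        node.entity = entity        // restore the node -> entity back-reference
    }
}
```

After calling `relinkEntities(from: scene, to: sceneNode)`, `self.node?.entity` should resolve again for the re-linked nodes, since SKNode's `entity` property has been set explicitly.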
Replies: 0 · Boosts: 0 · Views: 389 · Activity: Aug ’24
Monitoring Sound Input on Output Devices with the Lowest Possible Latency on Mac and iPhone
I am trying to monitor sound input on an output device with the lowest possible latency on Mac and iPhone. I would like to know whether it is possible to send the input buffer to the output device without going through the callbacks of both processes — that is, as close as possible to redirecting it in hardware. I am using the Core Audio API, specifically Audio Queue Services, to achieve this. I also use the HAL for configuration, but I would rather not depend too heavily on the HAL, since I understand it is not accessible from iOS.
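One approach worth considering, as a minimal sketch and assuming AVAudioEngine is an acceptable alternative to raw Audio Queue Services: connecting the engine's input node directly to its output node lets Core Audio pull input into the render chain without a round trip through user-level queue callbacks, and AVAudioEngine is available on both macOS and iOS (so no direct HAL dependency):

```swift
import AVFoundation

// A minimal input-monitoring sketch: route the microphone straight to
// the hardware output through AVAudioEngine's built-in graph, with no
// explicit buffer callbacks in application code.
let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.inputFormat(forBus: 0)

// Connect input -> output directly; Core Audio handles the transfer
// inside its own real-time render thread.
engine.connect(input, to: engine.outputNode, format: format)

do {
    try engine.start()
} catch {
    print("Failed to start engine: \(error)")
}
```

On iOS, latency can be reduced further by configuring the shared AVAudioSession with the `.playAndRecord` category and requesting a small I/O buffer via `setPreferredIOBufferDuration(_:)` before starting the engine; the actual buffer size granted is hardware-dependent.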
Replies: 0 · Boosts: 0 · Views: 587 · Activity: Oct ’23