I am working on an app and am in the process of adding Syphon support.
Syphon uses CFMessagePort for IPC and for passing framebuffer data (MTLTexture) between apps, and it is widely used in the professional video app and video production space.
What I have noticed is that when the app is built as a sandboxed app, I see the following error message in the log during Syphon initialization:
*** CFMessagePort: bootstrap_register(): failed 1100 (0x44c) 'Permission denied', port = 0x8703, name = 'info.v002.Syphon.D2499DBD-93AE-4CEA-B21F-FF356DCC069D'
See /usr/include/servers/bootstrap_defs.h for the error codes.
Syphon uses the "info.v002.Syphon.UUID" naming convention to identify IPC Syphon servers, so I don't think I can use the App Groups naming convention for Sandbox support.
I have a very simple example app on GitHub that publishes SpriteKit frames as a Syphon server. To see the issue, simply enable App Sandbox for the build and run the app. You should see the error message in the log, and no data appears in any Syphon client (I use Syphon Recorder for testing, available at syphon.github.io).
I am looking for other options to enable CFMessagePort in a sandboxed app.
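For context, a minimal sketch of where the failure occurs, assuming the usual sandbox rule that Mach/CFMessagePort names must be prefixed with one of the app's application group identifiers (the callback and the print are illustrative only):

    import Foundation

    // In a sandboxed app, bootstrap registration is expected to succeed only for names
    // prefixed with an app group identifier, e.g. "<TEAMID>.com.example.group.port".
    // Syphon's "info.v002.Syphon.<UUID>" names fall outside that namespace, so
    // bootstrap_register() fails with "Permission denied".
    let callback: CFMessagePortCallBack = { _, _, _, _ in nil }   // no reply

    let syphonName = "info.v002.Syphon.D2499DBD-93AE-4CEA-B21F-FF356DCC069D" as CFString
    let port = CFMessagePortCreateLocal(kCFAllocatorDefault, syphonName, callback, nil, nil)
    // A nil result (together with the bootstrap_register log message) indicates the
    // local port could not be registered under the sandbox.
    print(port == nil ? "CFMessagePortCreateLocal failed" : "port created")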
SpriteKit
Posts under the SpriteKit tag: drawing shapes, particles, text, images, and video in two dimensions using SpriteKit.
Hello everyone
My goal is to recreate Apple's Activity ring sparkle effect, so I found Paul Hudson's Vortex library. It already includes a sparkle effect, but I don't know how to modify it to achieve my goal because I'm pretty new to SwiftUI animations. Does anyone have any idea how I could do this?
Vortex project: https://github.com/twostraws/Vortex
I'm trying to add a SpriteKit scene to my view using SpriteView(scene: spriteScene).
But it doesn't work: the .sks scene doesn't display.
Here is the image.
And this is the code:
import SwiftUI
import SpriteKit

struct Gameplay: View {
    let spriteScene: MainGameScene

    var body: some View {
        ZStack {
            SpriteView(scene: spriteScene)
        }
    }
}
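For comparison, a minimal sketch that loads the scene by file name and sets an explicit scaleMode, which is a common fix when SpriteView renders nothing (the "MainGameScene" file name is an assumption):

    import SwiftUI
    import SpriteKit

    struct GameplaySketch: View {
        // Hypothetical: load the .sks by name; fall back to an empty scene if it isn't found.
        let scene: SKScene = {
            let scene = SKScene(fileNamed: "MainGameScene") ?? SKScene(size: CGSize(width: 400, height: 800))
            scene.scaleMode = .resizeFill
            return scene
        }()

        var body: some View {
            SpriteView(scene: scene)
                .ignoresSafeArea()
        }
    }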
I'm writing a SpriteKit game in Swift that does this familiar dance in my GameScene touchesEnded(...)...
let location = touches.first!.location(in: self)
let nodes = self.nodes(at: location)
...but sometimes when the game is running, the nodes collection is empty, even though there are clearly visible nodes under the touch. All of the nodes I care about are children of a single SKNode, MapNode, which is one of two children of the GameScene.
After a bunch of debugging, I've noticed that when this problem happens, something is causing calculateAccumulatedFrame() to return a frame that is way, way smaller than the real bounds of MapNode.
When the scene begins, calculateAccumulatedFrame() is perfect, but something that happens later on breaks it. I've tried adding a node to MapNode that is the correct size to 'force' calculateAccumulatedFrame to do the right thing, but this doesn't work.
Is there something obvious I am doing wrong, or some common pitfalls? Is there some way to force calculateAccumulatedFrame to return this correct size?
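One hedged workaround sketch, not a fix for the underlying accumulated-frame problem: hit-test MapNode's children directly, so the result does not depend on the scene-level query ("mapNode" is an assumed reference to the MapNode instance):

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let location = touches.first?.location(in: self) else { return }
        // Convert the touch into MapNode's space; SKNode.contains(_:) expects a point in the
        // tested node's parent coordinate system, which for these children is MapNode.
        let pointInMap = convert(location, to: mapNode)
        let hitNodes = mapNode.children.filter { $0.contains(pointInMap) }
        print("nodes under touch:", hitNodes.map { $0.name ?? "unnamed" })
    }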
Looking for info on the best way to convert an MTLTexture to an SKTexture.
I am migrating a project in which a SpriteKit scene is rendered in an SKView and relies heavily on the SKView.texture(from:) API.
I am trying to migrate to an SKRenderer-based SpriteKit scene, and I am wondering about the best way to grab an SKTexture from an SKNode when rendering into an MTKView instead of an SKView.
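A hedged sketch of one possible path, round-tripping through Core Image; it copies pixel data, so it is not a zero-cost conversion, and the vertical flip is an assumption about the texture's origin (drop it if the result already appears upright):

    import SpriteKit
    import Metal
    import CoreImage

    func makeSKTexture(from mtlTexture: MTLTexture, context: CIContext = CIContext()) -> SKTexture? {
        guard let ciImage = CIImage(mtlTexture: mtlTexture, options: nil) else { return nil }
        // Metal render targets are typically top-left origin; flip so SpriteKit shows the image upright.
        let flipped = ciImage.transformed(by: CGAffineTransform(scaleX: 1, y: -1)
            .translatedBy(x: 0, y: -ciImage.extent.height))
        guard let cgImage = context.createCGImage(flipped, from: flipped.extent) else { return nil }
        return SKTexture(cgImage: cgImage)
    }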
I'm trying to add an image in UIImage format to SpriteKit's SKSpriteNode.
When I convert a UIImage into an SKTexture and add it to an SKSpriteNode, the displayed image does not take the UIImage's imageOrientation into account.
I tried the snippets below, but the result is the same each time.
Code1:
// image is assumed to be assigned elsewhere (e.g. from the photo library or camera)
let image: UIImage?
let texture = SKTexture(image: image!)
var imageNode = SKSpriteNode(texture: texture)
Code2:
let image: UIImage?
let cgImage = image?.cgImage
let ciImage = CIImage(cgImage: cgImage!)
let orientedImage = UIImage(cgImage: CIContext(options: nil).createCGImage(ciImage, from: ciImage.extent)!,
                            scale: image!.scale,
                            orientation: image!.imageOrientation)
let texture = SKTexture(image: orientedImage)
var imageNode = SKSpriteNode(texture: texture)
Code3:
let image: UIImage?
guard let cgImage = image?.cgImage else { return }
let orientedImage = UIImage(cgImage: cgImage, scale: image!.scale, orientation: .up)
let texture = SKTexture(image: orientedImage)
var imageNode = SKSpriteNode(texture: texture)
Is there a way to ensure that the image orientation is taken into account when displayed?
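A hedged sketch of one common approach: redraw the UIImage into a fresh bitmap first, so imageOrientation is baked into the pixel data before SKTexture reads it (the function name is illustrative):

    import UIKit
    import SpriteKit

    func makeUprightTexture(from image: UIImage) -> SKTexture {
        // Rendering the image applies its orientation to the underlying pixels.
        let normalized = UIGraphicsImageRenderer(size: image.size).image { _ in
            image.draw(in: CGRect(origin: .zero, size: image.size))
        }
        return SKTexture(image: normalized)
    }

    // usage sketch:
    // let imageNode = SKSpriteNode(texture: makeUprightTexture(from: image))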
I am trying to make an application for the Vision Pro where the particles don't move but rather stay still so that there is no lag. For example, I am trying to spawn 100 particles here:
I want the particles to remain static, but spawning many of them causes the simulator to lag. Also, is there a way I can get a particle system to follow a specific shape like the one I have in the image?
Currently, I have multiple model entities that take on a particle system component:
    for _ in 0..<100 {
        let newEntity = ModelEntity()
        let particles = particleSystem(color: newColor)   // renamed from "particleSystem" to avoid shadowing the function below
        newEntity.components.set(particles)
        newEntity.position = position
        newEntity.scale = scale
        stars.append(newEntity)
    }
}
func particleSystem(color: UIColor) -> ParticleEmitterComponent {
    var particles = ParticleEmitterComponent()
    particles.emitterShapeSize = .init(repeating: 0.02)   // make burst smaller
    particles.emitterShape = .sphere
    particles.mainEmitter.birthRate = 1
    particles.mainEmitter.lifeSpan = 2
    particles.mainEmitter.size = 0.02
    particles.burstCount = 50
    particles.speed = 0.01
    particles.mainEmitter.isLightingEnabled = false
    particles.mainEmitter.color = .constant(.single(color))
    return particles
}
An SCNNode is created and used for either an SCNView or an SKView.
SceneKit and SpriteKit are using default values.
The SCNView has an SCNScene whose rootNode holds the SCNNode.
The SKView has an SKScene containing an SK3DNode whose scnScene has that same SCNNode under its rootNode.
There is no other code changing or adding values.
Why are the colors for the SCNView less vibrant than the colors for the SKView?
Is there a default to change to make them equivalent, or another value to add? I have tried changing the default SCNMaterial but only succeeded in making the image black or dark.
Any help is appreciated.
How do I convert a UIBezierPath.currentPoint to an SKSpriteNode.position?
Here are the appropriate code snippets:
func createTrainPath() {
    let startX = -tracksWidth/2,
        startY = tracksPosY
    savedTrainPosition = CGPoint(x: startX, y: startY!)
    trackRect = CGRect(x: savedTrainPosition.x,
                       y: savedTrainPosition.y,
                       width: tracksWidth,
                       height: tracksHeight)
    trainPath = UIBezierPath(ovalIn: trackRect)
    trainPath = trainPath.reversing()   // makes myTrain move CW
}   // createTrainPath
Followed by:
func startFollowTrainPath() {
    let theSpeed = Double(5 * thisSpeed)
    var trainAction = SKAction.follow(trainPath.cgPath,
                                      asOffset: false,
                                      orientToPath: true,
                                      speed: theSpeed)
    trainAction = SKAction.repeatForever(trainAction)
    createPivotNodeFor(myTrain)
    myTrain.run(trainAction, withKey: runTrainKey)
}   // startFollowTrainPath
So far, so good (I think?) ...
Within other places in my code, I call:
return trainPath.currentPoint
I need to convert trainPath.currentPoint to myTrain.position ...
When I insert the appropriate print statements, I see for example:
myTrain.position = (0.0, -295.05999755859375)
trainPath.currentPoint = (392.0, -385.0)
which obviously rules out a simple assignment, as in:
myTrain.position = trainPath.currentPoint
Since this assignment is not correct, what is?
After more investigation, my guess is that .currentPoint is in SKSpriteNode coordinates and .position is in SKScene coordinates.
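If the mismatch is purely a coordinate-space one, a hedged sketch of the conversion would use SKNode.convert(_:from:) to map the path point into the space myTrain.position uses (its parent's space). Note also that UIBezierPath.currentPoint is a fixed property of the path geometry (the end point of its last segment), not the node's live position while it follows the path.

    // Assuming trainPath's points are expressed in the scene's coordinate space
    // and this code runs inside the SKScene subclass ("self" is the scene):
    if let parent = myTrain.parent {
        myTrain.position = parent.convert(trainPath.currentPoint, from: self)
    }
    // If myTrain is a direct child of the scene, this is equivalent to a plain assignment.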
I'm trying to integrate AR into my app, and I want to be able to present the UIViewController containing the AR view after pressing a sprite in an SKScene. I'm trying to find a way to present this view controller in touchesBegan; however, I'm not able to find any reference on this.
I'd really appreciate any help.
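A hedged sketch of one way to do this from inside the scene, reaching a presenting view controller through the SKView's window ("ARViewController" and the node name are hypothetical):

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let location = touches.first?.location(in: self),
              nodes(at: location).contains(where: { $0.name == "arButton" }) else { return }

        let arController = ARViewController()   // hypothetical view controller that hosts the AR view
        arController.modalPresentationStyle = .fullScreen
        // The scene's SKView sits in a window whose rootViewController can present.
        view?.window?.rootViewController?.present(arController, animated: true)
    }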
I am using the following code to create a texture atlas at runtime using a single .png image sprite sheet:
func createSpriteTextureAtlas(atlasNumber atlas: Int, forWorld world: Int) {
    // load the png file
    let image = UIImage(named: "world\(world)_spritesheet\(atlas)_2048x2048.png")
    // create the dictionary
    var imageDictionary = [String: UIImage]()
    // iterate through all rows and columns and get the subimage
    var imageIndex = 0
    for row in 0...7 {
        for column in 0...7 {
            let sourceRect = CGRect(x: column * 256, y: row * 256, width: 256, height: 256)
            let sourceImage = image?.cgImage!.cropping(to: sourceRect)
            let subImage = UIImage(cgImage: sourceImage!)
            // add the sub image and name to the dictionary
            imageDictionary["\(imageIndex)"] = subImage
            imageIndex = imageIndex + 1
        }
    }
    // create the texture atlas using the dictionary
    spriteTextureAtlas[atlas] = SKTextureAtlas(dictionary: imageDictionary)
}
I have a different sprite sheet for every world. I made all the sprite sheets myself using the same tool. This code works 100% of the time for most images.
For some images however, the program crashes at: SKTextureAtlas(dictionary: imageDictionary) with the error: Thread 4: EXC_BAD_ACCESS (code=1, address=0x105ff2000). The stack trace says it is crashing inside: #0 0x00000002178e2d34 in -[SKTextureAtlasPacker isFullyOpaque:] ().
The crash does not happen every time and only happens for some images. The crash never happens on the simulator.
Did I make a mistake inside createSpriteTextureAtlas or is this a SpriteKit bug?
P.S. I already know that I can let Xcode make the texture atlas for me by using a folder with a .atlas extension, but this is not what I want to do.
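One hedged guess at a workaround (not a confirmed fix): CGImage.cropping(to:) returns an image that still references the whole sheet's backing store, so redrawing each tile into its own bitmap before handing it to SKTextureAtlas gives the packer fully owned pixel data:

    func ownedSubimage(from sheet: UIImage, rect: CGRect) -> UIImage? {
        guard let cropped = sheet.cgImage?.cropping(to: rect) else { return nil }
        let tile = UIImage(cgImage: cropped)
        // Redraw so the tile no longer shares memory with the 2048x2048 sheet.
        return UIGraphicsImageRenderer(size: rect.size).image { _ in
            tile.draw(in: CGRect(origin: .zero, size: rect.size))
        }
    }
    // In createSpriteTextureAtlas, build the dictionary from ownedSubimage(from:rect:)
    // instead of UIImage(cgImage: sourceImage!).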
We have a 2023 Mac mini with an Apple M2 Pro chip. The Mac has two displays connected. If a SpriteKit application has two windows on separate desktops, one of the windows can’t run animations.
You can easily replicate this issue with the following steps:
1. Create a new macOS project in Xcode.
2. Select the sample Game project with SpriteKit as the technology.
3. Edit the fade duration in GameScene.swift to make the animated nodes live longer (e.g. 15 seconds).
4. In AppDelegate.applicationDidFinishLaunching, add code to instantiate a second window:
   let mainStoryboard = NSStoryboard(name: NSStoryboard.Name("Main"), bundle: nil)
   if let newWindow = mainStoryboard.instantiateInitialController() as? NSWindowController {
       newWindow.showWindow(self)
   }
5. Run the app. If both windows are on the same desktop, animations run correctly. If the windows are on separate desktops, animations halt on the 1080p desktop, even though the SKView reports full frame rate.
You can also just check out the sample project from my GitHub repo: https://github.com/lanephillips/SKTest
The issue does not occur if the application is built for Intel processors only and run on Apple Silicon under Rosetta.
We've seen this issue occur in both macOS 13 and 14. I've filed a report through Feedback Assistant, but I'm posting here in case someone else has seen this and has a workaround.
I have implemented auto-renewable subscriptions in my app, as well as promo codes. Purchases of both monthly and annual subscriptions work correctly. What I don't know is what to "listen for" when the user uses a promo code to purchase the product rather than buying it directly. Am I looking for a different product code or product identifier when the offer code is used to subscribe?
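A hedged StoreKit 2 sketch of the usual pattern: an offer-code redemption does not introduce a new product identifier, it produces a transaction for the same subscription product, so listening to Transaction.updates (and checking current entitlements on launch) should cover it:

    import StoreKit

    // Assumption: StoreKit 2 is available (iOS 15+); error handling is omitted.
    func listenForTransactions() -> Task<Void, Never> {
        Task {
            for await result in Transaction.updates {
                guard case .verified(let transaction) = result else { continue }
                // Same productID whether the subscription was bought normally or via an offer code.
                print("Entitled to:", transaction.productID)
                await transaction.finish()
            }
        }
    }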
Hello everyone
I am porting my existing 2D game, written with SpriteKit, to visionOS,
and I am creating a SpriteView in a WindowGroup:
let currentScene = BattleScene.newGameScene(gameMode: "endless", dataContext: dataController.container.viewContext)

SpriteView(scene: currentScene)
    .ignoresSafeArea(.all)
    .frame(width: currentScene.frame.width, height: currentScene.frame.height, alignment: .center)
    .onReceive(NotificationCenter.default.publisher(for: GameoverNotification)) { _ in
        stopAllAudio()
    }
    .onTapGesture { location in
        let viewPosition = location
        let touchLocation = CGPoint(x: viewPosition.x, y: viewPosition.y)
        print("touch on vision window: ", touchLocation.x, touchLocation.y)
    }
    .glassBackgroundEffect()
    // WindowGameView()
    //     .environment(\.managedObjectContext, dataController.container.viewContext)
    //     .environment(model)
    //     .environment(pressedKeys)
}
.windowStyle(.automatic)
.defaultSize(width: 0.5, height: 1.0, depth: 0.0, in: .meters)
Run it, and it turns out the scene can't receive tap events, but it works normally if I run it with my iOS target (visionOS, Designed for iPad).
Is there anything I missed?
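One hedged thing to try while debugging: forward the SwiftUI tap into the scene manually instead of relying on the scene's own touch handling. Whether this mapping is exact under SpriteView on visionOS is an assumption, and handleTap(at:) is a hypothetical method on BattleScene:

    .onTapGesture { location in
        // location is in the SwiftUI view's space; convertPoint(fromView:) maps a
        // view-space point into scene coordinates.
        let scenePoint = currentScene.convertPoint(fromView: location)
        currentScene.handleTap(at: scenePoint)
    }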
Background:
This is a question from a very new iOS developer asking for a good SpriteKit example while investigating how to start an iOS 2D game project.
As I explored the official Apple developer documentation, SpriteKit is clearly the framework I am looking for to develop a 2D game.
The reference documentation had clear and succinct descriptions of each API and class when I started a project in Xcode. However, I haven't been able to finish the project because I have no general idea of what a typical game built with this framework always needs.
Question:
As an experienced Java Spring programmer, I believe I need a brief example to get started with the SpriteKit framework, one that gives me an idea of the necessary steps for a 2D game.
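A minimal sketch of the skeleton most SpriteKit games share, assuming an iOS target: an SKScene subclass that builds its nodes in didMove(to:), reacts to touches, and runs per-frame logic in update(_:); present it with an SKView (or SwiftUI's SpriteView):

    import SpriteKit

    class MinimalScene: SKScene {
        private let player = SKSpriteNode(color: .systemBlue, size: CGSize(width: 40, height: 40))

        override func didMove(to view: SKView) {
            backgroundColor = .black
            player.position = CGPoint(x: frame.midX, y: frame.midY)
            addChild(player)
        }

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let location = touches.first?.location(in: self) else { return }
            player.run(.move(to: location, duration: 0.3))   // slide toward the touch
        }

        override func update(_ currentTime: TimeInterval) {
            // Per-frame game logic (spawning, scoring, cleanup) goes here.
        }
    }

    // Presenting from a view controller:
    // let skView = SKView(frame: view.bounds)
    // skView.presentScene(MinimalScene(size: skView.bounds.size))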
The error shown in the picture appears when I try to link the StartScene.sks file to the code
Attached are pictures of the code
Please help me🙏🏻
I have an unusual situation. I'm working through some self-study training, and this particular project is a game similar to Fruit Ninja, where things fly around the screen and you slice through them by swiping your finger. The instructor provided a series of .caf files for various sounds played during the game. For some reason I can't hear any of the sounds on my personal Mac when running the app from Xcode on either an iPad simulator or a real iPad that I have attached to my Mac and set up for development in Xcode.
Yes, sound is working on the Mac (I hear new email arrive, etc., and can play videos in the browser).
In the iPad simulator I can play YouTube videos in mobile Safari. While adjusting the volume doesn't seem to raise or lower it, I can hear the audio in the YouTube videos.
On the real iPad, the sound is on and I can hear audio while playing videos in the browser, etc.
Lastly, I pushed the code up to GitHub and pulled it to my work Mac. There the sound plays just fine in the iPad simulator on that Mac (I'm not allowed to attach my personal iPad to my work Mac, so I can't try the iPad there). So I feel like this confirms the code for playing the sounds is correct.
Oh, and I can select a sound file in the left navigator and click the little play button to hear it played in Xcode.
So I feel like this is an issue with Xcode on my personal Mac, but I'm at a loss as to what the issue is. Can anyone suggest some things to look for on my Mac to get sound working in this app?
If you want to see the code, it's here, and the free training I'm following is at https://www.hackingwithswift.com/100/77.
TIA
I need to associate sound with the movement of a sprite. Movement can be a result of physics, not of an SKAction. When the object is sliding, there should be a sliding sound for the whole time it is sliding, and then a different sound when it bumps into a rock and goes up in the air. When the object is airborne there is no sound, until it falls again (a falling sound) and then slides down with a sliding sound. The sounds associated with the collisions (rock, ground, and so on) are straightforward and work fine. But I am having difficulty associating sound with movement.
The closest result I have is to check the velocity of the sprite's physics body every update cycle and play or stop the sound based on whether the velocity is greater than zero. I tried SKAction.playSoundFileNamed first; the sound kept going even when the object was not moving. I tried adding an SKAudioNode with play and stop, with no better result. I finally tried using AVAudioPlayer with play and pause, which yielded the best results, but the sliding sound still played past the end of the slide.
What is the best way to do this?
My code for playing the sound is as follows:
var blockSliding = false
for block in gameBlocks {
    // Use the block's own velocity components (abs(), since velocity can be negative).
    if abs(block.physicsBody?.velocity.dx ?? 0) + abs(block.physicsBody?.velocity.dy ?? 0) > 0.05 {
        blockSliding = true
        break
    }
}
if slideSound.isPlaying {
    if !blockSliding {
        slideSound.pause()
    }
} else {
    if blockSliding {
        slideSound.play()
    }
}
I set up slideSound earlier by loading the appropriate sound file into an AVAudioPlayer.
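A hedged alternative sketch driving an SKAudioNode from the scene's update(_:), with a speed threshold and a state flag so the sound isn't restarted every frame (slideAudioNode is assumed to be an SKAudioNode(fileNamed:) with autoplayLooped = false already added to the scene, and slideSoundPlaying is an assumed stored Bool):

    override func update(_ currentTime: TimeInterval) {
        let isSliding = gameBlocks.contains { block in
            guard let body = block.physicsBody else { return false }
            // Treat the block as sliding when its speed exceeds a small threshold.
            return !body.isResting && (abs(body.velocity.dx) + abs(body.velocity.dy)) > 5
        }
        if isSliding != slideSoundPlaying {
            slideSoundPlaying = isSliding
            slideAudioNode.run(isSliding ? SKAction.play() : SKAction.stop())
        }
    }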
Hi all
So I'm quite new to game development and am struggling a bit with the tile map.
All my elements are 64x64. As you can see in my screenshot, there is a gap between the street and the water. It might be simple, but what's the best way to fix that gap? I could increase the width of the left and right edge PNGs, but then I would sooner or later run into other problems, since they would no longer fit with the rest.
Thanks for your help
Cheers from Switzerland