It looks like the SceneKit particle system template is no longer available in Xcode 11; I can only see the SpriteKit version. Is this by design or a mistake, or is there another way to create one?
I'd like to intercept mouse clicks in my SCNScene while keeping the defaultCameraController active. How can I do this? I currently implement
override func mouseDown(with event: NSEvent)
on my NSViewController - but the function doesn't get called when using the cameraController.
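One workaround I'm considering is attaching an NSClickGestureRecognizer to the SCNView instead of overriding mouseDown, on the theory that gesture recognizers are still delivered while the camera controller handles drags. A sketch (the controller class and outlet name are my own placeholders, and I haven't confirmed this plays nicely with defaultCameraController):

```swift
import AppKit
import SceneKit

final class SceneViewController: NSViewController {
    @IBOutlet var sceneView: SCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Attach a click recognizer to the SCNView itself rather than
        // overriding mouseDown on the view controller.
        let click = NSClickGestureRecognizer(target: self,
                                             action: #selector(handleClick(_:)))
        sceneView.addGestureRecognizer(click)
    }

    @objc private func handleClick(_ recognizer: NSClickGestureRecognizer) {
        let point = recognizer.location(in: sceneView)
        // Hit-test the scene to find the clicked node, if any.
        let hits = sceneView.hitTest(point, options: nil)
        if let first = hits.first {
            print("Clicked node:", first.node.name ?? "unnamed")
        }
    }
}
```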
With the iOS privacy changes I'm a bit confused about when we need to show the privacy pop-up to request permission, and what needs to be gated behind the answer.
We use Mixpanel/GA, and from my reading of the iOS 14 documentation it seems we may need to ask permission even if the IDFA is not used. Is this the correct interpretation?
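For reference, the mechanics of the prompt look like this; whether Mixpanel/GA actually require it is exactly what I'm unsure about. A sketch only, with analytics start-up reduced to print statements:

```swift
import AppTrackingTransparency
import AdSupport

// Gate analytics start-up behind the App Tracking Transparency prompt
// (iOS 14+). Whether your SDKs count as "tracking" depends on whether
// they link user data across apps/sites.
func requestTrackingAndStartAnalytics() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The IDFA is only meaningful after authorization.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Authorized, IDFA:", idfa)
        case .denied, .restricted, .notDetermined:
            // Proceed without the IDFA.
            print("Not authorized; start analytics without IDFA")
        @unknown default:
            break
        }
    }
}
```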
Hi team, are there any plans to add a small detent to this controller? Or the ability to set a custom height? I love that this is here, but without one of those features I fear I won't be able to use it as much as I'd like to. Thanks!
https://developer.apple.com/documentation/uikit/uisheetpresentationcontroller/detent
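Update for anyone finding this later: iOS 16 added custom detents with an arbitrary resolved height, which covers the small-detent case. A sketch (the 200-point height and "small" identifier are arbitrary example values):

```swift
import UIKit

// Present a sheet with a custom small detent (iOS 16+).
func presentSmallSheet(from presenter: UIViewController,
                       content: UIViewController) {
    if let sheet = content.sheetPresentationController {
        let small = UISheetPresentationController.Detent.custom(
            identifier: .init(rawValue: "small")) { _ in
                200 // fixed height in points
            }
        sheet.detents = [small, .medium(), .large()]
    }
    presenter.present(content, animated: true)
}
```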
I have a 3D scene with a perspective camera and I'd like some of the elements to be projected using an orthographic projection instead.
My use case is that I have some 3D elements with attached text nodes. I'd like the text on these nodes to always be the same size no matter how far away the camera is. Is there a way I can use SceneKit to mix and match? Or is there another technique I can use?
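The closest I've come is sidestepping orthographic projection entirely: rescale the label nodes every frame in proportion to their distance from the camera, so their projected size stays constant. A sketch (`constantScreenScale`, `referenceDistance`, and `LabelScaler` are my own names, not SceneKit API):

```swift
import SceneKit
import simd

// Scale factor that keeps a node the same size on screen under a
// perspective camera: scale grows linearly with distance.
// referenceDistance is the distance at which scale == 1.
func constantScreenScale(distanceToCamera: Float,
                         referenceDistance: Float = 1.0) -> Float {
    distanceToCamera / referenceDistance
}

final class LabelScaler: NSObject, SCNSceneRendererDelegate {
    var labelNodes: [SCNNode] = []

    func renderer(_ renderer: SCNSceneRenderer,
                  updateAtTime time: TimeInterval) {
        guard let pov = renderer.pointOfView else { return }
        for node in labelNodes {
            let d = simd_distance(node.simdWorldPosition,
                                  pov.simdWorldPosition)
            let s = constantScreenScale(distanceToCamera: d)
            node.simdScale = SIMD3<Float>(repeating: s)
        }
    }
}
```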
I can see my Pi listed in iOS Settings, and I can connect to it that way. I'm also trying to connect using CoreBluetooth, but the Pi never appears as a peripheral I can connect to.
Should I be taking another approach? Would it appear as an ExternalAccessory instead?
I have a Python server on the device that I'd like to connect to.
Thanks!
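What I have so far is a plain scan (sketch below). My understanding, which may be wrong, is that CoreBluetooth only discovers BLE peripherals that are actively advertising, so if the Pi only exposes classic Bluetooth (e.g. an RFCOMM serial service), it will pair in Settings but never show up here:

```swift
import CoreBluetooth

final class PiScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            // nil services: discover every advertising BLE peripheral.
            central.scanForPeripherals(withServices: nil, options: nil)
        }
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        print("Discovered:", peripheral.name ?? "unnamed", RSSI)
    }
}
```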
Hi all, is there anything in the iOS SDKs that would let me find and connect to a Bluetooth speaker? At the moment I have to direct users to iOS Settings and do it from there, but I would like an in-app experience for this.
Thanks!
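The only thing I've found so far is AVRoutePickerView, which presents the system audio route picker (AirPlay plus already-paired Bluetooth devices) in-app. It doesn't discover or pair new speakers, so it may only be a partial answer. A sketch:

```swift
import AVKit
import UIKit

// Embed the system audio route picker button in your own UI.
func addRoutePicker(to parent: UIView) {
    let picker = AVRoutePickerView(frame: CGRect(x: 0, y: 0,
                                                 width: 44, height: 44))
    // Prefer audio routes over video (AirPlay-to-TV) routes.
    picker.prioritizesVideoDevices = false
    parent.addSubview(picker)
}
```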
My app was recently transferred from account X to account Y. We went through all of the documentation and it has gone OK bar one big issue.
The app update we last sent to TestFlight no longer has access to our app's App Group shared user defaults.
It's been suggested to me to delete the app group ID from the old account and recreate it in our new account. I'd like to confirm a couple of points before I proceed.
Will the production version of the app be affected if I delete the app group from the old account?
After recreating the app group in the new account, will the data in shared user defaults become available again?
I'm porting a SceneKit app to RealityKit, eventually offering an AR experience there. I noticed that when I run it on my iPhone 15 Pro and iPad Pro with the 120Hz screen, the frame rate seems to be limited to 60fps. Is there a way to increase the target frame rate to 120, like I can with SceneKit?
I'm setting up my arView like so:
@IBOutlet private var arView: ARView! {
    didSet {
        arView.cameraMode = .nonAR
        arView.debugOptions = [.showStatistics]
    }
}
I have code such as the following. The performance on the Vision Pro seems to get quite bad once I hit a few thousand of these models. It feels like I should be able to optimise this somehow, perhaps using instancing. Is that possible with RealityKit in visionOS 2?
let material = UnlitMaterial(color: .white)
let sphereModel = ModelEntity(
    mesh: .generateSphere(radius: 0.001),
    materials: [material])

for index in 0..<5000 {
    let point = generatedPoints[index]
    let model = sphereModel.clone(recursive: false)
    model.position = [point.x, point.y, point.z]
    parent.addChild(model)
}
I have some entities which use attachments to show a label next to them. I would like to change this to only show the label when the entity is being looked at / hovered over. I have the new HoverEffect component on my entity that works nicely, but I can't see how I toggle the visibility of the labels.
I have a scene with multiple RealityKit entities. There is a blue cube which I want to rotate along with all of its children (it's partly transparent).
Inside the cube are a number of child entities (red) that I want to tap.
The cube and red objects all have collision components as is required for gestures to work.
If I want to rotate the blue cube, and also tap the red objects I can't do this as the blue cube's collision component intercepts the taps.
Is there a way of accomplishing what I want?
I'm targeting visionOS 2, and my scene is in a volume.
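The workaround I'm experimenting with (not sure it's idiomatic) is to keep the cube's CollisionComponent but remove its InputTargetComponent, since taps only target entities that have one; the rotation gesture then goes on a separate handle entity instead of the cube itself. A sketch, where `handle` is a hypothetical extra entity of my own:

```swift
import RealityKit

// Let taps pass through the cube to its children, and rotate the cube
// via a separate handle entity.
func routeTapsToChildren(cube: Entity, handle: Entity) {
    // The cube keeps its CollisionComponent but stops receiving input.
    cube.components.remove(InputTargetComponent.self)

    for child in cube.children {
        // Assumes each child already has its own CollisionComponent.
        child.components.set(InputTargetComponent())
    }

    // The handle receives the rotate gesture on the cube's behalf.
    handle.components.set(InputTargetComponent())
    handle.components.set(CollisionComponent(
        shapes: [.generateBox(size: [0.1, 0.1, 0.1])]))
}
```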
Hi,
In my app I am using MusicLibraryRequest<Artist> to fetch all of the artists in someone's library. With this response I then fetch each artist's albums: artist.with([.albums]).
The response from this only gives albums in the user's library. I would like to augment it with all of the albums for an artist from the full catalogue.
I'm using MusicKit and targeting iOS 18 and visionOS 2.
Could someone please point me towards the best way to approach this?
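The direction I've been exploring is resolving the library artist against the catalog with MusicCatalogResourceRequest and then loading the catalog artist's albums. A sketch; I'm assuming the library Artist's id resolves in the catalog, and error handling is elided:

```swift
import MusicKit

// Fetch the full catalog album list for an artist found in the library.
// Returns nil if the id doesn't resolve to a catalog artist.
func catalogAlbums(for libraryArtist: Artist) async throws
    -> MusicItemCollection<Album>? {
    var request = MusicCatalogResourceRequest<Artist>(
        matching: \.id, equalTo: libraryArtist.id)
    // Ask for the albums relationship up front.
    request.properties = [.albums]
    let response = try await request.response()
    return response.items.first?.albums
}
```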