Post not yet marked as solved
I'm porting a SceneKit app to RealityKit, eventually offering an AR experience there. I noticed that when I run it on my iPhone 15 Pro and iPad Pro with the 120Hz screen, the framerate seems to be limited to 60fps. Is there a way to increase the target framerate to 120, as I can with SceneKit?
I'm setting up my arView like so:
@IBOutlet private var arView: ARView! {
    didSet {
        arView.cameraMode = .nonAR
        arView.debugOptions = [.showStatistics]
    }
}
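For comparison, SceneKit exposes the target directly on SCNView; I'm not aware of an equivalent public property on ARView. A minimal sketch of the SceneKit side, assuming `view` is the app's SCNView:

```swift
import SceneKit

// Sketch (SceneKit, not RealityKit): SCNView lets you request 120 fps directly.
// Note that on ProMotion iPhones, Info.plist also needs
// CADisableMinimumFrameDurationOnPhone set to YES before any view exceeds 60 fps.
func raiseFrameRateCap(on view: SCNView) {
    view.preferredFramesPerSecond = 120
}
```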
Post not yet marked as solved
My app was recently transferred from account X to Y. We went through all of the documentation and it's gone ok bar one big issue.
The app update that we last sent to TestFlight no longer has access to our app's App Group shared user defaults.
It's been suggested to me to delete the app group id from the old account, and recreate it in our new account. I'd like to confirm a couple of points before I proceed.
Will the production version of the app be affected if I delete the app group from the old account?
After recreating the app group in the new account, will the data in shared user defaults become available again?
Post not yet marked as solved
Looks like the particle system template for SceneKit is no longer available in Xcode 11; I can only see the SpriteKit version. Is this by design or a mistake, or is there another way to create one?
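One workaround while the template is missing: an SCNParticleSystem can still be built entirely in code. A minimal sketch, where all the emitter values are arbitrary placeholders:

```swift
import SceneKit

// Sketch: building a SceneKit particle system in code instead of from
// the Xcode template. Values below are placeholders - tune to taste.
func addParticles(to scene: SCNScene) {
    let particles = SCNParticleSystem()
    particles.birthRate = 50
    particles.particleLifeSpan = 2
    particles.particleSize = 0.05
    particles.emitterShape = SCNSphere(radius: 0.1)

    let emitterNode = SCNNode()
    emitterNode.addParticleSystem(particles)
    scene.rootNode.addChildNode(emitterNode)
}
```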
Post not yet marked as solved
Hi all, is there anything available in the iOS SDKs to allow me to find and connect to a bluetooth speaker? At the moment I have to direct users to iOS settings and do it from there, but I would like to have an in-app experience for this.
Thanks!
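For audio output specifically, one possible in-app approach is AVKit's AVRoutePickerView, which presents the system output picker (AirPlay and paired Bluetooth audio devices) without leaving the app. A sketch, assuming a UIKit view controller:

```swift
import AVKit

class RouteViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // System-provided button that shows the audio route picker,
        // listing AirPlay and paired Bluetooth audio devices.
        let routePicker = AVRoutePickerView(frame: CGRect(x: 20, y: 80,
                                                          width: 44, height: 44))
        view.addSubview(routePicker)
    }
}
```

This only routes audio to an already-paired device; initial pairing still happens in Settings as far as I know.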
Post not yet marked as solved
I can see my Pi listed in iOS Settings, and I can connect to it that way. I'm also trying to connect using CoreBluetooth, but the Pi never appears as a peripheral I can connect to.
Should I be taking another approach? Would it appear as an ExternalAccessory instead?
I have a Python server on the device that I'd like to connect to.
Thanks!
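One likely explanation: CoreBluetooth only discovers BLE peripherals that are actively advertising, while the connection made in Settings is probably classic Bluetooth (BR/EDR), which CoreBluetooth can't see; ExternalAccessory is, as far as I know, limited to MFi accessories. For the Pi to show up in a scan, it would need to advertise a BLE GATT service (e.g. via BlueZ). A minimal scanning sketch for the iOS side:

```swift
import CoreBluetooth

// Sketch: a basic BLE scan. Only peripherals that are currently
// advertising will be reported in didDiscover.
class Scanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: nil, options: nil)
        }
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        print(peripheral.name ?? "unnamed", RSSI)
    }
}
```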
Post not yet marked as solved
I have a 3D scene with a perspective camera and I'd like some of the elements to be projected using an orthographic projection instead.
My use case is that I have some 3D elements with attached text nodes. I'd like the text on these nodes to always be the same size no matter how far away the camera is. Is there a way I can use SceneKit to mix and match? Or is there another technique I can use?
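SceneKit doesn't mix projection types within one camera, but a common workaround for constant-size labels is to counter-scale them by their distance to the camera each frame. A sketch, assuming `labelNode` is the text node and this delegate is assigned to the SCNView:

```swift
import SceneKit

// Sketch: keep labelNode at a roughly constant on-screen size under a
// perspective camera by scaling it with its distance to the camera.
class ConstantSizeLabelDelegate: NSObject, SCNSceneRendererDelegate {
    let labelNode: SCNNode
    let sizePerUnitDistance: Float = 0.1 // tuning constant

    init(labelNode: SCNNode) {
        self.labelNode = labelNode
    }

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let cameraNode = renderer.pointOfView else { return }
        let distance = simd_distance(labelNode.simdWorldPosition,
                                     cameraNode.simdWorldPosition)
        labelNode.simdScale = SIMD3<Float>(repeating: distance * sizePerUnitDistance)
    }
}
```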
Post not yet marked as solved
Hi team, are there any plans to add a small detent to this controller? Or the ability to set a custom height? I love that this is here, but without one of those features I fear I won't be able to use it as much as I'd like to. Thanks!
https://developer.apple.com/documentation/uikit/uisheetpresentationcontroller/detent
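For reference, iOS 16 later added exactly this via `UISheetPresentationController.Detent.custom(resolver:)`. A sketch, where `sheetVC` is a placeholder for the presented controller and 120pt is an arbitrary height:

```swift
import UIKit

// Sketch (iOS 16+): a small fixed-height detent alongside the built-in ones.
func presentSmallSheet(_ sheetVC: UIViewController, from presenter: UIViewController) {
    if let sheet = sheetVC.sheetPresentationController {
        sheet.detents = [
            .custom { _ in 120 }, // small: 120pt tall
            .medium(),
            .large()
        ]
    }
    presenter.present(sheetVC, animated: true)
}
```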
Post not yet marked as solved
With the iOS privacy changes I'm a bit confused about when we need to show the privacy pop-up to request permission, and about what needs to be gated behind the answer.
We use mixpanel/ga, and from my reading of the iOS 14 documents it seems we may need to ask permission, even if the IDFA is not used. Is this the correct interpretation?
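For reference, the prompt in question is the App Tracking Transparency one; a minimal sketch of requesting it (requires an `NSUserTrackingUsageDescription` string in Info.plist):

```swift
import AppTrackingTransparency

// Sketch: the ATT prompt (iOS 14+). Whether your analytics setup
// requires it depends on whether the data is used for "tracking"
// as Apple defines it, not only on IDFA usage.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            break // IDFA access and tracking-linked data allowed
        default:
            break // treat as opted out
        }
    }
}
```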
Post not yet marked as solved
I'd like to intercept mouse clicks in my SCNScene while keeping the defaultCameraController active. How can I do this? I currently implement
override func mouseDown(with event: NSEvent)
on my NSViewController, but the function doesn't get called while the camera controller is active.
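One possible alternative is a gesture recognizer attached to the SCNView itself, which can coexist with the default camera controller. A sketch, assuming `scnView` is the app's SCNView:

```swift
import AppKit
import SceneKit

// Sketch: intercept clicks via NSClickGestureRecognizer instead of
// overriding mouseDown(with:), then hit-test the scene at the click point.
class SceneClickHandler: NSObject {
    let scnView: SCNView

    init(scnView: SCNView) {
        self.scnView = scnView
        super.init()
        let click = NSClickGestureRecognizer(target: self,
                                             action: #selector(handleClick(_:)))
        scnView.addGestureRecognizer(click)
    }

    @objc func handleClick(_ gesture: NSClickGestureRecognizer) {
        let point = gesture.location(in: scnView)
        if let hit = scnView.hitTest(point, options: nil).first {
            print("clicked node:", hit.node)
        }
    }
}
```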
I have the following code to create a circle Path and an SCNShape from this path. orbit_gradient.png is a horizontal 1px high image that represents a gradient.

let material = SCNMaterial()
material.isDoubleSided = true
material.lightingModel = .constant
material.diffuse.contents = UIImage(named: "art.scnassets/orbit_gradient.png")

let shapePath = Path.circle(radius: radius, segments: 512)
let orbitShape = SCNShape(shapePath)
orbitShape.materials = [material]
self.orbitNode.geometry = orbitShape

The problem I have is that when applying the texture to this geometry I'm just left with a white circle. If I set the diffuse to be UIColor.red it displays as red. What I want to accomplish is a stroked circle that appears to gradually fade, creating a rotating trail effect.
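One thing worth ruling out first: if UIImage(named:) can't resolve that path it returns nil, and a material with a nil diffuse renders as plain white (the default diffuse color), which would match the symptom exactly. A quick check, as a sketch:

```swift
import UIKit

// Sketch: verify the texture actually loads. A nil here means the
// material falls back to its default white diffuse.
if UIImage(named: "art.scnassets/orbit_gradient.png") == nil {
    print("orbit_gradient.png not found - check asset path and target membership")
}
```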
Post not yet marked as solved
The following code works when I run on an iOS 12 device and simulator, but not when I run on iOS 13 (b3) devices and simulators. The completion handler doesn't get called. I have the Media Library Usage Description key set in my Info.plist

func requestAccess() {
    print("requesting access NOW")
    MPMediaLibrary.requestAuthorization { (status) in
        print("Access returned")
        if status == .authorized {
            print("Allowed!")
        } else {
            print("Something else")
        }
    }
}
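Two things that may be worth trying while debugging this: check the current status before requesting (the prompt is only shown for `.notDetermined`), and hop to the main queue in the handler, since the callback isn't guaranteed to arrive there. A sketch:

```swift
import MediaPlayer

// Sketch: only request when undetermined, and marshal the result
// back to the main queue before touching UI.
func requestAccessIfNeeded() {
    guard MPMediaLibrary.authorizationStatus() == .notDetermined else {
        print("already determined:", MPMediaLibrary.authorizationStatus().rawValue)
        return
    }
    MPMediaLibrary.requestAuthorization { status in
        DispatchQueue.main.async {
            print("media library status:", status.rawValue)
        }
    }
}
```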