SceneKit's future

I have been developing an app using SceneKit since Swift came out. Now SwiftUI is out and uses structs rather than classes, while SceneKit is a cascade of classes. As a newbie, I am concerned that my code might be made obsolete through deprecations soon after publication, and I'd rather get ahead of the issues now.


So, is SceneKit long-term viable?


My first attempts at converting my custom classes to structs seem impossible without Apple leading the way. If I understand correctly, I can make a struct whose only member is an instance of a class, but the benefits of the struct (e.g., minimal memory, processing time) are lost.

Replies

If you want to go ultra-modern and start an app that's Swift-only, note that you can begin using RealityKit as a replacement for SceneKit. If you're not interested in AR use cases, you can set ARView's cameraMode to .nonAR.


// Don’t use an ARSession or camera passthrough
arView.cameraMode = .nonAR
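A slightly fuller sketch of what a non-AR setup might look like (assuming the ARView initializer that takes a cameraMode; the camera placement values are illustrative):

```swift
import UIKit
import RealityKit

// Sketch only: create an ARView that renders without an ARSession.
let arView = ARView(frame: .zero,
                    cameraMode: .nonAR,
                    automaticallyConfigureSession: false)

// In .nonAR mode there is no device camera, so you supply your own.
let camera = PerspectiveCamera()
camera.position = [0, 1, 3]

let cameraAnchor = AnchorEntity(world: .zero)
cameraAnchor.addChild(camera)
arView.scene.addAnchor(cameraAnchor)
```

Without an explicit camera entity, a .nonAR view may render nothing visible, which is worth keeping in mind when debugging a black screen.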

  • Wait, can RealityKit be used in place of SceneKit? I know that RealityKit offers multithreaded rendering but isn't supported on watchOS or tvOS. Aside from that, is RealityKit the successor to SceneKit?

  • In my experience with the Apple ecosystem, it looks like SceneKit is going to be deprecated soon.


Are you implying SceneKit is dead?

Hello Bobjt and thank you for this clarification.


I was about to begin a game project, and I chose SceneKit.

I went through tutorials from RW, so I would say I am SceneKit-ready.


However, I find a lot of clues here on the forum, and elsewhere on the internet, that make me think spending my time improving my SceneKit skills is not worthwhile.


You advise starting an app using RealityKit.

It makes me doubt.


I am currently beginning to learn Unity.


Would you advise me to go and learn RealityKit? And why?


Regards


Behr


There are a few considerable reasons to choose RealityKit:

- RealityKit is a multithreaded engine, whereas SceneKit is single-threaded. At a certain scale, performance differences should be noticeable.

- RealityKit implements a modern entity-component-system (ECS) architecture that makes coding more efficient.

- RealityKit provides a networking layer to implement state synchronization.

- For augmented reality (AR) apps, focus has turned to RealityKit, which builds on top of the scene mesh (ARMeshAnchors) that ARKit provides using the LiDAR scanner; SceneKit doesn't have this functionality.

- RealityKit is designed to provide a formidable API for AR apps on Apple platforms, but as mentioned above, you can also use it without AR.
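As a minimal illustration of the ECS style mentioned above (the HealthComponent type here is a hypothetical example, not a RealityKit built-in):

```swift
import RealityKit

// A custom component holding per-entity game state.
struct HealthComponent: Component {
    var hitPoints: Int
}

// Attach the component to an entity. Behavior that operates on all
// entities with a given component can then be written once, rather
// than baked into a class hierarchy as in SceneKit.
let enemy = ModelEntity(mesh: .generateBox(size: 0.2))
enemy.components[HealthComponent.self] = HealthComponent(hitPoints: 100)
```

The point of the pattern is that data (components) is separated from behavior (systems), which tends to compose better than deep subclassing.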

Dear Bobjt


Thank you for your encouraging, though corporate, response.

- Performance differences seem favorable to RealityKit.

- AR features seem very promising.


But I tried to use RealityKit without AR the way you presented it in your first response:


// Don’t use an ARSession or camera pass through
     arView.cameraMode = .nonAR


Unfortunately, this code raises Swift compiler errors at line 20.


import UIKit
import RealityKit

class ViewController: UIViewController {
   
    @IBOutlet var arView: ARView!
   
   
    override func viewDidLoad() {
        super.viewDidLoad()
       
        // Load the "Box" scene from the "Experience" Reality File
        let boxAnchor = try! Experience.loadBox()
       
        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)
    }
   
    func setupArView() {
         arView.cameraMode = .nonAR
    }
}


.../ViewController.swift:28:17: Value of type 'ARView' has no member 'cameraMode'

.../ViewController.swift:28:31: Cannot infer contextual base in reference to member 'nonAR'


Moreover, my questions would be:


1. I intend to manage light and camera. I know how to do it with SceneKit, but I cannot figure out whether it is (or will be) possible with RealityKit.

2. Can RealityKit and SceneKit be part of the same project: are they compatible?


Regards


Behr


One thing that's a blocker with RealityKit currently is the lack of support for custom geometry. We needed to show a shaded mesh during a reconstruction session, so we were driven back to ARSCNView which worked great, but of course requires abandoning ARView. Around 2:45 in this recently posted tech talk there's a comment: "We're overlaying the ARFrame image with a mesh being generated by ARKit using the LiDAR sensor... and the colors are based on a classification of what the mesh overlays" https://developer.apple.com/videos/play/tech-talks/609/


It would be extremely helpful to know if this demo used ARSCNView. Could you tell us how this was done?


Best regards,

Jim Selikoff

Hello Jim,


I do not know what the demo in the video used for rendering, however, it is certainly possible to display the mesh with classification colors using ARSCNView.


The general idea is that you should sort the geometry elements in each ARMeshGeometry (i.e. from each ARMeshAnchor) based on the per-face classification data. Then you create an SCNGeometry using init(sources:elements:); you would have one SCNGeometry for each different ARMeshClassification case.


Once you have your SCNGeometry, you can set its materials and then assign it to a node.
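A rough sketch of that approach might look like the following. Note this is an illustration under some assumptions: the faceIndices parameter stands in for the per-classification index filtering described above, which is omitted here.

```swift
import ARKit
import SceneKit

// Sketch: build one SCNGeometry for a single classification from an
// ARMeshAnchor, given the triangle indices already filtered by
// classification (faceIndices: three Int32 values per triangle).
func geometry(for meshAnchor: ARMeshAnchor,
              faceIndices: [Int32]) -> SCNGeometry {
    let vertices = meshAnchor.geometry.vertices

    // Wrap the mesh's Metal vertex buffer as an SCNGeometrySource.
    let vertexSource = SCNGeometrySource(
        buffer: vertices.buffer,
        vertexFormat: vertices.format,
        semantic: .vertex,
        vertexCount: vertices.count,
        dataOffset: vertices.offset,
        dataStride: vertices.stride)

    // Build an element from only the faces with this classification.
    let indexData = Data(bytes: faceIndices,
                         count: faceIndices.count * MemoryLayout<Int32>.size)
    let element = SCNGeometryElement(
        data: indexData,
        primitiveType: .triangles,
        primitiveCount: faceIndices.count / 3,
        bytesPerIndex: MemoryLayout<Int32>.size)

    return SCNGeometry(sources: [vertexSource], elements: [element])
}
```

You would then set a distinct material per classification on each resulting geometry and attach it to an SCNNode under the anchor's node.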

Thanks. Looking forward to the evolution of the RealityKit ecosystem!


Best regards,


Jim

Were you ever able to get nonAR mode working? I created the following in Swift Playgrounds but no luck (view is black).

import PlaygroundSupport
import RealityKit

let arView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: true)

let light = PointLight()
light.light.intensity = 10000
let lightAnchor = AnchorEntity(world: [0,0,0])
lightAnchor.addChild(light)
arView.scene.addAnchor(lightAnchor)

let plane = MeshResource.generatePlane(width: 10, depth: 10)
let material = SimpleMaterial(color: .white, roughness: 0.5, isMetallic: true)
let planeEntity = ModelEntity(mesh: plane, materials: [material])
let planeAnchor = AnchorEntity(world: [0,0,0])
planeAnchor.addChild(planeEntity)
arView.scene.addAnchor(planeAnchor)

PlaygroundPage.current.setLiveView(arView)

Hi gchiste,


Is it also possible to do morphing in RealityKit? If yes, would you mind sharing a possible workflow?

The suggestions to use ARKit / RealityKit are not a valid option for me. I'm developing a dungeon crawler for the Apple Watch, and these frameworks do not support the Apple Watch.


Can we export the created mesh model using RealityKit?
  • I had the same question; it looks like it's only possible manually, by using MDLAsset.

@rands


Can we export the created mesh model using RealityKit?

No, please file an enhancement request for this functionality using Feedback Assistant.

That being said, it is completely valid to use RealityKit for rendering, and then when it comes time to export the mesh, you would access the geometry from various mesh anchors, and either use SceneKit or Model I/O to export that geometry.
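A minimal sketch of the Model I/O export route (assuming you have already assembled your geometry into an MDLMesh; the file URL is illustrative):

```swift
import Foundation
import ModelIO

// Sketch: export previously assembled Model I/O geometry to disk.
func export(mesh: MDLMesh, to url: URL) throws {
    let asset = MDLAsset()
    asset.add(mesh)
    // Model I/O infers the output format from the file extension
    // (e.g. .obj, .stl, .usd), so choose the URL accordingly.
    try asset.export(to: url)
}
```

The main work, which this sketch leaves out, is converting the per-anchor mesh geometry into MDLMesh vertex and index buffers before export.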