Hi,
if I correctly understand what you wish to do:
1) “create pre-built models and animations for dynamic element models which can skip to particular frames in the code depending on data” — In Reality Composer, one of the behavior actions you can add is ‘Notify’, which sends a notification message to your Xcode code. Consider placing one of these actions at various points in a specific scene’s or object’s overall behavior, wherever you wish to have your Xcode code ‘do something’ in response to being notified by the ‘dynamic element’ in Reality Composer.

If, instead, you meant that you want Xcode to first respond to ‘data’ (perhaps from a Core ML model) and, based on that data stream, perform code assigned to a RealityKit scene, object grouping, or single object entity — that too is possible, but may require more intricate coding in Xcode.
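To make the two directions concrete, here is a minimal sketch of the code Xcode generates for a Reality Composer file. The names are assumptions for illustration: ‘Experience’ is the default name of the generated Swift file, and ‘frameReached’ / ‘startSpin’ stand in for whatever you name your Notify action and your notification trigger in Reality Composer.

```swift
import UIKit
import RealityKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // 'Experience.loadBox()' is the accessor Xcode generates from the
        // .rcproject file; your scene name will differ.
        guard let scene = try? Experience.loadBox() else { return }
        arView.scene.anchors.append(scene)

        // Direction 1: Reality Composer notifies your code.
        // Fires when the 'Notify' action (named "frameReached" here) runs.
        scene.actions.frameReached.onAction = { entity in
            print("Notify action fired on \(entity?.name ?? "unknown entity")")
        }

        // Direction 2: your code triggers a Reality Composer behavior,
        // e.g. in response to incoming data. Requires a behavior in the
        // scene whose trigger is 'Notification' (named "startSpin" here).
        scene.notifications.startSpin.post()
    }
}
```

The second direction is how you would drive animations from a data stream: map each data condition to a separately named Notification-triggered behavior, and `post()` the one you need.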
2) coordinating with a UIView or SKView subview, unprojecting points from the 3D space to 2D to align the elements.
If I understand correctly, you’d be using a UIView or SKView subview for the UI aspect of taking your existing app and going AR with it?
So, the desired outcome is to take the ‘coordinate space’ of a specific 3D object, or points in a 3D ‘AR box’, and ‘pancake’ them into a 2D interface? ‘Unprojecting’ may be the wrong term — strictly, unprojecting goes from 2D screen space back into 3D, and what you describe is the opposite direction: projecting, a ‘coordinate space transform’ from world space to view space.
While there may be an app for that, there’s at least Xcode. I do not know the ‘how’ of it in detail, but my internal ML says it can be done without too much bloodshed.
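For what it’s worth, RealityKit’s ARView does expose a projection API for exactly this. A minimal sketch, assuming you have an ARView and an entity whose screen position you want (the entity and label names are made up for illustration):

```swift
import UIKit
import RealityKit

// Returns where an entity appears in the ARView's 2D coordinate space,
// suitable for positioning a UIView/SKView overlay on top of it.
func overlayPosition(for entity: Entity, in arView: ARView) -> CGPoint? {
    // World-space position of the entity (relativeTo: nil = world space).
    let worldPosition = entity.position(relativeTo: nil)
    // ARView.project(_:) maps a 3D world point to 2D view coordinates;
    // it returns nil when the point is behind the camera.
    return arView.project(worldPosition)
}

// Usage (assumed names), e.g. called once per frame:
// if let p = overlayPosition(for: boxEntity, in: arView) {
//     infoLabel.center = p
// }
```

Calling this each frame (for example from an `ARSessionDelegate` update callback) keeps the 2D element aligned with the 3D object as the camera moves.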
3) Are there other approaches to consider, especially ones that would allow me to include 3D elements.
Your ‘approach’ should usually be connected to ‘the identifiable needs’ of your conversion to AR space for your app. My experience has been that you’ll benefit by trying multiple strategies and approaches to any given ‘how do I do this in RealityKit or Reality Composer?’ Being ‘open’ to applying new strategies is half the battle. AR is still a very ‘new space’ for us developers to explore. A sense of curiosity is almost as important as specific detailed knowledge about ‘this or that’.
Part of the ‘untold secret’ of effectively using the great capabilities Apple has given us with ARKit, RealityKit, Reality Composer, Reality Converter, SceneKit and SpriteKit together in the AR ‘experience’ is going back and forth between ‘what result you want’ and the ‘available capabilities of the tools’. By refining the ‘fit’ between them, strategies, approaches and solutions have fertile ground to emerge.
I hope this helps.
Best,
Catfish