ARMeshGeometry? holy cr*p... what have you done?

so let me get this straight, Apple: you obsoleted SCNGeometry,

where we could actually initialize new geometry...

You replaced it with ARMeshGeometry, but we can't actually create one of those, as it is read-only... or with "Entities", which don't seem to have half the ability to create anything...

and there is no way to use SCNGeometry in RealityKit / ARSessionDelegate without simply doing things in a deprecated, or soon-to-be-deprecated, way?

we have to use your kits that you will soon tell us are deprecated, or instead use kits that we can't actually create anything in?

are you really going to not give us a replacement for SCNGeometry/SCNNodes? "Entities" don't seem to have half the usefulness... are you going to open up ARMeshGeometry? or what in the heck is the plan here?

you should have seen the rest of this rant... before I got wise and decided to bite my lip with slightly more pressure...

let's see if the forums actually produce something with some sort of an answer, as apparently they are supposed to be improved?

Replies

I believe ARMeshGeometry is supposed to represent the real-world geometry as determined from the LiDAR (and maybe camera?) data. It's not a replacement for SCNGeometry because they do different things.
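To make that concrete, here's a rough sketch of how the mesh arrives (names are from the public ARKit API; this assumes a LiDAR-equipped device, and the class name is just an illustration):

```swift
import ARKit

// Sketch: receiving the reconstructed world mesh from ARKit.
// ARMeshGeometry is read-only: its vertices/normals/faces come as
// Metal-backed buffers you can read, not rebuild.
class MeshWatcher: NSObject, ARSessionDelegate {
    func run(on session: ARSession) {
        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .mesh   // or .meshWithClassification
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            let geometry = meshAnchor.geometry
            print("mesh chunk: \(geometry.vertices.count) vertices, \(geometry.faces.count) faces")
        }
    }
}
```

(In a real app you would also check `ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh)` before enabling it, since only LiDAR devices support it.)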
While ARMeshGeometry is a representation of the "real world", that doesn't mean we couldn't actually manipulate that mesh... or parts of it... by copying parts of it and transferring them to... well... there is NOTHING AT ALL to transfer them to... but that doesn't matter, as Apple apparently has absolutely nothing to replace SCNGeometry with either?

so we have this black box, showing us all this nice data... and absolutely no way to put in geometry except for what appears to be cubes and planes for student projects and/or simple stuff...

the worst part is they made it so we can't even use the old SCNGeometry at all in the same view.... so we are left with basically nothing?

which part of that decision does anyone think makes sense?

obviously this will change in the future..... but what in the heck... why close down SCNGeometry with nothing to replace it?

I am literally left holding this device, realizing I can't actually use it to create anything of value... head sunk deeply in hands.

and yes it is beta, but what are we supposed to test? that this black box is showing us data that we can't manipulate to make suggestions?

so here is the feedback for the beta... (nice view wish I were there?)

what is it that we are testing? finding problems with? I can't tell if there are any problems as I read in this data... and can't do anything with it...

I can gather it, I can put it in a file... look at it... (yes, nice view) but I can't read it back in... can't change it... can't... well, you get the idea. Let me have something to test... I have a feeling they will release many more devices very soon... and people are going to be wondering where all the apps are (besides the student project showing us cubes, and nothing but cubes).

well I guess it is great for Minecraft games :)


Is there a particular project or task you are looking to accomplish that you could before, but are unable to now? As far as I understand it, SCNGeometry has not been deprecated in any way.

To your question: no, there is no way to use SCNGeometry in RealityKit, as RealityKit serves as an alternative to SceneKit (with regard to Augmented Reality apps), not as a replacement. There are many cases in which SceneKit proves to be the more suitable choice for Augmented Reality, and other cases where RealityKit is the more suitable choice.

ARMeshGeometry is a component of ARKit, totally independent of SceneKit or RealityKit. ARKit runs the AR experience, whereas SceneKit and RealityKit handle the rendering of 3D content for use in AR. You can use ARMeshGeometry alongside SceneKit, though from your posts it seems like you are looking for an easy way to take the ARMeshGeometry and create an SCNGeometry from it, which is not something it provides on its own.

My understanding may be a bit more rudimentary, but I find the ARMeshGeometry (which includes the classification of the "type" of surface the mesh's faces belong to) to be incredibly valuable for the types of apps I am looking to build. There are many different use cases for AR; many want to use the LiDAR camera to build 3D representations of the world around them, creating point clouds and such, though for me personally, creating 3D content that interacts with doors, walls, seats, ceilings, etc. is useful. If you clarify what you are trying to achieve with ARMeshGeometry and SCNGeometry, perhaps others will have thoughts on how to achieve it.
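As a hedged sketch of that classification point: each face of an ARMeshGeometry carries one classification value when the session runs with `.meshWithClassification`. The buffer walk below mirrors the pattern in Apple's sample code (one UInt8 per face), but treat the offset/stride arithmetic as an assumption to verify on a device:

```swift
import ARKit

// Read the surface classification (wall, floor, seat, ...) of one face.
// Assumes configuration.sceneReconstruction = .meshWithClassification;
// otherwise geometry.classification is nil and we fall back to .none.
func classification(ofFace index: Int,
                    in geometry: ARMeshGeometry) -> ARMeshClassification {
    guard let source = geometry.classification else { return .none }
    // One UInt8 classification value per face, laid out in a Metal buffer.
    let pointer = source.buffer.contents()
        + source.offset
        + source.stride * index
    let raw = pointer.assumingMemoryBound(to: UInt8.self).pointee
    return ARMeshClassification(rawValue: Int(raw)) ?? .none
}
```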
To your question: that is like saying "just ignore LiDAR"... there is a huge thing that I cannot do, nor can anyone else... use the RealityKit mesh, change it, put different textures on it, and then use it in SceneKit, all in real time... (you can't really run them together in a way that would work)

which of course is not really the way to go; we should just be using RealityKit... and then, within it, create sophisticated meshes in real time, simply by taking the created mesh and adding to it, changing it, texturing it, and of course adding our own pieces sprinkled in... we can add cubes and spheres sprinkled in... maybe great for games, but for productivity/work/professional types of software, it is extremely limiting.

The problem is that I could do things in SceneKit in real time, but within that I have no access to the mesh or LiDAR from RealityKit, of course. This is the thing (the mesh) that could really anchor some very cool work getting done.

SceneKit's days are now numbered, SCNGeometry's days are now numbered... which would be fine if we had some sort of replacement... I am sure it will come. I am worried, though, that there isn't even a mention of it... which could mean a year or more.

but why did they just block SCNGeometry from working when they don't have a replacement yet? very disheartening... I am left "pretending" something exists to try and get code working... fully frustrating to peer into this geometry and not be able to create with it.

I'd much rather be wishing the mesh was tighter and knowing it will improve in the next few years, instead of wishing I could do anything with the mesh at all.

with the start of SceneKit/ARKit I was busy making apps and had one ready by the time ARKit was official;
now I am watching the time tick away as RealityKit drops to more and more devices with LiDAR (most likely very soon), and I don't have anything to show for it.

I feel like a ball and chain got tied to my neck, and someone threw the ball in the pond.


Sorry if I misunderstood your question, but I'm still a bit confused. SCNGeometry has not been deprecated in any way, and you can use the mesh data gathered from the LiDAR camera in SceneKit, alongside SCNGeometry. You are not required to use RealityKit to work with the LiDAR camera. The mesh data gathered from the LiDAR camera comes as an ARMeshGeometry, which is independent of RealityKit or SceneKit.
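For what it's worth, here's a rough, unverified sketch of that idea: wrapping an ARMeshAnchor's Metal buffers in SceneKit types so the scanned mesh can be shown (and textured) in an ARSCNView. The initializer names come from the public SceneKit/ARKit APIs, but treat the details as assumptions to check on a device:

```swift
import ARKit
import SceneKit
import UIKit

// Build an SCNNode that renders a LiDAR mesh chunk in SceneKit.
func node(for meshAnchor: ARMeshAnchor) -> SCNNode {
    let mesh = meshAnchor.geometry

    // Vertex positions, straight from the Metal buffer ARKit already filled.
    let vertexSource = SCNGeometrySource(
        buffer: mesh.vertices.buffer,
        vertexFormat: mesh.vertices.format,
        semantic: .vertex,
        vertexCount: mesh.vertices.count,
        dataOffset: mesh.vertices.offset,
        dataStride: mesh.vertices.stride)

    // Triangle indices; SCNGeometryElement wants them as Data.
    let faces = mesh.faces
    let indexData = Data(bytes: faces.buffer.contents(),
                         count: faces.buffer.length)
    let element = SCNGeometryElement(
        data: indexData,
        primitiveType: .triangles,
        primitiveCount: faces.count,
        bytesPerIndex: faces.bytesPerIndex)

    let geometry = SCNGeometry(sources: [vertexSource], elements: [element])
    geometry.firstMaterial?.diffuse.contents = UIColor.cyan // any texture here

    let node = SCNNode(geometry: geometry)
    node.simdTransform = meshAnchor.transform
    return node
}
```

(If you return nodes from `ARSCNViewDelegate`'s `renderer(_:nodeFor:)` instead, SceneKit applies the anchor transform for you, so you would skip the last assignment.)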

To your point, while I have no more knowledge than any other developers on these forums, I have no reason to suspect SceneKit has been deprecated in any way. There are many cases in which I opt to use SceneKit in AR apps, as no suitable alternative exists in RealityKit. Many Apple sample projects related to ARKit still leverage SceneKit, including projects posted from the latest WWDC 2020, so there certainly is capability across the board.

If you're willing to share what it is you are trying to accomplish, perhaps someone here would be able to help. Speaking for myself, I've jumped into RealityKit where prudent for my AR apps but am using SceneKit in other cases, and I would imagine that some of the talented developers here could offer tips on how to get started with either framework, depending on your needs.
it is impossible to use a SceneKit view at the same time as a RealityKit view in real time...
so you cannot do real-time things that one could do before with SceneKit.

you can gather the data, but you cannot use it in real time. Imagine using SceneKit as before and literally doing anything more complex than a cube/sphere... then texturing it in real time, adding corners in real time, or anything else that we could do in SceneKit...

that cannot be done in RealityKit in real time, nor can one use SceneKit over RealityKit to do it (you cannot see/use the camera in both at the same time)... you would have to stop the session, start a new SceneKit session, and look at what is basically a model that has nothing to do with the current image coming into the camera.