Is it just me, or does anyone else feel like SwiftUI is a new language built by an engineer who didn't realize storyboards exist, or didn't know how to use them?
I am watching these videos about "how magical it is" to use SwiftUI, then watching an engineer type in code to draw views and cells... Magical in what way? That he is able to recall the code to type from memory?
I kept thinking: but... but... I can just drag a view into a storyboard, move it around, and play with it there as a graphical interface... which actually is magical. I can drag in a table view and a cell view and actually stretch and move them to what I want, drag in an image or a thumbnail view and size it graphically, whereas this engineer decided to go digging through menus to find aspect ratios that look suspiciously like a storyboard's menu views.
So why didn't another set of engineers also create the "preview" and the tying of data structures for storyboards, then call it "meta-magical"?
Now we have an entire new set of language syntax to do exactly the same thing Swift can already do? What sense in the world does that make? Swift is barely several years old, and now we have to learn a different syntax? And that is if we want to type everything, since this engineer apparently decided that was easier than the "preview" view and dragging things into it (just like in storyboards, by the way) to actually make the magic happen... It left me wondering whether it actually worked by the time the videos were made.
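For anyone who has not watched the sessions, this is roughly the kind of typed-in view code I mean. A minimal sketch of my own, not taken from any video; the names are illustrative:

```swift
import SwiftUI

// The typed-in equivalent of dragging a table view and a prototype cell
// into a storyboard: a list of rows, each with a sized image and a label.
struct ContentView: View {
    let items = ["First", "Second", "Third"]

    var body: some View {
        List(items, id: \.self) { item in
            HStack {
                Image(systemName: "photo")
                    .resizable()
                    .aspectRatio(contentMode: .fit)
                    .frame(width: 44, height: 44)
                Text(item)
            }
        }
    }
}
```

Every stretch and resize you would do by dragging in a storyboard becomes one of those typed modifiers.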
Again, where is the magic? Typing in syntax out of thin air? I so wanted to ask this engineer to let me build that in a storyboard and show him how to use storyboards... (that was the "but why?")
So anyway, Apple engineers, how about this: some of you keep going with storyboards too, hooking in data structures, live previews, and such... (Of course this would make SwiftUI sort of... well, extra for no reason, so I am not going to hold my breath.) I guess it would depend on who has more pull at Apple: the Storyboard/XIB people, who probably all retired rich for doing a great job and so are no longer there to say "but... but...", or the new engineers who apparently didn't know storyboards existed and came from some dark room where typing in syntax is cool... (cough) Microsoft / engineering schools that have never seen Xcode, XIBs, or storyboards (uncough).
Or is Apple about to abandon Swift too?
Struggling to find a replacement for SCNGeometry to tie to ARAnchors
the "AnchorEntity" class with a model MeshResource appears to be for school students building "boxes"/cubes...
ARMeshGeometry appears to be fully locked down... it appears you can't even create one...
MDLMesh appears to be the only way to go, which is fine, except I have not found the proper way to add it to an "AnchorEntity".
I can take the ARMeshGeometry and strip out the vertices and normals and such to construct "something"... in the new "reality" of RealityKit, which for some reason decided to strand SCNNodes. That is fine, except there are no ARNodes that would let us take a few years of work and practical examples and construct real-time nodes to interact with AR as we did before... (or did I miss a big chunk here?) The stripping-out half is sketched below.
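These helpers follow the buffer-reading pattern in Apple's LiDAR scene-reconstruction sample; a sketch that reads straight out of the underlying Metal buffers:

```swift
import ARKit

// Read an ARMeshGeometry's contents directly from its Metal buffers.
extension ARMeshGeometry {
    // The vertex at a given index, stored as three packed Floats.
    func vertex(at index: Int) -> SIMD3<Float> {
        assert(vertices.format == .float3)
        let pointer = vertices.buffer.contents()
            .advanced(by: vertices.offset + vertices.stride * index)
        let v = pointer.assumingMemoryBound(to: (Float, Float, Float).self).pointee
        return SIMD3<Float>(v.0, v.1, v.2)
    }

    // The vertex indices of one triangular face.
    func vertexIndices(ofFace faceIndex: Int) -> [UInt32] {
        assert(faces.bytesPerIndex == MemoryLayout<UInt32>.size)
        let perFace = faces.indexCountPerPrimitive
        return (0..<perFace).map { offset in
            faces.buffer.contents()
                .advanced(by: (faceIndex * perFace + offset) * MemoryLayout<UInt32>.size)
                .assumingMemoryBound(to: UInt32.self).pointee
        }
    }

    // The per-face classification (.floor, .wall, ...), if present.
    func classification(ofFace faceIndex: Int) -> ARMeshClassification {
        guard let classification = classification else { return .none }
        let value = Int(classification.buffer.contents()
            .advanced(by: classification.offset + classification.stride * faceIndex)
            .assumingMemoryBound(to: UInt8.self).pointee)
        return ARMeshClassification(rawValue: value) ?? .none
    }
}
```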
In other words, there needs to actually be a middle ground between school-student projects and fully ray-traced Metal objects... there are several extremely practical use cases for this middle ground across many professions. (If I missed it, I apologize in advance.) I am looking at several years of work that cannot be translated yet, as far as I can tell... I am sure I can do it with a few months of work, just with a code base that reads like something with a machine-language flavor... which is not fun...
In other words, what is this "something" I spoke of above? Can you provide a code example that takes the result of one ARMeshGeometry being created by LiDAR, takes a specific piece of it like the ".floor" faces, constructs this "something" from it, gives it a "false" color, and ties it to an "AnchorEntity"? That would be an awesome example.
It seems like there would be some sample code showing this, but I have not found it yet. Does it exist? If not, could you show a quick example of doing it? Thanks in advance. My own rough attempt is below.
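This is as far as I get on my own, using the helpers sketched above. The honest part: I can find no RealityKit API (as of the iOS 14 betas) that turns these vertices into a MeshResource, so a generated plane stands in as the false-colored "something"; the function name and the plane's size are mine:

```swift
import ARKit
import RealityKit

// Collect the .floor faces of one LiDAR mesh anchor and mark them with a
// false-colored placeholder. The plane is a stand-in: building a real
// MeshResource from the collected vertices is exactly the missing piece.
func markFloor(of meshAnchor: ARMeshAnchor, in arView: ARView) {
    let geometry = meshAnchor.geometry
    var floorVertices: [SIMD3<Float>] = []

    for faceIndex in 0..<geometry.faces.count
        where geometry.classification(ofFace: faceIndex) == .floor {
        for index in geometry.vertexIndices(ofFace: faceIndex) {
            floorVertices.append(geometry.vertex(at: Int(index)))
        }
    }
    guard !floorVertices.isEmpty else { return }

    // Centroid of the floor vertices, in the mesh anchor's local space.
    let centroid = floorVertices.reduce(SIMD3<Float>.zero, +)
        / Float(floorVertices.count)

    let anchor = AnchorEntity(world: meshAnchor.transform)
    let marker = ModelEntity(
        mesh: .generatePlane(width: 0.5, depth: 0.5),
        materials: [SimpleMaterial(color: .green, isMetallic: false)])
    marker.position = centroid
    anchor.addChild(marker)
    arView.scene.addAnchor(anchor)
}
```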
Also, why in the world did Apple construct ARMeshGeometry the way it did? One thing a person looks at immediately in the LiDAR mesh sample app is the classification... of course it is just doing it on the fly...
A great suggestion would be a 'switch' that puts all ".floor" faces in one mesh and ".wall" faces in another upon some trigger: keep them mixed by default, but give the user a 'trigger' to "reorganize" them at a point of their choosing. I am not sure how that was not built in. Sure, we can separate them out ourselves, but that is one more real-time process that could have been avoided (on the developer end)...
The same goes for the horizontal/vertical planes constructed out of faces... one would think it would be natural to put those in their own ARMeshGeometry? (They are, but they are together, and it is one more process to discover which mesh they are actually in, as far as I can tell.) I don't think it would have been hard to simply name the anchor and provide a "switch" to put them in their own meshes (horizontal in one, vertical in another)...
Why make the process slower by having us do it on our own? The separating-out I mean amounts to the bucketing sketched below.
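A sketch of the per-frame bucketing every developer currently writes for themselves, again using the helpers above:

```swift
import ARKit

// Partition a mesh's face indices by classification, so .floor, .wall,
// and the rest can each be handled as their own group.
func facesByClassification(
    in geometry: ARMeshGeometry
) -> [ARMeshClassification: [Int]] {
    var buckets: [ARMeshClassification: [Int]] = [:]
    for faceIndex in 0..<geometry.faces.count {
        buckets[geometry.classification(ofFace: faceIndex), default: []]
            .append(faceIndex)
    }
    return buckets
}
```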
Anyway, any help would be appreciated...
So let me get this straight, Apple: you obsoleted SCNGeometry,
where we could actually initialize new geometry...
You replaced it with ARMeshGeometry, which we can't actually create, as it is read-only... or with "Entities", which don't seem to have half the ability to create anything...
And there is no way to use SCNGeometry in RealityKit / ARSessionDelegate without simply doing things in a deprecated, or soon-to-be-deprecated, way?
We have to use your kits that you will soon tell us are deprecated, or instead use kits in which we can't actually create anything?
Are you really not going to give us a replacement for SCNGeometry/SCNNodes? "Entities" don't seem to have half the usefulness... Are you going to open up ARMeshGeometry? Or what in the heck is the plan here?
You should have seen the rest of this rant... before I got wise and decided to bite my lip with slightly more pressure...
Let's see if the forums actually produce something with some sort of an answer, as apparently they are supposed to have improved?
Well, might as well see if anything has changed with the forums, or if we get the same zero answers, or worse...
Anyway: Xcode 12 beta, running the app on an iPad Pro with the iOS 14 beta, from macOS 10.15.4.
No errors running the app in Xcode 11.5; same example app, "VisualizingAndInteractingWithAReconstructedScene".
In Xcode 12 I get the error/warning: "Could not launch “<name of app here>” LLDB provided no error string." Might be dark mode... wrong debugger? I didn't make any changes to the projects and tried three projects, so a debugger setting seems unlikely...
It seems to happen with any project, but I have only run three so far...
I am also getting something that shows as an "error" but does not affect running the app from Xcode: "Failed _shouldMakeReadyForDevelopment check even though device is not locked by passcode."
Let us see if the forums actually improved... or if it is the same ole' same ole'.