In the presentation there was a reference to a parent/child relationship. The parent of doors and windows might be a wall, but the relationship between walls, or between multiple windows or doors, would be that of a "sibling" or "peer" component, to keep the object hierarchy logically consistent. I would also recommend a Junction object, which conveys more than an edge does: it would identify the two peer objects being joined and the angle of the junction between them. A junction could also describe its own shape, capturing any curvature or discontinuity along the joint. Multiple junctions might themselves be peers, since adjoining surfaces meet at further junctions, giving a more complete description of the room's structure. The parent of all of these surfaces and junctions would be the room itself. Such a description would be useful in an architectural review of the room structure.
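A minimal sketch of what such a Junction type could look like, assuming surfaces are referenced by UUID (as RoomPlan's CapturedRoom.Surface identifiers are); the Junction and RoomGraph types, their fields, and the Profile cases are hypothetical and not part of the RoomPlan API:

```swift
import simd

/// Hypothetical junction descriptor -- not part of RoomPlan.
struct Junction {
    /// Shape of the joint between the two peer surfaces.
    enum Profile {
        case sharp                      // plain crease at a fixed angle
        case curved(radius: Float)      // filleted/rounded joint
        case discontinuous(gap: Float)  // surfaces that do not actually meet
    }

    /// Identifiers of the two peer surfaces being joined (e.g. two walls).
    let surfaceA: UUID
    let surfaceB: UUID

    /// Dihedral angle between the two surfaces, in radians.
    let angle: Float

    /// Shape of the joint, capturing curvature or discontinuity.
    let profile: Profile

    /// Endpoints of the shared edge in room coordinates.
    let start: simd_float3
    let end: simd_float3
}

/// The room as the parent of its peer surfaces and the junctions between them.
struct RoomGraph {
    var surfaces: [UUID]        // walls, doors, windows, openings
    var junctions: [Junction]   // joints between peer surfaces
}
```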
I was happy to see the improvements on the capture side of RoomPlan, particularly the polygon handling for walls. Unfortunately, walls with five vertices (great-room ceilings, garage exteriors, etc.) are displayed in the viewers as having four vertices: the top edge of the wall is flattened at the height of the highest vertex instead of following the two edges up to the peak. I presume this means the viewers assume every wall has four corners and use the uppermost point of the extent rather than projecting the edges to the point where they meet at the top. Which viewer handles these walls correctly? I have imagery showing the problem, but the upload here only accepts plain file types and will not take a USDZ file, even one renamed with a .txt extension.
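One way to confirm that the captured data itself carries all five corners (and that the flattening is purely a viewer issue) is to dump each wall's polygonCorners after a scan. A minimal sketch, assuming iOS 17's polygonCorners property on CapturedRoom.Surface and a finished CapturedRoom in hand:

```swift
import RoomPlan
import simd

/// Print the polygon outline of every wall in a finished capture,
/// so a 5-cornered gable wall can be distinguished from a 4-cornered one.
func dumpWallPolygons(from room: CapturedRoom) {
    for (index, wall) in room.walls.enumerated() {
        let corners = wall.polygonCorners   // [simd_float3], iOS 17+
        print("Wall \(index): \(corners.count) corners")
        for corner in corners {
            print(String(format: "  (%.3f, %.3f, %.3f)", corner.x, corner.y, corner.z))
        }
    }
}
```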
The definition of polygonCorners given in the documentation at https://developer.apple.com/documentation/roomplan/capturedroom/surface/polygoncorners is woefully lacking. To properly link the sides of a polygon from the definition of a corner, each corner needs an incoming and an outgoing edge descriptor pointing to its adjacent corners. Just as a line has two endpoints, each edge of a polygon has two corners, and in a 2D polygon the corners form an ordered set that can be traversed forwards and backwards. In a 3D polygon, a corner would likely need at least a third connection to give that point depth; in general 3D geometry a single point may connect to several points in other directions, much like the facet junctions of a brilliant-cut diamond. This sibling relationship among corner points would complicate the definition, but not impossibly so.
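As a practical workaround, the adjacency can be recovered if polygonCorners is treated as an ordered outline of the surface: consecutive entries (wrapping around at the end) form the polygon's edges. A minimal sketch under that assumption; the Edge type and buildEdges helper are hypothetical, not part of RoomPlan:

```swift
import RoomPlan
import simd

/// Hypothetical edge descriptor linking one corner to the next.
struct Edge {
    let from: simd_float3
    let to: simd_float3
}

/// Reconstruct incoming/outgoing linkage from an ordered corner list,
/// assuming polygonCorners lists the outline in order and the polygon is closed.
func buildEdges(for surface: CapturedRoom.Surface) -> [Edge] {
    let corners = surface.polygonCorners
    guard corners.count >= 3 else { return [] }
    return corners.indices.map { i in
        Edge(from: corners[i], to: corners[(i + 1) % corners.count])
    }
}
```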
The application fails almost immediately after initial entry to the rendering code, with the following messages:
Could not locate file '.' in bundle.
Class for component already registered
Registering library () that already exists in shader manager. Library will be overwritten.
Resolving material name 'engine:BuiltinRenderGraphResources/AR/suFeatheringCreateMergedOcclusionMask.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
What additional packages need to be updated for this to run successfully? I have updated Xcode and the other packages available in the beta set.
I am running the RoomPlan demo app and keep getting the errors above. When I try to find somewhere to obtain the missing archive in the Metal libraries, my searches come up blank; no files show up in a search containing those identifiers. A number of messages about "deprecated" interfaces are displayed as well. Is it normal to ship demo apps that are hobbled in this way?
Precisely what is the difference between measurements derived from the LiDAR camera's depth data and those provided by the TrueDepth camera? Where is the camera selection done, so I can examine the code that sets up the LiDAR camera?
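For reference, RoomPlan does not expose its camera selection; its internal ARSession setup is not public. At the ARKit level, LiDAR-derived depth is typically enabled through frame semantics on the session configuration, which is the switch that distinguishes LiDAR scene depth from TrueDepth capture. A minimal sketch of that pattern, assuming an app-managed ARKit session:

```swift
import ARKit

// Enable LiDAR-derived scene depth on a world-tracking session, when supported.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    configuration.frameSemantics.insert(.sceneDepth)   // requires a LiDAR-equipped device
}

let session = ARSession()
session.run(configuration)
```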
Why are there differences between the sample application in the download and the code presented in the video? Specifically, the imports differ. I noticed that when I referenced some of the AR functions in a change I made, the code would not compile cleanly until I added imports of ARKit and RealityKit, even though both are used by the RoomPlan import. Does the scope of definitions get limited to the imported symbols?
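In general, Swift imports are not transitive: importing RoomPlan does not re-export ARKit or RealityKit symbols into your file, so any type you name from those frameworks needs its own import. A minimal sketch of the imports a view controller touching all three frameworks typically needs (the class and property names are placeholders, not from the sample app):

```swift
import UIKit
import RoomPlan
import ARKit        // needed as soon as you name ARKit types (e.g. ARWorldTrackingConfiguration)
import RealityKit   // needed as soon as you name RealityKit types (e.g. Entity)

// Placeholder class; names are illustrative only.
final class ScanViewController: UIViewController {
    var captureView: RoomCaptureView?                   // RoomPlan symbol
    var worldConfig = ARWorldTrackingConfiguration()    // compiles only with `import ARKit`
    var previewEntity: Entity?                          // compiles only with `import RealityKit`
}
```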
I have some confusion over the definition of MediaType versus DeviceType in these calls. Because of the similarities between the various cameras, and the vague dichotomy between TrueDepth and LiDAR data for measuring depth, it seems to me the calls may need some refinement. My own preference would be to pass an attribute mask, with a bit for each device characteristic and the bits prioritized so that the best of the available devices is chosen. For example, the TrueDepth and LiDAR cameras both provide depth data, but LiDAR is preferable for accuracy; some applications could get by with the TrueDepth camera's photogrammetry-style depth if the LiDAR camera were unusable for some reason. I could see an option meaning "depth, but not LiDAR" being selected. The same applies to telephoto lenses versus lenses combined with wide or ultra-wide (or hyper-wide, later). Audio streaming would have the same kind of issue, with performance levels for input and output being narrowly defined. The question also arises why DeviceType allows an array to be passed but MediaType does not; it would seem reasonable for both.
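For what it's worth, AVFoundation's AVCaptureDevice.DiscoverySession already gives a prioritized-fallback behavior along these lines: the deviceTypes array is ordered by preference, while mediaType is a single filter. A minimal sketch, assuming the goal is "prefer LiDAR depth, fall back to TrueDepth":

```swift
import AVFoundation

// Preference-ordered device list: LiDAR first, TrueDepth as a fallback.
// The discovery session returns matching devices; taking the first honors the ordering.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInLiDARDepthCamera, .builtInTrueDepthCamera],
    mediaType: .video,
    position: .unspecified
)

if let device = discovery.devices.first {
    print("Selected \(device.localizedName)")
} else {
    print("No depth-capable camera available")
}
```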