Reply to Some question about visionOS
Number one is prevented for privacy reasons. There are ways to communicate to the system which shape should be used when a hover occurs, but there is no way to act on the hover event itself. The hover highlighting is even performed out-of-process so it can't be used maliciously, for example by ad companies.

Number two I don't know.

Number three may be called "billboarding"; there is an example of that in the Swift Splash sample code.
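A minimal sketch of point one (my own example, not from the original question): attaching a HoverEffectComponent to an entity that also has collision and input-target components lets the system draw the gaze highlight out-of-process, without your code ever receiving the hover event.

```swift
import RealityKit

// Sketch: declare an entity as hoverable. The system renders the
// highlight itself; the app never observes where the user is looking.
func makeHoverableEntity() -> ModelEntity {
    let entity = ModelEntity(mesh: .generateSphere(radius: 0.1))
    // The collision shape defines where the hover effect applies.
    entity.generateCollisionShapes(recursive: false)
    entity.components.set(InputTargetComponent())
    entity.components.set(HoverEffectComponent())
    return entity
}
```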
Nov ’23
Reply to Document Based SwiftData support in Immersive Scene?
I noticed that once I open a second ModelContainer on the same URL as the document that's open in the DocumentGroup, all saves to the DocumentGroup's model container fail with:

```
Error saving the database Error Domain=NSCocoaErrorDomain Code=134020 "The model configuration used to open the store is incompatible with the one that was used to create the store." UserInfo={NSAffectedObjectsErrorKey=<NSManagedObject: 0x6000021b97c0> (entity: Blah; id: 0x60000026e0c0 <x-coredata:///Blah/tAC19CF5F-052B-4CF6-B7CD-EDA188FC54BE13>; data: {
    id = "91E56F61-CFE0-42E4-9EA9-EAD4256B64AB";
    imageData = nil;
    name = "Untitled Blah";
})}
```

I'll assume this is just not a good idea right now, and conclude that document-based SwiftData apps do not work well with multiple scenes.
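For what it's worth, error 134020 points at a configuration mismatch, so a sketch of a possible workaround (names like `Blah.store` are hypothetical, and this is untested with DocumentGroup) would be to build every container on that URL from an identical ModelConfiguration:

```swift
import SwiftData

// Sketch: if a second container must open the same store URL, give it
// the same ModelConfiguration the first one used, since a mismatched
// configuration is exactly what Code=134020 complains about.
let url = URL.documentsDirectory.appending(path: "Blah.store")
let config = ModelConfiguration(url: url)
let container = try ModelContainer(for: Blah.self, configurations: config)
```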
Nov ’23
Reply to ARKit, visionOS: Creating my own data provider
Most of the data providers give you mesh information that you then need to place into an entity as a collision component. If you're trying to get planes or world meshes to test interactions against, you might try adding those entities yourself instead of going through the ARKit providers. I did this with some planes so I could test plane interactions, like placing things on walls.
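The stand-in approach above can be sketched like this (my own example; the sizes and placement are arbitrary assumptions):

```swift
import RealityKit

// Sketch: a static "wall" plane for testing placement interactions,
// instead of waiting for meshes from an ARKit data provider.
func makeTestWall() -> ModelEntity {
    let wall = ModelEntity(
        mesh: .generatePlane(width: 2.0, depth: 1.5),
        materials: [SimpleMaterial(color: .gray, isMetallic: false)]
    )
    // Collision shape so gestures and raycasts can hit it.
    wall.generateCollisionShapes(recursive: false)
    wall.components.set(InputTargetComponent())
    // Tilt the horizontal plane upright and place it in front of the user.
    wall.orientation = simd_quatf(angle: .pi / 2, axis: [1, 0, 0])
    wall.position = [0, 1.5, -2]
    return wall
}
```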
Nov ’23
Reply to Retrieve Normal Vector of Tap location? RayCast from device(head)?
Alright, this worked for me.

NOTE: There's a bit of oddness with raycasting to a tap gesture's location. Sometimes it fails, which is confusing given that the tap succeeded. Maybe I'm not converting the locations correctly? Maybe it works better on device?

In a tap gesture handler, get the tap location on a collision shape with:

```swift
let worldPosition: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)
```

With a running WorldTrackingProvider you can get the current device pose with:

```swift
let pose = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime())
```

Then process it like so to get the device's position in world space:

```swift
let transform = Transform(matrix: pose.originFromAnchorTransform)
let locationOfDevice = transform.translation
```

You can then do a raycast from the device to the tap location in world coordinate space:

```swift
let raycastResult = scene.raycast(from: locationOfDevice, to: worldPosition)
```

If successful, each entry in the raycast result will have normal information. Here I grab the first one:

```swift
guard let result = raycastResult.first else {
    print("NO RAYCAST HITS?????")
    return
}
let normal = result.normal
```

Make a quaternion to rotate from identity to the normal vector's angle:

```swift
// Calculate the rotation quaternion to align the up axis with the normal vector
let rotation = simd_quatf(from: SIMD3<Float>(0, 1, 0), to: normal)
```

Apply it to an entity:

```swift
cylinder.transform.rotation = rotation
```
Nov ’23