
Setting volumetric window size at openWindow()
When defining a volumetric WindowGroup, I can set defaultSize(). Is it possible to set a different volume size when opening a window with openWindow()? In my use case, I want to display models of different sizes inside the volumetric window while preserving each model's actual size, so I would like to create a volumetric window that is optimally sized for each model. Alternatively, I could create a volumetric window large enough to fit the largest model and then reposition smaller models to the front and bottom of the volume, but I haven't figured out how to do that either (see my separate post on that question).
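For context, here is a minimal sketch of my current setup (the window-group id, the String payload, and the ModelView/LauncherView types are placeholder names of mine). As far as I can tell, defaultSize(_:in:) is fixed when the scene is declared, and openWindow(id:value:) only varies the payload, not the volume's size:

import SwiftUI

@main
struct ModelViewerApp: App {
    var body: some Scene {
        // The volume's size appears to be fixed at scene-declaration time;
        // only the payload (which model to show) varies per openWindow call.
        WindowGroup(id: "modelVolume", for: String.self) { $modelName in
            ModelView(modelName: modelName ?? "cube")
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 1.0, height: 1.0, depth: 1.0, in: .meters)
    }
}

struct ModelView: View {
    let modelName: String
    var body: some View {
        Text(modelName)   // stand-in for the RealityView that renders the model
    }
}

struct LauncherView: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Show terrain") {
            // Chooses which model to show, but not the volume's size.
            openWindow(id: "modelVolume", value: "terrain")
        }
    }
}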
0 replies · 0 boosts · 624 views · Sep ’23
Placement of model inside volumetric window?
I am having trouble placing a model inside a volumetric window. I have a model (just a simple cube created in Reality Composer Pro that is 0.2 m on a side and centered at the origin) and I want to display it in a volumetric window that is 1.0 m on a side while preserving the cube's original 0.2 m size. The small cube appears to sit flush against the back and top of the larger volumetric window. Is it possible to set the model's initial position inside the volume? For example, can the model be placed flush against the bottom and front of the volumetric window? (Note: the actual use case is placing 3D terrain, which tends to be mostly flat like a pizza box, flush against the bottom of the volumetric window.)
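The closest I've gotten is the sketch below, which converts the view's 3D frame into RealityKit coordinates and pins the entity to the bottom-front of the resulting box. The "Cube" scene name and the RealityKitContent bundle are placeholders from my Reality Composer Pro package, and the convert(_:from:to:) pattern is the one used in Apple's visionOS samples:

import SwiftUI
import RealityKit
import RealityKitContent   // Reality Composer Pro package (name assumed)

struct VolumeContentView: View {
    var body: some View {
        GeometryReader3D { geometry in
            RealityView { content in
                guard let model = try? await Entity(named: "Cube", in: realityKitContentBundle) else { return }
                // The volume's bounds, in RealityKit's meter-based coordinates.
                let volumeBounds = content.convert(geometry.frame(in: .local),
                                                   from: .local, to: content)
                let modelBounds = model.visualBounds(relativeTo: nil)
                // Flush against the bottom (min y) and the front (max z) of the volume.
                model.position = [
                    0,
                    volumeBounds.min.y - modelBounds.min.y,
                    volumeBounds.max.z - modelBounds.max.z
                ]
                content.add(model)
            }
        }
    }
}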
0 replies · 0 boosts · 512 views · Sep ’23
Xcode RC didn't have an option for the visionOS simulator
I just downloaded the latest Xcode release candidate, Version 15.0 (15A240d), and ran into some issues:

On startup, I was not given an option to download the visionOS simulator.
I cannot create a project targeting visionOS.
I cannot build/run a hello-world app for visionOS.

In my previous Xcode beta (Version 15.0 beta 8 (15A5229m)), there was an option to download the visionOS simulator, and I could create projects for visionOS and run the code in the simulator. The downloaded file was named "Xcode" instead of "Xcode-beta". I didn't want to get rid of the existing Xcode, so I selected Keep Both. Now I have three Xcodes in the Applications folder:

Xcode
Xcode copy
Xcode-beta

That is the only thing I can see that might have been different about my install.

Hardware: Mac Studio 2022 with M1 Max, macOS Ventura 13.5.2

Any idea what I did wrong?
2 replies · 0 boosts · 1.7k views · Sep ’23
PerspectiveCamera in portrait and landscape modes
I have an ARView in nonAR cameraMode with a PerspectiveCamera. When I rotate my iPhone from portrait to landscape, the content shrinks. For example, the attached image shows the same scene with the phone in portrait and landscape; the blue cube is noticeably smaller in landscape. The size of the cube relative to the vertical space (i.e., the height of the view) is consistent in both cases. Is there a way to keep the scene (e.g., the cube) the same size whether I am in portrait or landscape mode?
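The one workaround I've found so far is to compensate manually: fieldOfViewInDegrees on PerspectiveCamera is the vertical field of view, so scaling it with the view's height keeps on-screen sizes roughly constant. A sketch (the 60° base value and the resize hook are my own choices, not anything from Apple):

import RealityKit
import UIKit

final class CameraController {
    let camera = PerspectiveCamera()
    private let baseFOVDegrees: Double = 60     // vertical FOV tuned for portrait
    private var baseHeight: CGFloat = 0         // portrait view height, captured once

    // Call from viewWillTransition(to:with:) or layoutSubviews.
    func viewDidResize(to size: CGSize) {
        if baseHeight == 0 { baseHeight = max(size.width, size.height) }
        // On-screen size scales with viewHeight / tan(fov/2), so keeping
        // tan(fov/2) proportional to the view height keeps objects the
        // same apparent size in both orientations.
        let scale = Double(size.height / baseHeight)
        let baseHalfAngle = baseFOVDegrees * .pi / 360
        let newHalfAngle = atan(tan(baseHalfAngle) * scale)
        camera.camera.fieldOfViewInDegrees = Float(newHalfAngle * 360 / .pi)
    }
}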
1 reply · 0 boosts · 650 views · Aug ’23
Animating faces
I’m embarking on a new project that will involve animating 3D faces and mouths. I’m looking at using ARFaceAnchors and blendShapes to capture data that will be used to animate the models’ facial expressions. I have a few basic questions:

(1) As far as I can tell, Apple has not supported exporting Memojis to rigged 3D models. Is this still the case?
(2) I found one web site that said Apple’s AvatarKit is now public, but everywhere else I’ve checked, it is still a private framework (and Xcode complains). Is AvatarKit still private?
(3) It looks like all 52 blendShapes for an ARFaceAnchor are updated every frame, 60 times a second; that is 3,120 data points per second. Are there any best-practice guides for reducing the data? For example, “These 10 blendShapes capture the most important features for animating a face.” (A sketch of one thinning approach is below.)
(4) It appears that visionOS does not support ARFaceAnchor. If I want to present a remote user as a Memoji (or other rigged model) in a shared experience, is there any way to do that at the current time?
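Regarding (3), in the absence of an official list, this is the kind of thinning I had in mind; the chosen subset and threshold are my own guesses, not an Apple recommendation:

import ARKit

// A sketch of thinning the 52-shape stream: send a hand-picked subset,
// and only when a value has moved past a threshold since the last send.
final class BlendShapeThinner {
    private let keyShapes: [ARFaceAnchor.BlendShapeLocation] = [
        .jawOpen, .mouthSmileLeft, .mouthSmileRight,
        .browInnerUp, .eyeBlinkLeft, .eyeBlinkRight
    ]
    private var lastSent: [ARFaceAnchor.BlendShapeLocation: Float] = [:]
    private let threshold: Float = 0.02

    // Returns only the shapes worth transmitting for this frame.
    func changedShapes(in anchor: ARFaceAnchor) -> [ARFaceAnchor.BlendShapeLocation: Float] {
        var changed: [ARFaceAnchor.BlendShapeLocation: Float] = [:]
        for shape in keyShapes {
            let value = anchor.blendShapes[shape]?.floatValue ?? 0
            if abs(value - (lastSent[shape] ?? -1)) > threshold {
                changed[shape] = value
                lastSent[shape] = value
            }
        }
        return changed
    }
}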
0 replies · 0 boosts · 544 views · Jul ’23
iOS 16.2: Cannot load underlying module for 'ARKit'
I have a strange warning in Xcode associated with ARKit. It isn't a big issue because there is a workaround, but I am curious why this is happening and how I can avoid it in the future. I opened an old AR project in Xcode, and the editor flagged the "import ARKit" line with a strange error message:

Cannot load underlying module for 'ARKit'

Despite the error message, the code continues to build and run. I've quit and restarted Xcode, rebooted the Mac, and even deleted and redownloaded Xcode, but the error/warning is still there. Upon some additional testing, I discovered that I only get this message when targeting iOS 16.2, but not when targeting 16.0, 16.1, 16.3, or 16.4 (I did not try pre-iOS 16). Any idea why my Xcode no longer likes ARKit on iOS 16.2 on my Mac?

Development platform:
Xcode: Version 14.3.1 (14E300c)
macOS: 13.4 (22F66)
Mac: Mac Studio 2022
iOS on iPhone 14 Pro Max: 16.5 (20F66)

(screenshot attached)
2 replies · 1 boost · 1.2k views · Jun ’23
Logging not redacting strings in Xcode
This is probably a minor point because it wouldn't affect distributed binaries, but I thought I'd mention it in case the behavior is unexpected. After watching the WWDC 2020 session "Explore logging in Swift", I tried some simple examples in a Mac command-line app. I was surprised to see that the strings were all printed just fine; there was no redaction, at least when running the program from Xcode. (Even the old os_log() approach showed the strings without needing to add %{public}@.) However, if I run the program from a Terminal shell, the string arguments are properly redacted. I actually like this behavior (showing more while running in Xcode), but I thought I'd raise the issue. Sample code and a screenshot from Console are shown below.

import Foundation
import os

let logger = Logger(subsystem: "com.example.logging_test", category: "hello")
let greeting = "Hello"
let personName = "World"

logger.log("\(greeting), \(personName)")
logger.log("\(greeting, privacy: .private), \(personName)")
logger.log("\(greeting, privacy: .public), \(personName)")
os_log("%@, %@", greeting, personName)
1 reply · 0 boosts · 1.1k views · Mar ’23
Can exposureCompensation affect SLAM?
When setting an ARView's camera-feed exposure to a negative value to make the camera feed dimmer, for example

arView.environment.background = .cameraFeed(exposureCompensation: -3)

can this degrade ARKit's simultaneous localization and mapping (SLAM) of the device? That is, is the device's use of the camera for SLAM purposes independent of the exposureCompensation value?
0 replies · 1 boost · 642 views · Feb ’23
AnchorEntity a child of an Entity?
Reviewing Apple's AnchorEntity documentation, I see that an AnchorEntity can be a child of an Entity in the RealityKit hierarchy. Has it always been this way? As I remember it, an AnchorEntity was always just the base element of a Scene. If Apple changed this at some point, has Apple given examples where making an AnchorEntity a child of an Entity lets you do cool things you couldn't do before?
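For what it's worth, here is the kind of structure the documentation seems to allow; whether it enables anything genuinely new is exactly my question. This is just a sketch of a nested anchor I put together, not a pattern from an Apple sample:

import RealityKit

// An AnchorEntity nested under a plain Entity. The parent Entity just
// groups things; the nested anchor should still snap its own subtree
// to the tracked target (here, a table-classified horizontal plane).
let worldRoot = AnchorEntity(world: .zero)       // the classic top-level anchor
let furnitureGroup = Entity()
worldRoot.addChild(furnitureGroup)

let tableAnchor = AnchorEntity(.plane(.horizontal,
                                      classification: .table,
                                      minimumBounds: [0.3, 0.3]))
furnitureGroup.addChild(tableAnchor)             // anchor as a *child* of an Entity
tableAnchor.addChild(ModelEntity(mesh: .generateBox(size: 0.1)))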
1 reply · 0 boosts · 767 views · Feb ’23
Importing RoomPlan output into Blender
I'm sharing this in case someone else wants to use Apple's RoomPlan to create a model and import it into Blender.

The problem: I could not successfully import a USDZ model from the RoomPlan app into Blender. (I went through the normal process of importing a USDZ file into Blender: change the file extension from ".usdz" to ".zip", unzip the file, then import the ".usda" file.) No surfaces appeared.

The solution: In Apple's RoomPlan sample code, in the file RoomCaptureViewController.swift, I changed the line

try finalResults?.export(to: destinationURL, exportOptions: .parametric)

to

try finalResults?.export(to: destinationURL, exportOptions: .mesh)

recompiled, and went through the USDZ-to-USDA conversion process again. This time it worked. Apparently Blender cannot import parametric USDA models.
0 replies · 0 boosts · 1.6k views · Jan ’23
Body tracking robot and Blender
Has anyone successfully done the following round trip?

imported Apple's (FBX) robot from Apple's CapturingBodyMotionIn3D demo into Blender
exported it back out (glTF or another format)
converted it back to USDZ via Reality Converter
gotten it to work in Apple's demo app again

I have run into numerous problems, and each effort to fix one leads to new ones. For example, importing Apple's FBX robot leaves the bones pointing in odd directions (see attachment). When I try to correct this on import by aligning the bones, the robot in the Apple app looks like it went through a Star Trek transporter accident, with limbs at weird angles.
0 replies · 0 boosts · 807 views · Jan ’23
Multiple BodyTrackedEntities?
Can ARKit/RealityKit track multiple bodies for animation simultaneously? Reviewing Apple's CapturingBodyMotionIn3D sample code (from the WWDC 2019 session), there is no explicit linkage between the ARBodyAnchor and the loaded BodyTrackedEntity (e.g., the AnchorEntity used for the BodyTrackedEntity is not associated with the ARBodyAnchor); there seems to be some hidden linkage between the two. Likewise, ARFrame has a property (detectedBody) for only a single ARBody2D. My interpretation is that only a single person can be tracked at a time. Is this correct?
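For reference, the single-body pattern from the sample reduces to roughly the sketch below (asset path as in Apple's sample); I can find no API for requesting a second BodyTrackedEntity bound to a different person:

import RealityKit
import Combine

// Roughly the sample's setup: one BodyTrackedEntity on a plain
// AnchorEntity; RealityKit silently binds it to *the* tracked body.
let bodyAnchor = AnchorEntity()
var loadRequest: AnyCancellable?

loadRequest = Entity.loadBodyTrackedAsync(named: "character/robot")
    .sink(receiveCompletion: { result in
        if case let .failure(error) = result {
            print("Couldn't load the body-tracked model: \(error)")
        }
    }, receiveValue: { (body: BodyTrackedEntity) in
        body.scale = [1.0, 1.0, 1.0]
        bodyAnchor.addChild(body)   // no explicit link to an ARBodyAnchor
    })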
1 reply · 0 boosts · 1.1k views · Jan ’23
State of the Union video on iOS or Apple TV
When I try to watch the WWDC22 "Platforms State of the Union" video on my Apple TV, iPhone, or iPad using the Developer app, the video only shows a single frame every few seconds and there is no audio. The video plays fine in the Developer app on my Mac. Has anyone else had this problem? Is there a workaround? (I've tried deleting the Developer app on my Apple TV and reinstalling it, but no joy.)
1 reply · 0 boosts · 763 views · Jan ’23
AnchorEntity from ARAnchor bug?
I am running into a strange bug where the exact same code compiles fine in one project but generates a compiler error in another project. In particular, I am trying to create an AnchorEntity from an ARAnchor:

func addModelTo(anchor: ARAnchor) {
    let entityAnchor = AnchorEntity(anchor: anchor)
    ...
}

The compiler error message is not even consistent. Sometimes I get a single error message:

Cannot convert value of type 'ARAnchor' to expected argument type 'AnchoringComponent.Target'

Other times I get an error with two possible issues:

No exact matches in call to initializer
Candidate '() -> AnchorEntity' requires 0 arguments, but 1 was provided (RealityFoundation.AnchorEntity)
Candidate expects value of type 'AnchoringComponent.Target' for parameter #1 (got '(anchor: ARAnchor)')

I'm trying to track down why this sometimes causes an error and sometimes it does not. Any pointers?
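For anyone trying to reproduce this, here is a self-contained version of the snippet (the sphere is hypothetical, standing in for the elided body). One thing I still need to rule out: AnchorEntity(anchor:) comes from RealityKit's ARKit integration, so a target or build destination where ARKit isn't available (or a missing ARKit import) could plausibly hide that initializer and push the compiler to the AnchoringComponent.Target overloads — but that's a guess on my part, not a confirmed cause.

import ARKit
import RealityKit

// Self-contained repro sketch.
func addModel(to anchor: ARAnchor, in arView: ARView) {
    let entityAnchor = AnchorEntity(anchor: anchor)   // the line that sometimes fails to compile
    let model = ModelEntity(mesh: .generateSphere(radius: 0.05))
    entityAnchor.addChild(model)
    arView.scene.addAnchor(entityAnchor)
}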
1 reply · 0 boosts · 1.4k views · Nov ’22
Setting lensPosition to focus at infinity
I am creating a fixed-focus camera app with the focus distance at infinity (or at least 30+ feet away). When I set lensPosition to 1.0, the images were blurry. Some tests letting autofocus do the job showed that a lensPosition of about 0.808 for my wide and telephoto lenses, and 0.84 for the ultra-wide lens, did the trick (iPhone 13 Max). Will the lensPosition needed to focus at infinity vary between devices, and between lenses on a given device? Is there a way to determine the appropriate lensPosition at run time?
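One run-time approach I'm considering (a sketch only; the KVO dance and the "settled" check are my own construction): point the camera at a distant subject, let continuous autofocus settle, read back lensPosition, then lock it there.

import AVFoundation

final class InfinityFocusLocker: NSObject {
    private var observation: NSKeyValueObservation?

    // Step 1: aim the camera at a distant subject, then call this.
    func calibrateAndLock(device: AVCaptureDevice) throws {
        try device.lockForConfiguration()
        device.focusMode = .continuousAutoFocus
        device.unlockForConfiguration()

        // Step 2: once autofocus settles, capture the device/lens-specific
        // value and lock focus there. lensPosition is KVO-observable.
        observation = device.observe(\.lensPosition, options: [.new]) { [weak self] device, _ in
            guard !device.isAdjustingFocus else { return }
            self?.observation = nil          // stop observing after the first settled value
            let settled = device.lensPosition
            try? device.lockForConfiguration()
            device.setFocusModeLocked(lensPosition: settled) { _ in }
            device.unlockForConfiguration()
        }
    }
}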
1 reply · 0 boosts · 732 views · May ’22