Please restore the visionOS simulator for Intel-based Macs.
Just a few days ago, I found the following note in the release notes for the visionOS 1 beta 4 Simulator: "This includes the simulator runtime for visionOS. The simplest way to install the visionOS simulator runtime is by using Xcode. Xcode can automatically install it when you build a project or start a new one. To manually add this simulator to Xcode, read Installing Additional Simulator Runtimes. Note: Developing for visionOS requires a Mac with Apple silicon."

I'm currently the proud owner of a 27" Intel-based iMac (2019), updated to the latest macOS version, which works wonderfully and flawlessly, and which I paid handsomely for. And now, if I am ever going to continue working on any projects related to visionOS, I will have to shell out extra money (a bit less than what I paid for my current setup) for a machine that will be only marginally better than my current one.

While I do realize there must be technical reasons why Apple Silicon-based machines perform better than Intel-based ones for visionOS development, I still can't help but notice that, until a week ago, these machines worked almost correctly: there were some bugs, but they weren't showstoppers, and I'm pretty sure they can be fixed. So I do not really understand why you want to stop Intel-based Macs from doing visionOS development. It will only force people to purchase machines they do not need in order to build the apps they want to build, leaving perfectly working machines behind as if they were useless (and, I cannot state this strongly enough, THEY WORK PERFECTLY).

Dear Apple: please do not force us to buy new Macs to do our work. Please allow us to have the visionOS Simulator on our existing machines, let us experience whatever problems the Simulator has on our machines for ourselves, and let us decide to get newer hardware at our own pace. It should be our decision, not yours. Please restore the visionOS Simulator, bugs and all, and let us decide for ourselves. Feel free to decide whether the current bugs are worth fixing at your own pace, but please do not forbid us from doing our job with our current machines.
Replies: 12 · Boosts: 14 · Views: 3.5k · Activity: Oct ’23
Distance to ground - is it available in visionOS?
Allow me to explain. I recently created a project that mimics the code described in https://developer.apple.com/documentation/compositorservices/drawing_fully_immersive_content_using_metal , and I examined the data generated by its calls. I noticed that, when running in the visionOS simulator, in the matrix returned by the following line:

simd_float4x4 head_pose = ar_pose_get_origin_from_device_transform(pose);

the translation components are (0,0,0) if you do not move the camera within the simulator. This means that, at least in the simulator, the user's head starts at the origin of the world coordinate system; there is no indication that the headset sits at a specific distance from the floor of the room the simulator displays.

That value is rather important if you want to convey, in a fully immersive scene, that you (the user) are standing at a specific spot at a specific distance from the floor, in a believable, non-VR-sickness-inducing way, by taking your own height into account.

Is this "distance to ground" value available somewhere within the vast array of APIs exposed by the Vision Pro headset? If so, where should I look? Or is this something I'm supposed to work out myself, using ARKit or some other API?
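In case it clarifies what I'm after, here is the kind of thing I assume I'd have to write myself if the answer is "do it with ARKit": a minimal sketch, assuming the visionOS ARKitSession, WorldTrackingProvider, and PlaneDetectionProvider APIs behave as documented on device (plane detection does not appear to be available in the simulator, so I could not verify it there). The idea is that the detected floor plane's height relative to the world origin yields the head's distance to the ground.

```swift
import ARKit
import QuartzCore

// Minimal sketch: estimate the head's height above the detected floor.
// Assumes real hardware; PlaneDetectionProvider is, as far as I can tell,
// not supported in the simulator.
func headHeightAboveFloor() async throws -> Float? {
    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()
    let planeDetection = PlaneDetectionProvider(alignments: [.horizontal])
    try await session.run([worldTracking, planeDetection])

    for await update in planeDetection.anchorUpdates {
        // Wait until a horizontal plane classified as the floor shows up.
        guard update.anchor.classification == .floor else { continue }
        let floorY = update.anchor.originFromAnchorTransform.columns.3.y

        // Query the device (head) pose at the current time.
        guard let device = worldTracking.queryDeviceAnchor(
            atTimestamp: CACurrentMediaTime()) else { return nil }
        let headY = device.originFromAnchorTransform.columns.3.y

        return headY - floorY   // distance from the head to the floor plane
    }
    return nil
}
```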
Replies: 0 · Boosts: 0 · Views: 382 · Activity: Aug ’23
Show / hide keyboard while in Immersive mode without Views / WindowGroups?
Most of the examples I have seen out there assume that, to show or hide the keyboard in iOS / iPadOS / whatever OS supports an on-screen keyboard, you must have some kind of View component available, which you then modify to make the keyboard appear or disappear. In fully immersive apps, however, we won't always have WindowGroups visible on screen, as the style of such UI elements would clash with the visual style of the immersive space being shown. Hence the question. Is it possible at all?
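For context, the closest thing I can imagine to a workaround is the sketch below: a RealityView attachment hosting a focusable TextField inside the ImmersiveSpace, on the assumption that granting it focus summons the system keyboard. The attachment id, the position, and the view name are placeholders of mine; whether focus can actually be granted programmatically inside an immersive space is precisely the part I'd want confirmed.

```swift
import SwiftUI
import RealityKit

// Minimal sketch: a focusable TextField hosted as a RealityView attachment,
// meant to live inside an ImmersiveSpace rather than a WindowGroup.
struct ImmersiveTextEntry: View {
    @State private var text = ""
    @FocusState private var keyboardShown: Bool

    var body: some View {
        RealityView { content, attachments in
            if let panel = attachments.entity(for: "input") {
                panel.position = [0, 1.4, -1]   // roughly eye level, 1 m away
                content.add(panel)
            }
        } attachments: {
            Attachment(id: "input") {
                TextField("Type here", text: $text)
                    .focused($keyboardShown)
                    .padding()
                    .glassBackgroundEffect()
            }
        }
        // Request focus (and, presumably, the keyboard) once the space appears.
        .onAppear { keyboardShown = true }
    }
}
```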
Replies: 0 · Boosts: 0 · Views: 375 · Activity: Aug ’23
Some buttons not recognized by GCController from Xbox One controller in visionOS project only
I was not sure whether this deserved a Feedback Assistant report as a bug, or whether it's something I'm not configuring correctly; that's why I'm posting it here first.

After connecting an Xbox One wireless controller (the basic, black version) to the visionOS simulator (beta 2), running a sample project I created to read the controller's input, I noticed that the following buttons do not seem to be recognized:

- Home button
- Options button
- Left thumbstick button
- Right thumbstick button

All other input works just fine. Am I missing something I need to configure, specific to visionOS, to make these buttons generate input in the app? Needless to say, the missing buttons are recognized in another sample app I created, this time for macOS using SwiftUI, with virtually the same code files.

The gist for the visionOS app is here: https://gist.github.com/Izhido/6baa69fb72540b0f0bffa2f8c991c0ed

To reproduce (see also the sketch after these steps):

1. Replace the contents of the template visionOS project with the files in the gist.
2. Include GameController.framework in "Build Phases" > "Link Binary with Libraries".
3. In the project's "Info" tab, add "Supports Controller User Interaction", and add "Supported game controller types" with "ProfileName" = "ExtendedGamepad".
4. Run the modified project from Xcode.
5. Ensure that the simulator's "I/O" > "Input" > "Send Game Controller to Device" option is turned on.
6. Press the "Log game controller" button in the presented window.
7. Monitor the messages in the Xcode console as you press the buttons.

You will notice that nothing is logged for the buttons mentioned above, while all other inputs are logged correctly. If needed, the macOS project that works correctly can be provided upon request.
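Independently of the gist, here is a minimal way to check whether the extended gamepad profile exposes these buttons at all. The four problematic buttons are declared as optionals in GCExtendedGamepad, so a nil value there would explain the silence; this is a condensed sketch of that check, not the gist verbatim.

```swift
import GameController

// Minimal sketch: log whether the extended gamepad profile exposes the
// problematic buttons, and log their presses when it does.
func observeController() {
    NotificationCenter.default.addObserver(
        forName: .GCControllerDidConnect, object: nil, queue: .main
    ) { note in
        guard let pad = (note.object as? GCController)?.extendedGamepad else { return }

        // These four properties are optional in GCExtendedGamepad.
        let buttons: [(String, GCControllerButtonInput?)] = [
            ("Home", pad.buttonHome),
            ("Options", pad.buttonOptions),
            ("Left thumbstick", pad.leftThumbstickButton),
            ("Right thumbstick", pad.rightThumbstickButton)
        ]
        for (name, button) in buttons {
            print("\(name) button exposed by profile: \(button != nil)")
            button?.pressedChangedHandler = { _, _, pressed in
                print("\(name) pressed: \(pressed)")
            }
        }
    }
}
```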
Replies: 1 · Boosts: 0 · Views: 484 · Activity: Jul ’23
SIGABRT on throw instruction in C++ code file of visionOS project in Xcode 15 beta
I'm getting a weird crash from the template visionOS app (the one generated when creating a new visionOS project). I modified the code to use two additional C++ files, one of which throws an exception under a specific circumstance, and another that catches and handles the exception. When I run the app with the new code, instead of the exception being caught, I get a SIGABRT signal, as if C++ exceptions were not enabled at the project level.

The following gist contains a minimal example of the weird behavior: https://gist.github.com/Izhido/100a92f45aaf8bacffe73893d6109077

Replace the contents of the template visionOS project with these files, run the project, and press the "Do sum" button. Xcode 15 beta will report a SIGABRT signal at sumofnumbers_impl.cpp, line 8.

What am I missing here? (Incidentally, the same code in a macOS project runs just fine; I can share that project upon request. Also, for some reason I cannot share screenshots or files in this forum, which is why I provided the gist.)
Replies: 7 · Boosts: 0 · Views: 1.2k · Activity: Jul ’23