Coordinate conversion was mentioned in https://developer.apple.com/wwdc24/10153 (the effect is demonstrated at 22:00), where the demonstration shows an entity jumping out of a volume into the surrounding space, but I don't understand the explanation very well. I hope you can give me a basic solution. I would be very grateful!
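Not a full answer, but the basic idea behind that demo: positions inside a volume are expressed in the volume's local coordinate space, and jumping the entity "out" means re-expressing its position in the immersive space's shared scene space. RealityView's content offers convert methods for this (check the RealityViewContent documentation for the exact signatures; I'm not quoting them from memory). The math underneath is just applying the volume's world transform, which you can see in plain Swift with a column-major 4x4 matrix like RealityKit's:

```swift
// Multiply a column-major 4x4 matrix by a point (x, y, z, 1).
// This mirrors what RealityKit does when converting an entity's local
// position into the shared scene space of an immersive space.
struct Mat4 {
    // columns[c][r], column-major like simd_float4x4
    var columns: [[Float]]

    static func translation(_ tx: Float, _ ty: Float, _ tz: Float) -> Mat4 {
        Mat4(columns: [
            [1, 0, 0, 0],
            [0, 1, 0, 0],
            [0, 0, 1, 0],
            [tx, ty, tz, 1],
        ])
    }

    func transform(_ p: (x: Float, y: Float, z: Float)) -> (x: Float, y: Float, z: Float) {
        let v: [Float] = [p.x, p.y, p.z, 1]
        var out: [Float] = [0, 0, 0, 0]
        for r in 0..<4 {
            for c in 0..<4 {
                out[r] += columns[c][r] * v[c]
            }
        }
        return (out[0], out[1], out[2])
    }
}

// A point 0.1 m above the volume's origin, where the volume itself
// sits 1 m up and 1.5 m in front of the user in scene space:
let volumeToScene = Mat4.translation(0, 1, -1.5)
let scenePoint = volumeToScene.transform((x: 0, y: 0.1, z: 0))
// scenePoint is (0.0, 1.1, -1.5)
```

With the real API you'd do the equivalent conversion once when handing the entity from the volume's RealityView over to the immersive space's RealityView, then add it to the immersive content at the converted position.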
Discuss Spatial Computing on Apple Platforms.
This effect was mentioned in https://developer.apple.com/wwdc24/10153 (demonstrated at 28:00), where you can place coordinates by looking somewhere on the ground and tapping, but I don't understand the explanation very well. I hope you can give me a basic solution. I am very grateful!
I have three basic elements on this UI page: View, Alert, and Toolbar. I attach the Toolbar and Alert to the View; when I click a button on the Toolbar, my alert window shows up. Below is a simplified version of my code:
@State private var showAlert = false

HStack {
    // ...
}
.alert(Text("Quit the game?"), isPresented: $showAlert) {
    MyAlertWindow()
} message: {
    Text("Description text about this alert")
}
.toolbar {
    ToolbarItem(placement: .bottomOrnament) {
        MyToolBarButton(showAlert: $showAlert)
    }
}
And in MyToolBarButton I just toggle the bound showAlert variable to open/close the alert window.
When running on either the simulator or the device, the behavior is quite strange: when I toggle MyToolBarButton, the alert window takes 2-3 seconds to show up, and all the elements on the alert window are grayed out, as if the whole window has lost focus. I have to drag the window's control bar below to bring the whole window back into focus.
And this is not the only issue: I also find that MyToolBarButton cannot be pressed to close the alert window (even though when I click the button on my alert window, it closes itself).
Oh, by the way, I don't know if this matters, but I open the window with my immersive view open (though in my testing it doesn't affect anything here).
Any idea of what's going on here?
Xcode 16.1 / visionOS 2 beta 6
Hi, I have 2 views and an immersive space. The 1st and 2nd views are displayed in a TabView. I open my ImmersiveSpace from a button in the 1st view of the tab. Then, when I go to the 2nd tab, I want to show an attachment in my immersive space. This attachment should be visible in the immersive space only as long as the user is on the 2nd view. This is what I have done so far:
struct Second: View {
    @StateObject var sharedImageData = SharedImageData()

    var body: some View {
        VStack {
            // other code
        }
        .onAppear {
            Task {
                sharedImageData.shouldCameraButtonShouw = true
            }
        }
        .onDisappear {
            Task {
                sharedImageData.shouldCameraButtonShouw = false
            }
        }
    }
}
This is my Immersive space
struct ImmersiveView: View {
    @EnvironmentObject var sharedImageData: SharedImageData

    var body: some View {
        RealityView { content, attachments in
            // some code
        } update: { content, attachments in
            guard let controlCenterAttachmentEntity =
                attachments.entity(for: Attachments.controlCenter) else { return }
            controlCenterentity.addChild(controlCenterAttachmentEntity)
            content.add(controlCenterentity)
        } attachments: {
            if sharedImageData.shouldCameraButtonShouw {
                Attachment(id: Attachments.controlCenter) {
                    ControlCenter()
                }
            }
        }
    }
}
And this is my Observable class
class SharedImageData: ObservableObject {
    @Published var takenImage: UIImage? = nil
    @Published var shouldCameraButtonShouw: Bool = false
}
My problem is, when I am on the Second view, my attachment never appears. The attachment appears without this if condition. But how can I achieve my goal?
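One likely cause, though I'm assuming your scene setup since it isn't shown: Second creates its own instance with @StateObject, while ImmersiveView reads a different one through @EnvironmentObject, so toggling shouldCameraButtonShouw in Second never reaches the immersive space. A sketch that shares a single instance from the app entry point (type names other than yours are illustrative):

```swift
import SwiftUI

@main
struct MyApp: App {  // illustrative entry point
    // One shared instance owned at the top level.
    @StateObject private var sharedImageData = SharedImageData()

    var body: some Scene {
        WindowGroup {
            MainTabs()  // illustrative: the TabView containing the 1st and 2nd views
                .environmentObject(sharedImageData)
        }

        ImmersiveSpace(id: "immersiveSpace") {
            ImmersiveView()
                .environmentObject(sharedImageData)
        }
    }
}

// In Second, read the shared instance instead of creating a new one:
struct Second: View {
    @EnvironmentObject var sharedImageData: SharedImageData

    var body: some View {
        VStack { /* other code */ }
            .onAppear { sharedImageData.shouldCameraButtonShouw = true }
            .onDisappear { sharedImageData.shouldCameraButtonShouw = false }
    }
}
```

Also note that your update: closure only ever adds the attachment entity; when the flag turns false again, you would want to remove it from its parent yourself, since the attachment disappearing doesn't remove children you added manually.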
Hello,
I downloaded the most recent Xcode 16.0 beta 6 along with the example project located here
Currently I am experiencing the following build failures:
RealityAssetsCompile
...
error: [xrsimulator] Component Compatibility: BlendShapeWeights not available for 'xros 1.0', please update 'platforms' array in Package.swift
error: [xrsimulator] Component Compatibility: EnvironmentLightingConfiguration not available for 'xros 1.0', please update 'platforms' array in Package.swift
error: [xrsimulator] Component Compatibility: AudioLibrary not available for 'xros 1.0', please update 'platforms' array in Package.swift
error: [xrsimulator] Exception thrown during compile: compileFailedBecause(reason: "compatibility faults")
error: Tool exited with code 1
I saw that there is a similar issue reported. As a test, I downloaded that project, and it compiled as expected.
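For what it's worth, the fix the error message suggests is usually enough: raise the visionOS deployment target in the Reality Composer Pro package's Package.swift, since components like BlendShapeWeights, EnvironmentLightingConfiguration, and AudioLibrary require visionOS 2.0. A minimal sketch (the package name and tools version are illustrative, not taken from the sample):

```swift
// swift-tools-version:6.0
import PackageDescription

let package = Package(
    name: "RealityKitContent",          // illustrative name
    platforms: [
        .visionOS(.v2)                  // was .v1; 2.0 is required for these components
    ],
    products: [
        .library(name: "RealityKitContent", targets: ["RealityKitContent"])
    ],
    targets: [
        .target(name: "RealityKitContent")
    ]
)
```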
Hello,
I'm trying to load a video in an immersive space.
Using VideoMaterial and applying it to a plane surface, I'm able to load a video that I have stored locally.
If I want to load something from an external link, like YouTube or another service, how can I do that?
Bear in mind, obviously, that I'm loading it in an immersive space.
Thanks ;)
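Regarding remote sources: VideoMaterial wraps an AVPlayer, so anything AVPlayer can stream directly (a progressive MP4 or an HLS .m3u8 URL) should work the same way as your local file. YouTube is different: it doesn't expose direct media URLs, so for YouTube you'd have to fall back to a web view in a window rather than a VideoMaterial. A sketch with a placeholder URL:

```swift
import AVFoundation
import RealityKit

// Placeholder URL: must point at an actual media resource
// (e.g. an HLS playlist or an .mp4), not a web page.
let url = URL(string: "https://example.com/stream/master.m3u8")!

let player = AVPlayer(url: url)
let videoMaterial = VideoMaterial(avPlayer: player)

// Same plane-based setup as with a local file.
let screen = ModelEntity(
    mesh: .generatePlane(width: 1.6, height: 0.9),
    materials: [videoMaterial]
)
// content.add(screen) inside your RealityView, then:
player.play()
```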
I downloaded Xcode 16 and updated my macOS to 15, but I keep getting this error when trying to build the game in the simulator or on the device:
[xrsimulator] Exception thrown: The operation couldn’t be completed. (realitytool.RKAssetsCompiler.RKAssetsCompilerError error 3.)
The following RealityView ModelEntity animated-text code works in visionOS 1.0. In visionOS 2.0, when running the same code, the model entity's move duration does not seem to take effect. Are there changes to the way it works that I am missing? Thank you in advance.
RealityView { content in
    let textEntity = generateMovingText()
    content.add(textEntity)
    _ = try? await arkitSession.run([worldTrackingProvider])
} update: { content in
    guard let entity = content.entities.first(where: { $0.name == .textEntityName }) else { return }

    if let pose = worldTrackingProvider.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) {
        entity.position = .init(
            x: pose.originFromAnchorTransform.columns.3.x,
            y: pose.originFromAnchorTransform.columns.3.y,
            z: pose.originFromAnchorTransform.columns.3.z
        )
    }

    if let modelEntity = entity as? ModelEntity {
        let rotation = Transform(rotation: simd_quatf(angle: -.pi / 6, axis: [1, 0, 0])) // Adjust angle as needed
        modelEntity.transform = Transform(matrix: rotation.matrix * modelEntity.transform.matrix)

        let animationDuration: Float = 60.0 // Adjust the duration as needed
        let moveUp = Transform(scale: .one, translation: [0, 2, 0])
        modelEntity.move(to: moveUp, relativeTo: modelEntity, duration: TimeInterval(animationDuration), timingFunction: .linear)
    }
}
The source is available at the following:
https://github.com/Sebulec/crawling-text
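One thing I'd check (an assumption based on the snippet, not a documented visionOS 2.0 change): the update: closure can run repeatedly, and every run both multiplies a new rotation into the transform and calls move(to:) again, restarting the 60-second animation from scratch. A sketch that applies the tilt and starts the move once, in the make closure:

```swift
RealityView { content in
    let textEntity = generateMovingText()
    content.add(textEntity)
    _ = try? await arkitSession.run([worldTrackingProvider])

    if let modelEntity = textEntity as? ModelEntity {
        // Apply the tilt once instead of compounding it on every update.
        modelEntity.transform.rotation = simd_quatf(angle: -.pi / 6, axis: [1, 0, 0])

        // Start the one-shot move here so later updates can't restart it.
        let moveUp = Transform(scale: .one, translation: [0, 2, 0])
        modelEntity.move(to: moveUp, relativeTo: modelEntity,
                         duration: 60.0, timingFunction: .linear)
    }
} update: { content in
    // Per-frame work only; avoid calling move(to:) here.
}
```

Also note that writing entity.position from the device anchor every frame will fight the animated translation; if you need both, consider animating a child entity while positioning the parent.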
I'm building a visionOS 2.0 app where the AVP user can change the position of the end effector of a robot model, which was generated in Reality Composer Pro (RCP) from Primitive Shapes. I'd like to use an IKComponent to achieve this functionality, following the example code here. I am able to load my entity and access its MeshResource following the IKComponent example code, but on the line
let modelSkeleton = meshResource.contents.skeletons[0]
I get an error since my MeshResource does not include a skeleton.
Is there some way to directly generate the skeleton with my entities in RCP, or is there a way to add a skeleton generated in Xcode to an existing MeshResource that corresponds to my entities generated in RCP? I have tried using MeshSkeletonCollection.insert() with a skeleton I generated in Xcode, but I cannot figure out how to assign this skeleton collection to the MeshResource of the entity.
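I haven't tried this end to end, so treat it strictly as a sketch: since RCP doesn't author skeletons for primitive shapes, the route I'd investigate is editing the resource's contents and rebuilding the mesh. The names used below (skeletonID, jointInfluences, MeshResource(from:)) are what I'd verify against the RealityKit documentation first:

```swift
// Sketch: attach a code-built skeleton to a mesh that came from RCP.
// `mySkeleton` is assumed to be the MeshResource.Skeleton you already
// construct in code, as in the IKComponent sample.
var contents = meshResource.contents

// 1. Register the skeleton with the resource's skeleton collection.
contents.skeletons.insert(mySkeleton)

// 2. Inserting alone isn't enough: each MeshResource.Part that should
//    deform must reference the skeleton (its skeletonID) and carry
//    per-vertex jointInfluences, which RCP primitives don't have.
//    Those weights have to be generated in code as well.

// 3. Rebuild the resource from the edited contents and swap it in.
let skinned = try await MeshResource(from: contents)
modelEntity.model?.mesh = skinned
```

If that proves too fiddly, the simpler workaround is to author the skinned mesh in a DCC tool and import a USD with the skeleton already present, keeping RCP for scene assembly only.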
Hi, I would like to add a top bar to the panel in visionOS by using ToolbarTitleMenu, referring to the documentation: https://developer.apple.com/documentation/swiftui/toolbartitlemenu,
but in the simulator I cannot see the top bar. What's wrong with my code?
When I try to import CompositorServices, I get an error:
dyld[596]: Symbol not found: _$sSo13cp_drawable_tV18CompositorServicesE17computeProjection37normalizedDeviceCoordinatesConvention9viewIndexSo13simd_float4x4aSo0A26_axis_direction_conventionV_SitF
Referenced from: /private/var/containers/Bundle/Application/33008953-150D-4888-9860-28F41E916655/VolumeRenderingVision.app/VolumeRenderingVision.debug.dylib
Expected in: <968F7985-72C8-30D7-866C-AD8A1B8E7EE6> /System/Library/Frameworks/CompositorServices.framework/CompositorServices
The app wrongly refers to my Mac's local directory, even though I chose Vision Pro as the run destination. My Mac has been updated to macOS 15 beta 7, and I have not had this issue before.
Hello,
I am developing a visionOS-based application that uses various data providers like Image Tracking, Plane Detection, and Scene Reconstruction, but these are not supported in the visionOS Simulator. What is the workaround for this issue?
Hi everyone,
I'm looking for a way to convert an FBX file to USDZ directly within my iOS app. I'm aware of Reality Converter and the Python USDZ converter tool, but I haven't been able to find any documentation on how to do this directly within the app (assuming the user can upload their own file). Any guidance on how to achieve this would be greatly appreciated.
I've heard about Model I/O and SceneKit, but I haven't found much information on using them for this purpose either.
Thanks!
Here's a video clearly demonstrating the problem:
https://youtu.be/-IbyaaIzh0I
This is a major issue for my game, because it's not meant to be played multiple times: my game is designed to play only once, so it really ruins the experience if it runs poorly until someone force quits or the game crashes.
Does anyone have a solution to this, or has encountered this issue of poor initial launch performance?
I made this game in Unity and I'm not sure if this is an Apple issue or a Unity issue.
The sample code project TabletopKit Sample found at the article Creating tabletop games here fails to compile with the following errors in Xcode 16 beta 6.
error: [xrsimulator] Component Compatibility: Billboard not available for 'xros 1.0', please update 'platforms' array in Package.swift
error: [xrsimulator] Exception thrown during compile: compileFailedBecause(reason: "compatibility faults")
error: Tool exited with code 1
Is it possible to get the location where a user performs a tap gesture? Like an x/y coordinate that can be used within my app.
I know there is SpatialTapGesture, but from what I can tell it is only linked to content entities, like a cube or something. Does this mean I can only get the x/y coordinate data by opening an immersive space and using the tap gesture in relation to some entity?
TL;DR: Can I get the location of a tap gesture in visionOS in a regular app, without opening an immersive space?
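As far as I know, SpatialTapGesture isn't limited to RealityKit entities: it can be attached to any SwiftUI view, and its value carries the tap location in the view's coordinate space, no immersive space required. A minimal sketch (the view and its contents are illustrative):

```swift
import SwiftUI

struct TapLocationView: View {
    @State private var lastTap: CGPoint = .zero

    var body: some View {
        Rectangle()
            .fill(.blue)
            .gesture(
                SpatialTapGesture(coordinateSpace: .local)
                    .onEnded { value in
                        // value.location is a CGPoint in the chosen
                        // coordinate space (value.location3D is also available).
                        lastTap = value.location
                    }
            )
            .overlay(Text("Last tap: \(lastTap.x), \(lastTap.y)"))
    }
}
```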
I want to learn how to use the RealityKit Debugger tool from https://developer.apple.com/wwdc24/10172. The video says to download the ClubView.swift file, so where is it?
The sample code project PencilKitCustomToolPicker found at the article Configuring the PencilKit tool picker here fails to compile with the following errors in Xcode 16 beta 6.
The sample code project RealityKit-Stereo-Rendering found at the article Rendering a windowed game in stereo here fails to compile with the following errors in Xcode 16 beta 6.
Is it possible to show a SafariWebView in an ImmersiveSpace with hand tracking enabled?
I have an app with an InitialView that launches an immersive space and opens a SafariView. I noticed that hand tracking stops working when I open the SafariView, but not when I open the TestView (which is just an empty window).
Here's the Scene:
var body: some Scene {
    WindowGroup(id: "control") {
        InitialView()
    }
    .windowResizability(.contentSize)

    WindowGroup(id: "test") {
        TestView()
    }
    .windowResizability(.contentSize)

    WindowGroup(id: "safari") {
        SafariView(url: URL(string: "some URL")!)
    }

    ImmersiveSpace(id: "immersiveSpace") {
        ImmersiveView()
    }
}