You might be better off subclassing UIPanGestureRecognizer instead and building out the movement from there - I’ve done this a few times and it does work well.
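For example, a minimal sketch of that approach. The EntityPanGesture subclass, the handlePan handler, and the drag-on-a-horizontal-plane behaviour are my own assumptions here, not code from any library:

import UIKit
import RealityKit

// A pan recognizer subclass that carries the entity it should move.
class EntityPanGesture: UIPanGestureRecognizer {
    weak var targetEntity: Entity?
}

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    func addPanGesture(to entity: Entity) {
        let pan = EntityPanGesture(target: self, action: #selector(handlePan(_:)))
        pan.targetEntity = entity
        arView.addGestureRecognizer(pan)
    }

    @objc func handlePan(_ gesture: EntityPanGesture) {
        guard let entity = gesture.targetEntity else { return }
        // Unproject the touch onto a horizontal plane at the entity's current
        // height, then move the entity to that world position.
        let point = gesture.location(in: arView)
        let plane = Transform(translation: entity.position(relativeTo: nil)).matrix
        if let worldPosition = arView.unproject(point, ontoPlane: plane) {
            entity.setPosition(worldPosition, relativeTo: nil)
        }
    }
}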
Add an ARCoachingOverlayView - https://developer.apple.com/documentation/arkit/arcoachingoverlayview to your View hierarchy and set the goal to .horizontalPlane.
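A rough sketch of wiring that up (addCoaching is just my own helper name):

import ARKit
import RealityKit

func addCoaching(to arView: ARView) {
    // Create the overlay, tie it to the view's AR session,
    // and ask it to coach the user towards finding a horizontal plane.
    let coachingOverlay = ARCoachingOverlayView()
    coachingOverlay.session = arView.session
    coachingOverlay.goal = .horizontalPlane
    coachingOverlay.frame = arView.bounds
    coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    arView.addSubview(coachingOverlay)
}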
As you’re currently using an example that uses RealityKit, there is no way to change the gravity value in there at present. The other answer from @ahzzheng is only relevant for SceneKit.
—————
This might change after WWDC tomorrow.
The only ways I know of to get more accurate positioning are tracking images placed at known locations, a visual search system like the one Google Maps uses, or WiFi positioning data.
Google has not opened their API for developers unfortunately, and the only company I know that uses WiFi for positioning is called Dent Reality. But again, not accessible to other developers.
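If you go the tracked-image route, this is a sketch of the ARKit side, assuming you already have an ARView and an asset catalogue group of reference images (the group name "AR Resources" is a placeholder); you would then anchor content relative to the resulting image anchor:

import ARKit

// Load reference images from the asset catalogue and turn on detection
// in the world-tracking configuration.
let config = ARWorldTrackingConfiguration()
if let referenceImages = ARReferenceImage.referenceImages(
    inGroupNamed: "AR Resources", bundle: nil) {
    config.detectionImages = referenceImages
    config.maximumNumberOfTrackedImages = 1
}
arView.session.run(config)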
Do you mean planning how to go from point A to B, to see a visible path to follow, or something else entirely?
If it's either of the first two, there are open source projects on GitHub that do this with SceneKit.
Not sure what I was doing wrong here originally (possibly not enabling collaborative session - https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration/3152987-iscollaborationenabled), but I have since created a Swift Package on GitHub to achieve this really easily.
https://github.com/maxxfrazer/MultipeerHelper
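For reference, the piece I was probably missing, sketched here; collaboration has to be enabled on the configuration before running the session:

import ARKit
import RealityKit

// Turn on collaboration so ARKit produces ARCollaborationData,
// which you can then forward to peers (e.g. via MultipeerConnectivity).
let config = ARWorldTrackingConfiguration()
config.isCollaborationEnabled = true
arView.session.run(config)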
In the SwiftUI example they use `.zero` for the frame; it might be best to use the same, as it will take the view to full screen either way.

For the box, to avoid any issues, just create and add one manually instead. Also, the SwiftUI Preview basically uses the simulator, and the initialiser I mentioned is not available on the simulator, so you'll need to add a targetEnvironment check:

struct ARViewContainer: UIViewRepresentable {
func makeUIView(context: Context) -> ARView {
#if targetEnvironment(simulator)
let arView = ARView(frame: .zero)
#else
let arView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: false)
#endif
let newAnchor = AnchorEntity(world: [0, 0, -1])
let newBox = ModelEntity(mesh: .generateBox(size: 0.3))
newAnchor.addChild(newBox)
arView.scene.anchors.append(newAnchor)
return arView
}
func updateUIView(_ uiView: ARView, context: Context) {}
}

The simulator automatically uses cameraMode .nonAR, as it has no other choice, and it cannot be changed to `.ar`.
Your scene may be empty, in which case it would be normal to see nothing. Add a cube at the position [0, 0, -1]; it should appear in the middle of your view, since the default camera looks towards [0, 0, -1] from [0, 0, 0].

If you're changing the camera mode from .ar to .nonAR and don't plan on actually using the session for anything, it might be best to run arView.session.pause() afterwards rather than leaving it running.

A better (IMO) way to initialise a nonAR ARView would be to use this initialiser, but if you're using storyboards then just changing the camera mode should work just as well. Also, if you want to change the background colour, set this property:

self.arView.environment.background = .color(.blue)
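Putting those pieces together, a quick sketch, assuming arView is already set up in your view hierarchy:

// Switch to non-AR rendering, stop the AR session, and set a flat background.
arView.cameraMode = .nonAR
arView.session.pause()
arView.environment.background = .color(.blue)

// Add something visible 1 m in front of the default camera at the origin.
let cubeAnchor = AnchorEntity(world: [0, 0, -1])
cubeAnchor.addChild(ModelEntity(mesh: .generateBox(size: 0.3)))
arView.scene.anchors.append(cubeAnchor)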
As mentioned, if your target is the simulator or not iOS, you cannot specify the cameraMode; it will go straight to nonAR. The only initialiser will just take the frame, I believe. Only if you are building for an iOS device (not the simulator) can you use the initialiser with cameraMode.

#if os(iOS) && !targetEnvironment(simulator)
arView.cameraMode = .nonAR
#endif

No need to recreate the ARView; just changing the camera mode should be fine. If you are still stuck, it'd be useful to know what your build target is.
Make sure you’re building for a target iOS device, and not the simulator. Only on iOS can you specify the cameraMode, Simulator and macOS can only work with nonAR.
macOS doesn't have ARKit; it's only available on iOS and iPadOS. ARKit is required for plane and face tracking.
Bobjt has answered the bulk of your question, but for getting the rendered scene as a texture this method might be what you want: https://developer.apple.com/documentation/realitykit/arview/3255319-snapshot
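A quick sketch of using it; what you do with the resulting UIImage afterwards (texture, file, photo library) is up to you:

// Capture the currently rendered frame; the completion hands back a UIImage?.
arView.snapshot(saveToHDR: false) { image in
    guard let image = image else { return }
    // e.g. hand the image off to be used as a texture, or save it.
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}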
I don't see any reason why you wouldn't be able to do that. You can't interface directly with RealityKit from C++, but there's no reason you can't run some calculations in C++ and read the results back into your Swift code.
It might be helpful if you post your code, so someone may be able to spot where the 0.1 cm error is coming from. I'm expecting that your second animation is starting just before the first has ended.

Try subscribing to the AnimationEvents.PlaybackCompleted event and only firing the second animation from there, if you're not already. Also, if you're calculating the target by adding or subtracting 2 from `entity.position`, you could instead save the starting position and move the entity back to that, so any small error doesn't accumulate.

PS: I'm assuming you're using move(to: Transform), rather than anything else, as move(by:) doesn't exist in RealityKit.
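Something along these lines; the ThereAndBack class, property names, and the 2 m / 2 s values are just mine for illustration:

import Combine
import RealityKit

// Run the second move only after RealityKit reports the first one finished,
// and return to a saved start transform instead of offsetting by 2 again.
final class ThereAndBack {
    private var completion: Cancellable?

    func run(_ entity: Entity, in arView: ARView) {
        let start = entity.transform          // remember where we began
        var destination = start
        destination.translation.z -= 2        // 2 m further from the camera

        entity.move(to: destination, relativeTo: entity.parent, duration: 2)
        completion = arView.scene.subscribe(
            to: AnimationEvents.PlaybackCompleted.self, on: entity
        ) { [weak self] _ in
            // First animation finished: go back to the saved transform.
            entity.move(to: start, relativeTo: entity.parent, duration: 2)
            self?.completion = nil            // stop listening once the return trip starts
        }
    }
}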
I saw your question on Twitter, but I'll respond here with more detail.

Once you've loaded your ModelEntity from a USDZ, find the model.materials list and replace the material you want with a SimpleMaterial or UnlitMaterial.

Which material you need to replace depends on how your USDZ is structured; it could look like this:

https://imgur.com/a/bWyjVlG

In which case you can see that if you want to replace the sail it'll be model.materials[0], and model.materials[4] for the handrails.

If you cannot view all the materials like this, for example if you have multiple geometries in your USDZ, then you can try to deduce the index from the order of the geometries in your usdz file, or loop through some colours, replacing each material in the material array like this:

let myColors: [UIColor] = [.red, .orange, .blue, .yellow, ...]
model.materials = myColors.map { SimpleMaterial(color: $0, isMetallic: false) }

Also make sure that your array of colours is the same size as your original model.materials array; I'm not sure what a ModelComponent does if you give it too many materials (it likely just ignores the extras). Then just see which colour each mesh shows up in to deduce the index.
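As a quick sketch of the targeted replacement; the file name "boat.usdz" and index 0 are only placeholders based on my example above:

import RealityKit
import UIKit

// Load the ModelEntity synchronously, then swap the sail's material
// (index 0 in the screenshot above) for a plain red SimpleMaterial.
if let boat = try? ModelEntity.loadModel(named: "boat.usdz") {
    boat.model?.materials[0] = SimpleMaterial(color: .red, isMetallic: false)
}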