I just posted an article on exactly this, with a library to take away the boilerplate code and a working example:

https://medium.com/@maxxfrazer/realitykit-synchronization-289ba9409a6e

And the library to do most of the work for you, complete with an example project:

https://github.com/maxxfrazer/MultipeerHelper

Hopefully it helps you all!
Not sure exactly what you're after, but look up "RealityKit Entity Gestures" for information on how to add panning gestures to an entity using installGestures().
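As a minimal sketch, assuming arView is your ARView and entity is a ModelEntity already anchored in the scene:

entity.generateCollisionShapes(recursive: true) // gestures require collision shapes
arView.installGestures(.translation, for: entity) // adds a pan/drag gesture

installGestures(_:for:) also accepts .rotation, .scale, or .all if you want more than panning.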
I haven't worked on image detection with ARKit in a little while, but as I remember, ARImageTrackingConfiguration is used for actively tracking an image, for example a postcard or something else that might move. ARWorldTrackingConfiguration should be reserved for images that will be static in the scene, where you can rely on them to only exist at one location (for example, a poster).

The benefit of using ARWorldTrackingConfiguration over ARImageTrackingConfiguration is that ARKit will search for the image less often by assuming that it isn't moving, only the camera is. This means you can get away with more complex scenes, whereas ARImageTrackingConfiguration will want to use the CPU as many frames as it can.
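A minimal sketch of the static-image setup, assuming your reference images live in an asset catalog resource group named "AR Resources" (swap in your own group name):

let configuration = ARWorldTrackingConfiguration()
configuration.detectionImages = ARReferenceImage.referenceImages(
  inGroupNamed: "AR Resources", bundle: nil)
arView.session.run(configuration)

For the moving-image case, ARImageTrackingConfiguration has the equivalent trackingImages property.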
Assuming you've set up a UITapGestureRecognizer for your ARView, and have a function that starts like this:

@objc
func handleTap(_ sender: UITapGestureRecognizer? = nil) {

Get the CGPoint of that touch:

guard let touchInView = sender?.location(in: self.arView) else {
  return
}

Perform a raycast at that CGPoint:

if let result = arView.raycast(
  from: touchInView,
  allowing: .existingPlaneGeometry,
  alignment: .horizontal
).first {
  print(result.worldTransform)
  // worldTransform is of type simd_float4x4
}

From there, use this 4x4 matrix to position your entity at the touch location in the scene using move(to:relativeTo:).
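For example, as a minimal sketch of that last step, assuming entity is already anchored in the scene:

let transform = Transform(matrix: result.worldTransform)
entity.move(to: transform, relativeTo: nil, duration: 0.3)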
I think the easiest way would be to import your model and animation into Blender (or similar), export it as a glb file, and then convert that to USDZ. The glb file bundles the model and animations together.
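Once the USDZ is in your app bundle, playing the bundled animation back in RealityKit can look like this minimal sketch ("toy" is just a stand-in for your own file name):

let entity = try ModelEntity.loadModel(named: "toy")
if let animation = entity.availableAnimations.first {
  entity.playAnimation(animation.repeat())
}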
Hi, using the method HasTransform.look() has worked great for me; you just have to make sure the direction RealityKit thinks is forward on your model is the same direction you expect. https://developer.apple.com/documentation/realitykit/hastransform/3244204-look You just set 'at' to be the camera transform's position, and 'from' to be the entity's position.
Yes exactly 🚀
It should be straightforward to fix those issues, try something like this:

entity.look(
  at: camera.transform.translation,
  from: entity.position,
  upVector: [0, 1, 0],
  relativeTo: nil
)

This does depend on your entity not being a child of something else in the scene with a non-identity transform. Otherwise you have to find the camera transform relative to the entity's parent; there are several conversion methods in the HasTransform documentation. As a side note, I noticed that this look() function also scales your object, so watch out for that if you have an object with a non [1,1,1] scale.
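For the transformed-parent case, a minimal sketch of the conversion, assuming cameraTransform is the simd_float4x4 from arView.session.currentFrame?.camera.transform:

let cameraPosition = Transform(matrix: cameraTransform).translation
// Convert the world-space camera position into the parent's space first.
let target = entity.parent?.convert(position: cameraPosition, from: nil) ?? cameraPosition
entity.look(at: target, from: entity.position, upVector: [0, 1, 0], relativeTo: entity.parent)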
I typically set the frame to zero when making UIKit apps too (non SwiftUI), and later set the frame like this:

self.arView.frame = self.view.bounds
self.arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]

I'm not extremely knowledgeable about UIKit in general, but this works for me to make the view scale to fit the screen accordingly no matter the orientation.
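Put together, a minimal sketch of that setup in viewDidLoad:

let arView = ARView(frame: .zero)
self.view.addSubview(arView)
arView.frame = self.view.bounds
arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]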
Hi, yes, using scene.anchors.append(myAnchor) has the same effect, and has the added bonus of being able to add multiple independent anchors at the same time by using scene.anchors.append(contentsOf: [Scene.AnchorCollection.Element]). I don't know of any reason either one should or should not be used. Only that, to me, scene.addAnchor() seems like a nicer looking way to achieve the same thing, and thus may be what the RealityKit team is expecting us to use.
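As a quick sketch, both of these achieve the same result:

let anchorA = AnchorEntity(plane: .horizontal)
let anchorB = AnchorEntity(world: .zero)
arView.scene.addAnchor(anchorA)
arView.scene.anchors.append(contentsOf: [anchorB])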
I saw your question on twitter, but I'll respond here with more details…

Once you've loaded your ModelEntity from a USDZ, find the model.materials list and replace the material you want with a SimpleMaterial or UnlitMaterial. To find which material you need to replace, depending on how your USDZ is structured, it could look like this: https://imgur.com/a/bWyjVlG

In which case you can see that if you want to replace the sail it'll be model.materials[0], or model.materials[4] for the handrails. If you cannot view all the materials like this, for example if you have multiple geometries in your USDZ, then you can try to deduce the index based on the order of the geometries in your USDZ file, or loop through some colours, replacing each material in the material array like this:

let myColors: [UIColor] = [.red, .orange, .blue, .yellow, ...]
model.materials = myColors.map { SimpleMaterial(color: $0, isMetallic: false) }

Also make sure that your array of colors is the same size as your original model.materials array; I'm not sure what a ModelComponent does if you give it too many materials (it likely just ignores them). Then just see what colour the mesh shows up in to deduce the index.
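Once you know the index, replacing a single material is a one-liner; a minimal sketch, assuming index 0 turned out to be the sail (boatEntity is a stand-in name for your loaded ModelEntity):

boatEntity.model?.materials[0] = SimpleMaterial(color: .white, isMetallic: false)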
It might be helpful if you post your code, so someone may be able to spot where the 0.1cm error is coming from. I'm expecting that your second animation is starting just before the first has ended.

Try using the AnimationEvents.PlaybackCompleted event before firing the second animation, if you're not already. Otherwise, if you're calculating the target based on `entity.position` and then adding or subtracting 2, you could instead save the starting position so the entity moves back to exactly there.

PS I'm assuming you're using move(to: Transform), rather than anything else, as move(by:) doesn't exist in RealityKit.
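A minimal sketch of waiting for the first move to finish, assuming startTransform was saved before the first move (the returned subscription must be kept alive, e.g. stored in a property):

import Combine
import RealityKit

var animationCallbacks = Set<AnyCancellable>()

arView.scene.subscribe(
  to: AnimationEvents.PlaybackCompleted.self, on: entity
) { _ in
  // First move finished; now move back to the saved starting transform.
  entity.move(to: startTransform, relativeTo: nil, duration: 1)
}.store(in: &animationCallbacks)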
I don't see any reason why you wouldn't be able to do that. You can't interface directly with RealityKit from C++, but there's no reason you can't run some calculations in C++ and read the results back into your Swift code.
Bobjt has answered the bulk of your question; but for getting the rendered scene as a texture, this method might be what you want:

https://developer.apple.com/documentation/realitykit/arview/3255319-snapshot
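A minimal sketch of using it, assuming arView is your ARView:

arView.snapshot(saveToHDR: false) { image in
  guard let image = image else { return }
  // image is a UIImage of the rendered scene; use it as you like,
  // e.g. convert it to a texture.
}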
macOS doesn't have ARKit; it's only available on iOS and iPadOS. ARKit is required for plane and face tracking.