
Reply to Inquiries about the difference between isTracked in ARWorldConfiguration and ARImageConfiguration(ARKit)
I haven't worked on image detection with ARKit in a little while, but as I remember, ARImageTrackingConfiguration is used for actively tracking an image that may move, for example a postcard. ARWorldTrackingConfiguration should be reserved for images that will be static in the scene, so you can rely on them existing at only one location (a poster, for example). The benefit of using ARWorldTrackingConfiguration over ARImageTrackingConfiguration is that ARKit will search for the image less often, by assuming that the image isn't moving, only the camera is. This means you can get away with more complex scenes, whereas ARImageTrackingConfiguration will want to use the CPU on as many frames as it can.
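A rough sketch of the two setups (this assumes a reference-image group named "AR Resources" in your asset catalog; adjust the name to match your project):

```swift
import ARKit

// Moving images (e.g. a postcard): actively track them every frame
let imageConfig = ARImageTrackingConfiguration()
if let refImages = ARReferenceImage.referenceImages(
    inGroupNamed: "AR Resources", bundle: nil
) {
    imageConfig.trackingImages = refImages
    imageConfig.maximumNumberOfTrackedImages = 1
}

// Static images (e.g. a poster): detect once, then assume only the camera moves
let worldConfig = ARWorldTrackingConfiguration()
worldConfig.detectionImages = ARReferenceImage.referenceImages(
    inGroupNamed: "AR Resources", bundle: nil
)
```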
Dec ’19
Reply to RealityKit and Coordinates
Assuming you've set up a UITapGestureRecognizer for your ARView, and have a function that starts like this:

```swift
@objc func handleTap(_ sender: UITapGestureRecognizer? = nil) {
```

Get the CGPoint of that touch:

```swift
guard let touchInView = sender?.location(in: self.arView) else { return }
```

Perform a raycast at that CGPoint:

```swift
if let result = arView.raycast(
    from: touchInView,
    allowing: .existingPlaneGeometry,
    alignment: .horizontal
).first {
    print(result.worldTransform) // worldTransform is of type simd_float4x4
}
```

From there, use this 4x4 matrix to position your entity at the touch location in the scene using move(to:).
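A minimal sketch of that last step (myEntity is a placeholder for whatever entity you want to place):

```swift
// Position the entity at the raycast hit, animating over 0.3 seconds
let transform = Transform(matrix: result.worldTransform)
myEntity.move(to: transform, relativeTo: nil, duration: 0.3)
```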
Jan ’20
Reply to Rotating an Entity towards the camera transform
Hi, using the method HasTransform.look() has worked great for me; you just have to make sure the direction RealityKit thinks is forward on your model is the same direction you do. https://developer.apple.com/documentation/realitykit/hastransform/3244204-look You just set 'at' to the camera transform's position, and 'from' to the entity's position.
Mar ’20
Reply to Rotating an Entity towards the camera transform
It should be straightforward to fix those issues; try something like this:

```swift
entity.look(
    at: cameraTransform.translation,
    from: entity.position,
    upVector: [0, 1, 0],
    relativeTo: nil
)
```

This does depend on your entity not being a child of something else in the scene with a non-identity transform. Otherwise you have to find the camera transform relative to the entity's parent; there are several conversion methods in the HasTransform docs. As a side note, I noticed that this look() function also scales your object, so watch out for that if you have an object with a non-[1, 1, 1] scale.
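One way around the scaling side effect is to save and restore the scale around the call. A sketch, assuming arView is your ARView (its cameraTransform property gives the camera's transform in world space):

```swift
// look(at:) can overwrite the entity's scale, so preserve it manually
let savedScale = entity.scale
entity.look(
    at: arView.cameraTransform.translation,
    from: entity.position,
    upVector: [0, 1, 0],
    relativeTo: nil
)
entity.scale = savedScale
```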
Mar ’20
Reply to Why is arView = ARView(frame: .zero) set to zero?
I typically set the frame to zero when making UIKit apps too (non-SwiftUI), and later set the frame like this:

```swift
self.arView.frame = self.view.bounds
self.arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
```

I'm not extremely knowledgeable about UIKit in general, but this works for me to make the view scale to fit the screen no matter the orientation.
Mar ’20
Reply to scene.addAnchor( ) vs scene.anchors.append( )
Hi, yes, using scene.anchors.append(myAnchor) has the same effect, with the added bonus of being able to add multiple independent anchors at the same time via scene.anchors.append(contentsOf:). I don't know of any reason one should or should not be used. To me, scene.addAnchor() just seems like a nicer-looking way to achieve the same thing, so it may be what the RealityKit team is expecting us to use.
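For example, all of these do the same kind of thing (the anchor placements here are arbitrary placeholders):

```swift
let anchor = AnchorEntity(plane: .horizontal)

// Equivalent ways to add a single anchor:
arView.scene.addAnchor(anchor)
// arView.scene.anchors.append(anchor)

// Adding multiple independent anchors at once:
let more = [AnchorEntity(world: [0, 0, -1]), AnchorEntity(world: [1, 0, -1])]
arView.scene.anchors.append(contentsOf: more)
```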
Apr ’20
Reply to Change Material base color
I saw your question on Twitter, but I'll respond here with more detail. Once you've loaded your ModelEntity from a USDZ, find the model.materials list and replace the material you want with a SimpleMaterial or UnlitMaterial.

To find which material you need to replace: depending on how your USDZ is structured, it could look like this: https://imgur.com/a/bWyjVlG In which case you can see that if you want to replace the sail it'll be model.materials[0], or model.materials[4] for the handrails.

If you cannot view all the materials like this, for example if you have multiple geometries in your USDZ, then you can try to deduce the index based on the order of the geometries in your USDZ file, or loop through some colours, replacing each material in the material array like this:

```swift
let myColors: [UIColor] = [.red, .orange, .blue, .yellow, ...]
model.materials = myColors.map {
    SimpleMaterial(color: $0, isMetallic: false)
}
```

Also make sure that your array of colors is the same size as your original model.materials array; I'm not sure what a ModelComponent does if you give it too many materials (it likely just ignores them). Then just see what color each mesh shows up in to deduce the index.
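Once you know the index, replacing a single material could look like this (boatEntity and index 0 are placeholders standing in for your own entity and material index):

```swift
// ModelComponent is a value type, so modify a copy and write it back
if var model = boatEntity.model {
    model.materials[0] = SimpleMaterial(color: .green, isMetallic: false)
    boatEntity.model = model
}
```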
Apr ’20
Reply to Move, Rotate, Scale By
It might be helpful if you post your code, so someone may be able to spot where the 0.1cm error is coming from. I'm expecting that your second animation is starting just before the first has ended. Try using the AnimationEvents.PlaybackCompleted event before firing the second animation, if you're not already. Otherwise, if you're calculating based on `entity.position` and then adding or subtracting 2, you could instead save the starting position so the entity moves back to exactly there. PS: I'm assuming you're using move(to: Transform), rather than anything else, as move(by:) doesn't exist in RealityKit.
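A sketch of chaining the second move off PlaybackCompleted (entity is a placeholder; note the subscription returned by Scene.subscribe must be retained until it fires):

```swift
import Combine
import RealityKit

// Save where the entity started (relative to its parent)
let startTransform = entity.transform

var forward = startTransform
forward.translation += [0, 0, -2] // move 2 m along -Z

var subscription: Cancellable?
entity.move(to: forward, relativeTo: entity.parent, duration: 1)

// Wait for the first animation to finish before moving back
subscription = entity.scene?.subscribe(
    to: AnimationEvents.PlaybackCompleted.self, on: entity
) { _ in
    entity.move(to: startTransform, relativeTo: entity.parent, duration: 1)
    subscription?.cancel()
}
```

Saving startTransform up front avoids the drift you get from repeatedly adding and subtracting offsets to a possibly mid-animation position.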
Apr ’20