Reply to ARAnchor subclass causing issues when worldmap saved/restored
Hi - As others have mentioned, you need to properly define how your custom properties will be encoded and decoded. Not doing so will crash the app when it tries to archive the data, as you've described. Here is a quick example of a custom ARAnchor that can successfully be archived and unarchived as part of your saved world data:

import ARKit

class CustomAnchor: ARAnchor {

    // Your added property
    var customProperty: String

    // Init + your custom property
    init(name: String, transform: simd_float4x4, customProperty: String) {
        self.customProperty = customProperty
        super.init(name: name, transform: transform)
    }

    override class var supportsSecureCoding: Bool {
        return true
    }

    required init?(coder aDecoder: NSCoder) {
        // Try to decode and initialize your custom value based on the key you set in your encoder.
        // decodeObject(of:forKey:) is the secure-coding-safe variant of decodeObject(forKey:).
        if let customProperty = aDecoder.decodeObject(of: NSString.self, forKey: "customProperty") as String? {
            self.customProperty = customProperty
        } else {
            return nil
        }

        super.init(coder: aDecoder)
    }

    // As others have mentioned - this is required to maintain your custom properties as ARKit refreshes anchors
    required init(anchor: ARAnchor) {
        let other = anchor as! CustomAnchor
        self.customProperty = other.customProperty
        super.init(anchor: anchor)
    }

    // Encode your custom property using a key to be decoded
    override func encode(with aCoder: NSCoder) {
        super.encode(with: aCoder)
        aCoder.encode(customProperty, forKey: "customProperty")
    }
}

This code is a slimmed-down version of one of Apple's examples, which creates a custom anchor to store "snapshot" data in their world-data persistence demo app: https://developer.apple.com/documentation/arkit/data_management/saving_and_loading_world_data
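For reference, saving the map itself uses the standard ARKit and Foundation archiving calls. A minimal sketch, assuming arView.session is your running ARSession and mapSaveURL is a file URL you manage:

// Sketch: persisting a world map that contains CustomAnchor instances
arView.session.getCurrentWorldMap { worldMap, error in
    guard let worldMap = worldMap else { return }
    do {
        // requiringSecureCoding: true is what makes supportsSecureCoding above matter
        let data = try NSKeyedArchiver.archivedData(withRootObject: worldMap, requiringSecureCoding: true)
        try data.write(to: mapSaveURL, options: .atomic)
    } catch {
        print("Failed to archive world map: \(error)")
    }
}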
Jul ’21
Reply to RealityKit saving changed scale/rotation gesture values after installGestures
Hi fullstackdev, To expand on the above answers, an EntityGestureRecognizer can be utilized and handled like any other UIGestureRecognizer. When you install gestures on your ARView for your entity, you can add an @objc function as each recognizer's target and access the gesture's states. Please consider the following code:

arView.installGestures(.all, for: clonedEntity).forEach { entityGesture in
    entityGesture.addTarget(self, action: #selector(handleEntityGesture(_:)))
}

Then our target function would look something like this:

@objc func handleEntityGesture(_ sender: UIGestureRecognizer) {
    // Scale
    if let scaleGesture = sender as? EntityScaleGestureRecognizer {
        switch scaleGesture.state {
        case .began:
            // Handle things that happen when the gesture begins
            break
        case .changed:
            // Handle things during the gesture
            break
        case .ended:
            // Handle things when the gesture has ended
            break
        default:
            return
        }
    }
}

In the .ended case, you can grab the entity's scale like so, and do whatever you want with it. If you need to store it somewhere, consider passing it into some other function:

case .ended:
    let scale = scaleGesture.entity?.transform.scale
    // Do what you want with the scale...

You can also access the other gesture recognizers by casting the sender as either EntityTranslationGestureRecognizer or EntityRotationGestureRecognizer and reading their states the same way; see the sketch below. Hope this helps.
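For example, here's a minimal sketch of handling the rotation recognizer inside the same target function; the saveRotation(_:) helper is hypothetical and stands in for however you want to persist the value:

// Rotation: same pattern as the scale recognizer above
if let rotationGesture = sender as? EntityRotationGestureRecognizer {
    switch rotationGesture.state {
    case .ended:
        // transform.rotation is a simd_quatf you can store or reapply later
        if let rotation = rotationGesture.entity?.transform.rotation {
            saveRotation(rotation) // hypothetical helper
        }
    default:
        break
    }
}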
Jul ’21
Reply to UIFeedbackGenerator is not working when ARKit Configuration "providesAudioData" is true
Hey kyh951019, Whenever there is an active AVCaptureSession configured to capture audio, iOS automatically disables haptics (as well as other system sounds) by default to prevent them from being recorded. ARSession runs its own internal AVCaptureSession to capture the camera feed, as well as the microphone when providesAudioData = true. Unfortunately, that internal session is not exposed through any of the APIs, which prevents us from overriding this behavior. If it were exposed, you'd be able to do something like:

AVAudioSession.sharedInstance().setAllowHapticsAndSystemSoundsDuringRecording(true)
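For completeness, that method throws, so in a context where you do control the audio capture (for example, your own AVCaptureSession instead of ARKit's), the call would look something like this sketch:

import AVFoundation

// Sketch: opt back in to haptics and system sounds while recording.
// This only affects audio capture you configure yourself, not ARKit's internal session.
do {
    try AVAudioSession.sharedInstance().setAllowHapticsAndSystemSoundsDuringRecording(true)
} catch {
    print("Unable to allow haptics during recording: \(error)")
}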
May ’21
Reply to RealityKit: Scale generalText based on distance from Anchor
Hey LT13 - I had to do something similar recently, and I solved it by checking the distance from the camera to my entity in the session delegate's didUpdate function and setting the scale based on that distance. Please consider the following example:

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    for entity in self.arView.scene.anchors {
        if case let AnchoringComponent.Target.world(transform) = entity.anchoring.target {
            let distance = distance(transform.columns.3, frame.camera.transform.columns.3)
            entity.scale = .one * (4 * sqrtf(distance))
        }
    }
}

You can change the entity.scale formula to work however you'd like. For me, it sets an initial scale for the entity and scales it relative to the distance between the camera and the entity.
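One caveat with a pure distance formula is that the entity can grow without bound as you walk away. A small variation (just a sketch; the 0.5 and 8.0 bounds are arbitrary values picked for illustration) is to clamp the computed scale:

// Sketch: clamp the distance-based scale to an arbitrary range
let rawScale = 4 * sqrtf(distance)
let clampedScale = min(max(rawScale, 0.5), 8.0)
entity.scale = .one * clampedScale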
May ’21