Posts

Post not yet marked as solved
4 Replies
1.5k Views
I have built a simple Reality Composer project. It has one object; on scene start it plays some music, and when you tap the object it plays a sound. It works exactly as expected in Reality Composer's preview area, and it works as expected in Xcode. However, when I export it as a USDZ so that I can share it easily, I can only see the object and place and scale it in AR. It doesn't play the audio or respond to taps as expected. Should this work? I did notice on my iPad that there is a "tap to activate" button at the top of the screen, but it either doesn't do anything or it just brings up the area to view the model as an object, view it in AR, or share it. Thanks for any help or clarification.
Dale
Post not yet marked as solved
0 Replies
739 Views
How can I adjust the bounding box for Object Capture reconstruction so that it reconstructs just the object and not all of the surrounding data? I think this is done by providing a bounding box to PhotogrammetrySession.Request.modelFile(url:detail:geometry:), but I can't seem to figure out the proper way to do it.
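A rough sketch of one approach, assuming the two-step workflow of requesting the estimated bounds first and then tightening them; the Geometry initializer shown here is an assumption, so check the PhotogrammetrySession documentation for the exact signature:

    import RealityKit

    // Create a session over the folder of captured images.
    let session = try PhotogrammetrySession(input: imagesFolderURL)

    // Step 1: ask for the estimated bounding box of the scene.
    try session.process(requests: [.bounds])
    // ... read the resulting BoundingBox from session.outputs as `estimatedBounds` ...

    // Step 2: shrink the box so only the object itself is reconstructed
    // (scaled toward the origin here purely for illustration).
    let tightBounds = BoundingBox(min: estimatedBounds.min * 0.6,
                                  max: estimatedBounds.max * 0.6)
    // NOTE: assumed initializer; the real Geometry type may take different parameters.
    let geometry = PhotogrammetrySession.Request.Geometry(bounds: tightBounds)

    try session.process(requests: [
        .modelFile(url: outputModelURL, detail: .full, geometry: geometry)
    ])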
Post not yet marked as solved
3 Replies
1.3k Views
I am doing an image-tracking Reality Composer experience. Currently, the ground plane in Reality Composer clips the object where it intersects the ground. Is there a way to turn that "feature" off, or to raise the image tracker above the ground instead of on the ground? https://www.dropbox.com/s/hbahcpcr4rmnl5p/screenCapture.jpg?dl=0 I would also very much like to be able to turn off people and space occlusion on .usdz and .reality files, but I can't seem to do that. Thanks for any ideas/help.
Dale
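For the occlusion part, when the content is hosted in your own app (this won't help for files viewed directly in AR Quick Look), people occlusion can be switched off on the running session; a minimal sketch, assuming an ARView named arView:

    import ARKit
    import RealityKit

    // Remove person segmentation so people no longer occlude virtual content.
    if let configuration = arView.session.configuration as? ARWorldTrackingConfiguration {
        configuration.frameSemantics.remove(.personSegmentationWithDepth)
        configuration.frameSemantics.remove(.personSegmentation)
        arView.session.run(configuration)
    }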
Post not yet marked as solved
1 Reply
926 Views
I have a client who wants to broadcast a file/image over AirDrop to any nearby open phones that can receive an image. Is that possible? Can I broadcast an image like that with AirDrop?
Thanks,
Dale
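As far as the public API goes, there is no one-to-many AirDrop broadcast; from an app, the supported route is the standard share sheet, where the user picks the AirDrop recipients. A minimal sketch, assuming a view controller and a UIImage named image:

    import UIKit

    // Present the system share sheet; AirDrop shows up as one of the options.
    let activityVC = UIActivityViewController(activityItems: [image],
                                              applicationActivities: nil)
    present(activityVC, animated: true)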
Post not yet marked as solved
1 Reply
824 Views
I want to make an entity move. I have added an onTap function that now works and registers touches, but I can't seem to make the entity move. I copied this code from Session 605:

    @IBAction func onTap(_ sender: UITapGestureRecognizer) {
        let tapLocation = sender.location(in: arView)
        print("I am here at onTap")
        // Get the entity at the location we've tapped, if one exists
        if let object = arView.entity(at: tapLocation) {
            // For testing purposes, print the name of the tapped entity
            print(object.name)
            // Add interaction code here
            var flipUpTransform = object.transform
            flipUpTransform.rotation = simd_quatf(angle: .pi, axis: [1, 0, 0])
            let flipUpController = object.move(to: flipUpTransform,
                                               relativeTo: object.parent,
                                               duration: 2.0,
                                               timingFunction: .easeInOut)
            print("I think I just moved")
            flipUpController.completionHandler {
                // box is finished flipping
            }
        }
    }
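For reference, in the RealityKit API as released, animation completion is observed through a scene event rather than a completionHandler on the controller; a minimal sketch, assuming access to arView and the tapped entity object:

    import Combine
    import RealityKit

    var animationSubscription: Cancellable?

    // Fires when an animation on `object` finishes playing.
    animationSubscription = arView.scene.subscribe(
        to: AnimationEvents.PlaybackCompleted.self,
        on: object
    ) { event in
        print("Flip animation finished")
    }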
Post marked as solved
1 Reply
804 Views
I am experimenting with the code in Session 605. I want to see what I am tapping, and next I want to add animation. I am using this code:

    @IBAction func onTap(_ sender: UITapGestureRecognizer) {
        let tapLocation = sender.location(in: arView)
        print("I am here at onTap")
        // Get the entity at the location we've tapped, if one exists
        if let object = arView.entity(at: tapLocation) {
            // For testing purposes, print the name of the tapped entity
            print(object.name)
            // Add interaction code here
        }
    }

I have generated collisions with this:

    boxAnchor.generateCollisionShapes(recursive: true)

I can get behaviors to work from Reality Composer, but I don't get anything back from the onTap action. How do I get things responding in the onTap function so that I can add code-based animation, etc.?
Thanks,
Dale
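One thing worth checking in a setup like this is whether the tap gesture recognizer is actually attached to the ARView; without it, onTap never fires even though collision shapes are in place. A minimal sketch, assuming a view controller with an arView outlet and the onTap method above (e.g. in viewDidLoad):

    let tapRecognizer = UITapGestureRecognizer(target: self,
                                               action: #selector(onTap(_:)))
    arView.addGestureRecognizer(tapRecognizer)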
Post not yet marked as solved
1 Reply
724 Views
Does sample code exist for some of the WWDC sessions? For example, Session 605 for RealityKit? It would be extremely helpful to be able to download the sample code for the experience he is showing. He shows code snippets in the talk, but it would be helpful to be able to download the project. Does anyone know if that is available somewhere?
Thanks,
Dale
Post not yet marked as solved
0 Replies
570 Views
Is it possible to get the sample code from WWDC 2019, Session 609? I am trying to get everything working for notifications and actions and would love to inspect that code.
Thanks,
Dale
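In the meantime, a minimal sketch of the notifications-and-actions pattern from that session, assuming a Reality Composer scene named Box with a notification trigger named flip and a Notify action named tapped (these identifiers are hypothetical; Xcode generates them from the .rcproject):

    import RealityKit

    let boxAnchor = try! Experience.loadBox()
    arView.scene.anchors.append(boxAnchor)

    // Code -> scene: fire a behavior that uses a Notification trigger.
    boxAnchor.notifications.flip.post()

    // Scene -> code: run a closure when a behavior's Notify action fires.
    boxAnchor.actions.tapped.onAction = { entity in
        print("Notified by entity: \(entity?.name ?? "unknown")")
    }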
Post not yet marked as solved
2 Replies
5.4k Views
Can we use the new SwiftUI with ARKit, RealityKit, and Reality Composer?
Thanks,
Dale
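They do work together: the usual bridge is a UIViewRepresentable wrapper around ARView, which is what Xcode's augmented-reality app template generates. A minimal sketch (Experience.loadBox is the template's generated loader; substitute your own scene):

    import SwiftUI
    import RealityKit

    struct ARViewContainer: UIViewRepresentable {
        func makeUIView(context: Context) -> ARView {
            let arView = ARView(frame: .zero)
            // Load the Reality Composer scene and anchor it in the session.
            let boxAnchor = try! Experience.loadBox()
            arView.scene.anchors.append(boxAnchor)
            return arView
        }

        func updateUIView(_ uiView: ARView, context: Context) {}
    }

    struct ContentView: View {
        var body: some View {
            ARViewContainer().edgesIgnoringSafeArea(.all)
        }
    }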
Post marked as solved
1 Reply
785 Views
I have created a Reality Composer scene with tap actions. It works in Reality Composer on Mac and iOS, but those actions don't work once I load it in Xcode. Do I have to separately code each action that I have already made in Reality Composer? I understand that the Notify action lets me make custom actions, but I thought I would be able to automatically use the actions and behaviors made in Reality Composer.
Thanks for any info.
Dale
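Behaviors saved in the scene are meant to run on their own once the scene is anchored, with no per-action code. A minimal sketch of loading an exported .reality file, which preserves behaviors (MyScene is a hypothetical file name; exporting to .usdz, by contrast, drops them):

    import RealityKit

    let realityURL = Bundle.main.url(forResource: "MyScene",
                                     withExtension: "reality")!
    // Loading the anchor this way keeps the Reality Composer behaviors intact.
    let sceneAnchor = try! Entity.loadAnchor(contentsOf: realityURL)
    arView.scene.anchors.append(sceneAnchor)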
Post not yet marked as solved
0 Replies
666 Views
I have a model with subdivision surfaces. When I convert it to USDZ, it doesn't display with subdivision. I can see that the Apple examples have beautiful subdivision. If I import it into Xcode-beta, I can enable subdivision in SceneKit, but how do I make that happen for the USDZ file?
Thanks,
Dale
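For the SceneKit side of this, subdivision is a property of the geometry; a minimal sketch, assuming a loaded SCNNode named node (whether that setting survives a round trip into the exported .usdz is exactly the open question here):

    import SceneKit

    // Render the geometry with two levels of Catmull-Clark subdivision.
    node.geometry?.subdivisionLevel = 2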
Post not yet marked as solved
3 Replies
1.5k Views
In one of the videos, they show face occlusion in Reality Composer. When I set up a simple test, I can't see how or where to add a face/head occluder object. The head tracking is amazing, BTW.
Thanks,
Dale
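One way to approximate a head occluder in code rather than in Reality Composer, as a minimal sketch (the sphere stands in for a proper head mesh, and the size and offset are guesses):

    import RealityKit

    // Anchor to the tracked face and attach an invisible occluder.
    let faceAnchor = AnchorEntity(.face)

    // OcclusionMaterial renders nothing itself but hides content behind it.
    let headOccluder = ModelEntity(mesh: .generateSphere(radius: 0.11),
                                   materials: [OcclusionMaterial()])
    headOccluder.position = [0, 0, -0.03] // nudge back toward the skull
    faceAnchor.addChild(headOccluder)
    arView.scene.anchors.append(faceAnchor)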