Discuss augmented reality and virtual reality app capabilities.

Posts under AR / VR tag

111 Posts

How to overlay an image on part of a 3D model?
How can I overlay an image on part of a 3D model in RealityKit, in code, so that it doesn't stretch across the entire object but keeps its own width and height that I can change? The only workaround I've found is to cut out a section of the object and map the image onto that entire cutout area, but then I can no longer resize the image or place it anywhere else on the model. In short: how do I put a 2D image on a 3D model without the photo stretching over the whole object? If this is possible, please give a code example. I couldn't find anything about it online, although other engines such as Blender and Unity can do this; if I'm not mistaken, they do it with decals.
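One approach that might work (a minimal sketch, not a confirmed answer; `modelEntity` and the asset name "label" are placeholders) is to skip the model's own material entirely and attach a small textured quad as a child entity, which keeps the image's size and position independently adjustable:

```swift
import RealityKit

// Sketch: overlay the image as a separate quad parented to the model instead of
// stretching it across the model's material. `modelEntity` and "label" are
// assumptions for illustration.
func addImageOverlay(to modelEntity: Entity, width: Float, height: Float) throws {
    let texture = try TextureResource.load(named: "label")
    var material = UnlitMaterial()
    material.color = .init(texture: .init(texture))

    let quad = ModelEntity(mesh: .generatePlane(width: width, height: height),
                           materials: [material])
    quad.position = [0, 0, 0.051]  // just in front of the surface to avoid z-fighting
    modelEntity.addChild(quad)     // resize/move later via quad.scale / quad.position
}
```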
0 replies · 1 boost · 358 views · Jul ’24
ImageAnchoringSource from URL
Hello, I was wondering how I can initialize an ImageAnchoringSource using https://developer.apple.com/documentation/realitykit/anchoringcomponent/imageanchoringsource/init(_:). When I construct one from a URL, it doesn't seem to be tracked, and I see the following when I debug-print the component:

```
▿ 0 : AnchoringComponent
  ▿ target : Target
    ▿ referenceImage : 1 element
      ▿ from : ImageAnchoringSource
        ▿ url : Optional<URL>
          ▿ some : file:///var/mobile/Containers/Data/Application/D1126EA0-A1D7-468F-A40C-8578B7F5BDDF/Library/Caches/CodeCache/0E457AA7-2195-48B9-9DD4-58CEB9397F69.png
            - _url : file:///var/mobile/Containers/Data/Application/D1126EA0-A1D7-468F-A40C-8578B7F5BDDF/Library/Caches/CodeCache/0E457AA7-2195-48B9-9DD4-58CEB9397F69.png
            - _parseInfo : nil
            - _baseParseInfo : nil
        - name : nil
        - group : nil
  ▿ trackingMode : TrackingMode
    - trackingMode : 2
```

Is there a specific format for the parseInfo? When I use the same image to make an image anchoring source by group and name in AR Resources, it is tracked. Thank you!
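For reference, here is a minimal sketch of the two constructions I'm comparing (the URL, group name, and image name are placeholders):

```swift
import RealityKit

let entity = Entity()  // stand-in for the entity being anchored

// URL-based source: this is the variant that does not seem to be tracked.
let imageURL = URL(fileURLWithPath: "/path/to/image.png")
let urlSource = AnchoringComponent.ImageAnchoringSource(imageURL)
entity.components.set(AnchoringComponent(.referenceImage(from: urlSource)))

// Group-based source from an AR Resources catalog: this one is tracked.
let groupSource = AnchoringComponent.ImageAnchoringSource(group: "AR Resources",
                                                          name: "MyImage")
entity.components.set(AnchoringComponent(.referenceImage(from: groupSource)))
```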
1 reply · 0 boosts · 294 views · Jul ’24
visionOS 2 full immersive space permission change?
Does visionOS 2 still prompt the user with a permission alert when a full immersive space is presented? In visionOS 1, the first time an app presented an immersive space, the user was shown an alert asking them to grant permission, and openImmersiveSpace would return an error code if the user declined; it was important to handle that case correctly. The Settings > Developer menu also had an option to reset the user's immersive-space permission prompting state so developers could test the interaction flow. In visionOS 2, I no longer see the full immersive space permission alert. I can't remember whether I saw it once, the first time the visionOS 2.0 beta was installed, or never saw it at all, and the Settings > Developer menu no longer has an option to reset the prompting state, so I can't find any way to test this flow in my app. Does visionOS 2 no longer ask for full immersive space permission at all? I can't find this change documented anywhere. If visionOS 2 does prompt for permission, is there any way to reproduce and test the interaction so I can make sure my app handles it correctly? Thanks for taking the time to answer this question.
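For context, this is the visionOS 1-era handling I'm referring to, as a minimal sketch (the space ID is a placeholder):

```swift
import SwiftUI

// Sketch: openImmersiveSpace reports whether the user granted or declined
// full immersion. "FullSpace" is a placeholder immersive space ID.
struct EnterImmersiveButton: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter Immersive Space") {
            Task {
                switch await openImmersiveSpace(id: "FullSpace") {
                case .opened:
                    break              // space is now showing
                case .userCancelled:
                    handleDeclined()   // user said no; stay windowed
                case .error:
                    handleFailure()    // system failed to open the space
                @unknown default:
                    break
                }
            }
        }
    }

    private func handleDeclined() { /* fall back to windowed UI */ }
    private func handleFailure() { /* log or surface an error */ }
}
```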
3 replies · 0 boosts · 637 views · Aug ’24
CreateML framework for Object Tracking
We can use the Create ML app to build an object tracking model in Xcode 16, but is it possible to use the CreateML framework as well? No documentation for Create ML object tracking has been published yet; the latest documentation I can find is for Xcode 15. https://developer.apple.com/documentation/CreateML?changes=latest_minor Really appreciate the new object tracking feature, thank you Apple team.
2 replies · 0 boosts · 641 views · Jul ’24
Convert .reality into USDZ
Hey everyone, this is my first post here in the Apple forum. I need your help understanding RealityKit and file exports better, so let me explain. I'm building a little 3D object editor, and it seems to work pretty well using RealityViews and managing materials on the Entity. I'm currently working with the beta APIs and I would like to export my entity to a .usdz or a .obj file. I've found a method that lets me create a .reality file:

```swift
let path = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("model.reality")
try await self.appState.parentEntity.write(to: path)
```

But now I don't know how to convert it into a .usdz or a .obj file, or any other standard 3D format. Do you have any idea how I could do that? Thank you so much! Have a nice day ^^
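In case it helps frame answers: I don't know of a direct Entity-to-USDZ API in RealityKit, so one avenue I'm probing (an assumption, not a confirmed conversion path) is whether ModelIO can even write the target formats on the deployment OS:

```swift
import ModelIO

// ModelIO reports which file extensions it can export; if these come back
// true, converting via an MDLAsset might be a route worth exploring.
print(MDLAsset.canExportFileExtension("obj"))
print(MDLAsset.canExportFileExtension("usd"))
print(MDLAsset.canExportFileExtension("usdz"))
```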
1 reply · 0 boosts · 504 views · Jul ’24
Augmented Reality app unable to load the image from the camera
I have had an app on the App Store for many years that enables users to post text into clouds in augmented reality. Last week, abruptly, upon installing the app on an iPhone, the screen started going totally dark and a list of barely comprehensible logs appeared, of this kind:

```
ARSCNCompositor <0x300ad0e00>: ARSCNCompositor (0, 0) initialization failed. Matting is not set up properly.
```

many times, then:

```
ARWorldTrackingTechnique <0x106235180>: Unable to update pose [PredictorFailure] for timestamp 870.392108
ARWorldTrackingTechnique <0x106235180>: Unable to predict pose [1] for timestamp 870.392108
```

again several times, and then:

```
ARWorldTrackingTechnique <0x106235180>: SLAM error callback: Error Domain=Slam Error Code=7 "Non fatal error occurred due to significant drop in a IMU data" UserInfo={NSDescription=Non fatal error occurred due to significant drop in a IMU data, NSLocalizedFailureReason=SlamEngineNodeGroup Failure: IMU issue: gyro data stream verification failed [Significant data drop]. Failed on timestamp: 870.413247, Last known timestamp: 865.350198, Delta: 5.063049, System timestamp: 870.415781, Delta between system and frame: 0.002534.}
```

and then the pose issues again, several times. I hoped the new beta version would solve the issue, but it did not. Unfortunately I don't know whether this depends on the beta version or on something else, given that the app may not be installable on the Mac simulator.
11 replies · 1 boost · 845 views · 2w
PortalComponent invalid
This is my code:

```swift
let portal = Entity()
portal.components[ModelComponent.self] = .init(
    mesh: .generatePlane(width: Float(size.width),
                         height: Float(size.height),
                         cornerRadius: 0.02),
    materials: [PortalMaterial()])
portal.components[PortalComponent.self] = .init(target: world)
portal.components[PortalComponent.self]?.clippingPlane = .init(
    position: SIMD3(x: 0, y: 0, z: 0),
    normal: SIMD3(x: 0, y: 0, z: 0))
portal.components.set(HoverEffectComponent())
```

I added a RealityView to multiple HStacks and implemented the portal effect. I found that on some machines the portal effect causes confusion in the rendering order, as shown in the figure.
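One detail worth double-checking (an observation, not a confirmed cause of the rendering issue): the clipping plane above is built with a zero-length normal, and a plane normal is normally expected to be a unit vector, e.g.:

```swift
// Same call as in the code above, but with a unit-length +Z normal
// instead of a zero vector.
portal.components[PortalComponent.self]?.clippingPlane = .init(
    position: SIMD3(x: 0, y: 0, z: 0),
    normal: SIMD3(x: 0, y: 0, z: 1))
```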
2 replies · 0 boosts · 455 views · Jul ’24
HandTrackingProvider duplicate updates in 2.0
Has anybody tried the hand tracking provider in 2.0? I'm getting updates at the advertised 11 ms interval, but they are duplicates (I've printed the timestamps to confirm). This is problematic for me because I track the last 5 positions for a calculation and expect them to be unique. I can't find docs on this anywhere. I understand it's not truly 90 updates a second but predicted poses; however, I expected the updates to contain distinct predicted poses.
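The workaround I'm considering (a sketch under the assumption that duplicate updates share an identical timestamp; the type and names are placeholders) is to filter on timestamp before appending to the history:

```swift
import ARKit

// Keep the last five unique hand positions, dropping updates whose timestamp
// repeats the previous one. HandHistory is a placeholder type for this post.
struct HandHistory {
    private var lastTimestamp: TimeInterval = .nan
    private(set) var positions: [SIMD3<Float>] = []

    mutating func record(_ anchor: HandAnchor, at timestamp: TimeInterval) {
        guard timestamp != lastTimestamp else { return }    // skip duplicate update
        lastTimestamp = timestamp
        let t = anchor.originFromAnchorTransform.columns.3  // anchor position in world space
        positions.append(SIMD3(t.x, t.y, t.z))
        if positions.count > 5 { positions.removeFirst() }  // keep only the last 5
    }
}
```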
0 replies · 0 boosts · 369 views · Jun ’24
Object Tracking with RealityView
When I wanted to load the Reality Composer Pro scene containing Object Tracking, I tried the following code:

```swift
RealityView { content in
    if let model = try? await Entity(named: "Scene", in: realityKitContentBundle) {
        content.add(model)
    }
}
```

Obviously, this alone is not enough. We need to add some configuration that enables Object Tracking for the RealityView. What do we need to add?
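For what it's worth, the direction I'd expect (a hedged sketch; the .referenceobject file name is a placeholder and error handling is omitted) is running an ObjectTrackingProvider in an ARKitSession alongside the RealityView:

```swift
import ARKit
import RealityKit

// Sketch: start object tracking from a reference object, then consume anchor
// updates to position content. "MyObject.referenceobject" is a placeholder.
let session = ARKitSession()

func runObjectTracking() async throws {
    guard let url = Bundle.main.url(forResource: "MyObject",
                                    withExtension: "referenceobject") else { return }
    let referenceObject = try await ReferenceObject(from: url)
    let provider = ObjectTrackingProvider(referenceObjects: [referenceObject])
    try await session.run([provider])

    for await update in provider.anchorUpdates {
        // Move or show entities using update.anchor.originFromAnchorTransform here.
        _ = update
    }
}
```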
2 replies · 0 boosts · 550 views · Jul ’24
State-of-the-Art 3D (no AR) on macOS using RealityKit?
What is the current recommendation for creating high-quality 3D content? The context is a hobbyist, specialised CAD app for macOS (with an iPadOS companion) that is mostly 2D but also offers a 3D visualization option (currently OpenGL). Somewhere down the line there might be an AR view, but at the moment, certainly for macOS, it's purely generated 3D visualization, all rendered content. So, starting a rewrite of the 3D visualization in 2024 targeting macOS Sequoia/iPadOS 18, is RealityKit the suggested way forward? Cheers, Jay
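As a point of reference for answers, a minimal non-AR RealityKit view on macOS 15 looks something like this (placeholder content, assuming SwiftUI hosting):

```swift
import SwiftUI
import RealityKit

// Minimal sketch: RealityView renders generated 3D content on macOS 15 without
// any AR session; the box stands in for CAD-generated geometry.
struct VisualizationView: View {
    var body: some View {
        RealityView { content in
            let box = ModelEntity(mesh: .generateBox(size: 0.2),
                                  materials: [SimpleMaterial(color: .systemBlue,
                                                             isMetallic: false)])
            content.add(box)
        }
    }
}
```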
4 replies · 0 boosts · 860 views · Jun ’24
USD/Z schemas and Declarations
How is it possible to add a schema for AR to a USD file using the Python tools (or any other way)? Following the instructions in https://developer.apple.com/documentation/arkit/arkit_in_ios/usdz_schemas_for_ar/actions_and_triggers/preliminary_behavior, the steps are to have the following declaration:

```
class Preliminary_Behavior "Preliminary_Behavior" (
    inherits = </Typed>
)
```

and a usd file:

```
#usda 1.0

def Preliminary_Behavior "TapAndFlip"
{
    rel triggers = [ <Tap> ]
    rel actions = [ <Entry> ]

    def Preliminary_Trigger "Tap" ( inherits = </TapGestureTrigger> )
    {
        rel affectedObjects = [ </Cube> ]
    }

    def Preliminary_Action "Entry" ( inherits = </GroupAction> )
    {
        uniform token type = "parallel"
        rel actions = [ <Flip> ]
    }

    def Preliminary_Action "Flip" ( inherits = </EmphasizeAction> )
    {
        rel affectedObjects = [ </Cube> ]
        uniform token motionType = "flip"
    }
}

def Cube "Cube" { }
```

How do these parts fit together? I saved the usda file, but it didn't have any interactions. Obviously, I have to add that declaration, but how do I do this? Does this all happen in an AR Xcode project, or can I do it with the Python tools (I would prefer something very lightweight)?
0 replies · 0 boosts · 534 views · May ’24
Deferred Rendering Supported?
Using Unreal Engine 5.4 for Apple Vision Pro, creating a fully immersive VR experience. When deploying a VR application, can you use deferred rendering on the Apple Vision Pro, or do you need to use forward shading, as on mobile devices? My goal would be to use deferred rendering because of the much better shader options and quality, and I hope the Apple Vision Pro, with its integrated CPU and GPU, can handle deferred rendering like a MacBook or a powerful gaming PC or workstation. I couldn't find any information on this. I have mainly been developing VR applications for Quest, but would love to create apps for the Apple Vision Pro; I just need to know whether deferred rendering will work when deploying VR apps from Unreal Engine to the Apple Vision Pro. Thanks a lot for any more information on this topic, I appreciate it! All the best, Bernhard
0 replies · 0 boosts · 602 views · May ’24