
Issue with Vertex Animation Scaling Not Working in Reality Composer Pro
Hi everyone, I’m working on a project in Reality Composer Pro, and I’ve encountered an issue with vertex animations. Here’s what I’ve done:

I created a model in Blender that includes two animations: one that scales the vertices of the model, and another that moves the model's position. I imported both animations into Reality Composer Pro; the position animation works fine, but the vertex scaling animation does not.

What I’m Trying to Achieve: I want the vertex scaling animation to play correctly in Reality Composer Pro alongside the movement animation.

Problem: The position animation works as expected, but the vertex scaling animation does not work when applied in Reality Composer Pro. I have checked the vertex scaling animation in other software, and it works fine there, so the issue seems to be specific to Reality Composer Pro.

Is vertex animation scaling supported in Reality Composer Pro? If so, what might be causing this issue? Any advice or solutions would be greatly appreciated!
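As a first diagnostic, it can help to check whether the scale track survived the USD export at all, since RealityKit only plays the animations it actually imported. A minimal sketch (the helper name and its use are illustrative, not the original poster's code):

import RealityKit

// Print and play every animation RealityKit found on the imported entity.
// If the scale animation is missing from this list, it was lost in export
// rather than ignored at playback time.
func playAllAnimations(on entity: Entity) {
    for animation in entity.availableAnimations {
        print("Found animation: \(animation.name ?? "unnamed")")
        entity.playAnimation(animation.repeat(), transitionDuration: 0.3, startsPaused: false)
    }
}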
Replies: 0 · Boosts: 0 · Views: 158 · Activity: 2w
Issue with Pivot Points of Primitive Shapes in Reality Composer Pro for visionOS App
Hi everyone, I'm developing a visionOS app using SwiftUI and RealityKit, and I'm encountering an issue with the pivot points of primitive shapes created in Reality Composer Pro.

Scenario: When I use Reality Composer Pro within Xcode to add primitive shapes (such as cubes, capsules, etc.) to my scene, the pivot points for these objects seem to be set incorrectly. The pivot is located far from the actual object, which affects transformations and positioning.

Question: Is there a way to correct or adjust the pivot point for primitive shapes created in Reality Composer Pro?

Additional Information: I’ve attached a screenshot illustrating the issue with the pivot point being misaligned.

Any guidance on how to resolve this would be greatly appreciated. Thanks in advance for your help!

Best, Siddharth
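A common RealityKit workaround is to wrap the shape in a container entity placed at its visual center and transform the container instead of the shape. A minimal sketch, with illustrative names:

import RealityKit

// Insert a container entity at the shape's visual center so rotations and
// scales pivot around that point instead of the misplaced pivot.
func addCenteredPivot(for model: Entity, in parent: Entity) -> Entity {
    let pivot = Entity()
    pivot.position = model.visualBounds(relativeTo: parent).center
    parent.addChild(pivot)
    // Re-parent the model without visually moving it.
    pivot.addChild(model, preservingWorldTransform: true)
    return pivot // rotate/scale this entity instead of the model
}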
Replies: 1 · Boosts: 0 · Views: 409 · Activity: Sep ’24
Model and Text Overlap Issue in Vision Pro App After GitHub Push
I’m developing an app for Vision Pro and have encountered an issue related to the UI layout and model display. Here's a summary of the problem:

I created an anchor window to display text and models in the hand menu UI. While testing on my Vision Pro, everything works as expected; the text and models do not overlap and appear correctly. However, after pushing the changes to GitHub and having my client test the build, the text and models overlap.

Details: I’m using Reality Composer Pro to load models and set them in the hand menu UI. All pins are attached to attachmentHandManu, which is set to track the hand and show the elements in the hand menu. In my local tests, attachmentHandManu tracks the hand properly and displays the UI components correctly.

Questions: What could be causing the text and models to overlap in the client’s environment but not in mine? Are there any specific settings or configurations I should verify to ensure consistent behavior across different environments? And what troubleshooting steps can I take to resolve this issue?
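Overlap that differs between machines may come from layout that depends on dynamic sizes (text metrics, Dynamic Type, locale) rather than from the code changing in transit; pinning attachment sizes and offsets makes the layout deterministic. A minimal sketch of the attachment pattern described above, with placeholder names (the "handMenu" tag and the offset are assumptions):

import SwiftUI
import RealityKit

struct HandMenuView: View {
    var body: some View {
        RealityView { content, attachments in
            if let menu = attachments.entity(for: "handMenu") {
                // Fixed offsets in meters behave identically on every device,
                // unlike positions derived from measured text sizes.
                menu.position = [0, 0.12, 0]
                content.add(menu)
            }
        } attachments: {
            Attachment(id: "handMenu") {
                Text("Menu")
                    .fixedSize() // prevents text reflow from changing the layout
                    .padding()
                    .glassBackgroundEffect()
            }
        }
    }
}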
Replies: 1 · Boosts: 0 · Views: 538 · Activity: Aug ’24
Is it possible to display a live 3D view of Google Maps in a Vision Pro app?
Hi everyone, I’m exploring the idea of displaying a live 3D view of Google Maps in a Vision Pro app using SwiftUI and RealityKit. I want users to be able to interact with the map, including panning, zooming in and out, and exploring different areas in a fully immersive environment.

Is this technically possible within the Vision Pro ecosystem? If so, what would be the recommended approach to implement it? If not, are there any alternative methods or platforms that could provide a similar experience?

Thanks in advance for your insights!

Best, Siddharth
Replies: 1 · Boosts: 0 · Views: 828 · Activity: Aug ’24
Delay in Fetching HealthKit Step Data in Vision Pro App
Hi everyone, I'm developing a visionOS app using SwiftUI and RealityKit, and I'm facing a challenge with fetching step data from HealthKit.

Scenario: I have integrated HealthKit into my app to track and display step data. On my iPhone, the step data syncs and shows up immediately. However, when I try to fetch and display this data in my Vision Pro app, there's a noticeable delay before the data appears, even though I've granted the necessary permissions and synced the data through iCloud.

Question: Why is there a delay in fetching step data from HealthKit on my Vision Pro app, and how can I reduce this delay to ensure real-time data display?

Additional Information: I'm using HKStatisticsQuery to fetch the step data for the current day. The authorization for HealthKit is requested successfully. The data shows up after some time, but the initial delay is causing a poor user experience.

Here is the relevant code snippet for fetching today's steps:

import HealthKit

class HealthManager: ObservableObject {
    let healthStore = HKHealthStore()
    @Published var activities: [String: Activity] = [:] // `Activity` is a model type defined elsewhere in the app

    init() {
        let steps = HKQuantityType(.stepCount)
        let healthTypes: Set = [steps]
        Task {
            do {
                // Ask the user for read access to step count data.
                try await healthStore.requestAuthorization(toShare: [], read: healthTypes)
            } catch {
                print("error requesting HealthKit authorization")
            }
        }
    }

    func fetchTodaySteps() {
        let steps = HKQuantityType(.stepCount)
        // Limit the query to samples recorded since midnight today.
        let predicate = HKQuery.predicateForSamples(withStart: Calendar.current.startOfDay(for: Date()), end: Date())
        // .cumulativeSum is required for sumQuantity() to return a value.
        let query = HKStatisticsQuery(quantityType: steps, quantitySamplePredicate: predicate, options: .cumulativeSum) { _, result, error in
            guard let quantity = result?.sumQuantity(), error == nil else {
                print("error fetching today's step data")
                return
            }
            let stepCount = quantity.doubleValue(for: .count())
            print("stepCount \(stepCount)")
        }
        healthStore.execute(query)
    }
}

I'm open to alternative approaches if this method isn't the most efficient for real-time data fetching. Thanks in advance for any insights or suggestions!

Best, Siddharth
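One pattern that may help is pairing the one-shot statistics query with an HKObserverQuery, so the app re-fetches as soon as new step samples sync to the device. A minimal sketch, assuming the HealthManager class above:

extension HealthManager {
    func observeStepUpdates() {
        let steps = HKQuantityType(.stepCount)
        // Fires whenever new step samples become available locally,
        // including samples that arrive via cross-device Health sync.
        let observer = HKObserverQuery(sampleType: steps, predicate: nil) { [weak self] _, completionHandler, error in
            if error == nil {
                self?.fetchTodaySteps() // re-run the sum for today
            }
            completionHandler() // tell HealthKit the update was handled
        }
        healthStore.execute(observer)
    }
}

Note that the underlying latency is largely the cross-device Health sync itself, which an app can't force; the observer only ensures the UI refreshes as soon as the data lands.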
Replies: 3 · Boosts: 0 · Views: 659 · Activity: Aug ’24
Error Loading USDZ File in Vision Pro Application
Hi everyone, I'm working on a Vision Pro application and encountering an issue while trying to load a USDZ file. Here are the details:

File Path: /Users/siddharthpatel/Library/Developer/CoreSimulator/Devices/31F10013-50B6-4CEF-9388-9094087FAEBF/data/Containers/Data/Application/EB260F0A-A84F-4E95-876D-08199D2A4998/Documents/hive1.usdz

Code:

do {
    // ModelEntity(contentsOf:) throws if the file can't be imported.
    modelEntityForCollider = try await ModelEntity(contentsOf: fileURL!)
} catch {
    print("Error loading model: \(error)")
}

Error: Thread 1: Fatal error: Failed to import entity from "/Users/siddharthpatel/Library/Developer/CoreSimulator/Devices/31F10013-50B6-4CEF-9388-9094087FAEBF/data/Containers/Data/ ... ve1.usdz"

I've verified that the file path is correct and that the USDZ file exists at the specified location. What could be causing this error, and how can I resolve it?

Thanks in advance for your help!

Siddharth
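One thing worth checking: simulator app containers get new UUID paths across reinstalls, so a URL captured earlier can point at a location that no longer exists even when an identically named file does. A minimal diagnostic sketch that rebuilds the URL from the live Documents directory at runtime (the file name hive1.usdz is from the post; how fileURL was originally built is an assumption):

import Foundation
import RealityKit

func loadHiveModel() async -> ModelEntity? {
    // Rebuild the URL from the current container rather than a stored absolute path.
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let fileURL = documents.appendingPathComponent("hive1.usdz")

    guard FileManager.default.fileExists(atPath: fileURL.path) else {
        print("No file at \(fileURL.path)")
        return nil
    }
    do {
        return try await ModelEntity(contentsOf: fileURL)
    } catch {
        print("Error loading model: \(error)")
        return nil
    }
}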
Replies: 2 · Boosts: 0 · Views: 560 · Activity: Jun ’24
Disabling New Hand Gesture Features in Vision Pro App on visionOS 2
Hi everyone, I'm developing a Vision Pro app using the latest visionOS 2, and I've encountered some issues with the new hand gestures introduced in this update. My app is designed to display a UI element when a user's palm is detected. However, the new hand gestures for navigating key functions like Home View, Control Center, and adjusting the volume are interfering with my app's functionality.

What I'm Trying to Achieve: Detect when a user's palm is open and display a UI element, while ensuring that my app's custom hand gestures are not disturbed by the new default gestures in visionOS 2.

Problem: The new hand gestures in visionOS 2 (such as those for Home View, Control Center, and volume adjustment) activate while my app is open, disrupting its functionality. I want to disable these system-level gestures when my app is running.
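For the palm-detection half, here is a minimal sketch using ARKit's HandTrackingProvider on visionOS; the open-palm heuristic is left as a placeholder, and hand tracking only runs while an immersive space is open:

import ARKit

final class PalmDetector {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()

    // Requires the NSHandsTrackingUsageDescription Info.plist key.
    func start() async {
        do {
            try await session.run([handTracking])
            for await update in handTracking.anchorUpdates {
                let anchor = update.anchor
                guard anchor.isTracked, let skeleton = anchor.handSkeleton else { continue }
                // Placeholder heuristic: compare joint positions (e.g. wrist
                // vs. fingertips) to decide whether the palm is open,
                // then show the UI element.
                let wrist = skeleton.joint(.wrist)
                let middleTip = skeleton.joint(.middleFingerTip)
                _ = (wrist, middleTip) // replace with a real open-palm check
            }
        } catch {
            print("Hand tracking failed: \(error)")
        }
    }
}

As far as I'm aware, visionOS 2 provides no public API for an app to disable the system hand gestures, so the practical mitigation is designing the app's own gesture to be distinct enough not to collide with them.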
Replies: 3 · Boosts: 2 · Views: 1.4k · Activity: Jun ’24
Accessing Children of USDZ Model in SwiftUI
Hi everyone, I'm developing a visionOS app using SwiftUI and RealityKit. I'm facing a challenge with accessing the children of a USDZ model.

Scenario: I can successfully load and display a USDZ model in my RealityView. When I import the model into Reality Composer Pro, I can access and manipulate its individual parts (children) using the scene hierarchy. However, if I load the USDZ model directly from the Navigation tab in my project, I cannot seem to access the children programmatically.

Question: Is there a way to access and manipulate the children of a USDZ model loaded directly from the Navigation tab in SwiftUI for visionOS?

Additional Information: I've explored using Entity(named: "childName", in: realityKitContentBundle), but this only works for the main entity. I'm open to alternative approaches if directly accessing children isn't feasible.

Thanks in advance for any insights or suggestions!

Best, Siddharth

[Edited by Moderator]
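One approach that may work: load the whole USDZ as a single Entity, then search its hierarchy with findEntity(named:), which walks all descendants recursively. A minimal sketch ("hive1" and "childName" are placeholder names):

import RealityKit

func loadChild() async throws -> Entity? {
    // Load the USDZ from the app's main bundle; pass a bundle such as
    // realityKitContentBundle if the asset lives in a Reality Composer Pro package.
    let root = try await Entity(named: "hive1")
    // Recursively search the loaded hierarchy for a named descendant.
    return root.findEntity(named: "childName")
}

The child names must match the prim names in the USDZ, which can be inspected in Reality Composer Pro's scene hierarchy.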
Replies: 2 · Boosts: 0 · Views: 964 · Activity: May ’24