I am working on a RealityView on iOS 18 that needs to render objects farther away than 1,000 meters. My app is used outdoors in open areas. I am using RealityView with content.camera = .spatialTracking, and I have turned off occlusion, collisions, and plane detection by running a SpatialTrackingSession with an empty scene-understanding set, like this:
let configuration = SpatialTrackingSession.Configuration(
    tracking: [.camera],
    sceneUnderstanding: [], // We don't want occlusions, collisions, etc.
    camera: .back)
let session = SpatialTrackingSession()
if let unavailable = await session.run(configuration) {
    print("unavailable \(unavailable)")
}
Is this possible with spatial tracking in RealityView, or with ARView?
I have my RealityView working on visionOS inside an ImmersiveSpace. On visionOS I don't have the camera as a passthrough; it is a virtual scene with world tracking set up via a WorldTrackingProvider, and I can render objects farther away than 1,000 meters. I would like to do the same thing on iOS. I don't need the camera passthrough, but I do need the world tracking.
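For context, the visionOS side is set up roughly like this (a simplified sketch; entity loading and anchor queries are omitted):

import ARKit

// Simplified sketch of the visionOS world-tracking setup.
final class WorldTrackingModel {
    let arSession = ARKitSession()
    let worldTracking = WorldTrackingProvider()

    func start() async {
        do {
            try await arSession.run([worldTracking])
        } catch {
            print("Failed to start world tracking: \(error)")
        }
    }
}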
I see that PerspectiveCameraComponent lets me set the near and far clipping planes, but I don't see how I can use that camera with world tracking.
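For example, I can create a camera entity like this (a sketch; 10,000 m is just an example far value), but I don't see how to drive its transform from the device's tracked pose on iOS:

import RealityKit

// Sketch: a camera entity with a 10 km far clipping plane.
// What I can't work out is how to tie this entity's transform
// to world tracking on iOS.
func makeFarCamera() -> Entity {
    let cameraEntity = Entity()
    cameraEntity.components.set(
        PerspectiveCameraComponent(near: 0.1, far: 10_000, fieldOfViewInDegrees: 60)
    )
    return cameraEntity
}

// Inside the RealityView make closure: content.add(makeFarCamera())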
I have a Live Activity and I can't get the colors to work correctly when the iPhone has its Focus mode set to Do Not Disturb. My activity uses the system background and foreground colors, but the Live Activity goes completely black. I have tried setting activityBackgroundTint and activitySystemActionForegroundColor to nil, but that still gives a black activity that looks like this.
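For reference, the widget configuration looks roughly like this (a simplified sketch; the attributes type and content are placeholders):

import ActivityKit
import SwiftUI
import WidgetKit

// Placeholder attributes type used only for this sketch.
struct TimerAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var endDate: Date
    }
}

struct TimerLiveActivity: Widget {
    var body: some WidgetConfiguration {
        ActivityConfiguration(for: TimerAttributes.self) { context in
            // Lock Screen / banner presentation. Passing nil is meant to
            // request the system background material and foreground color.
            Text(context.state.endDate, style: .timer)
                .activityBackgroundTint(nil)
                .activitySystemActionForegroundColor(nil)
        } dynamicIsland: { context in
            DynamicIsland {
                DynamicIslandExpandedRegion(.center) {
                    Text(context.state.endDate, style: .timer)
                }
            } compactLeading: {
                Image(systemName: "timer")
            } compactTrailing: {
                Text(context.state.endDate, style: .timer)
            } minimal: {
                Image(systemName: "timer")
            }
        }
    }
}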
I have a RealityView in my visionOS app. I can't figure out how to access RealityRenderer. According to the documentation (https://developer.apple.com/documentation/realitykit/realityrenderer) it is available on visionOS, but I can't figure out how to access it for my RealityView. It is probably something obvious, but after reading through the documentation for RealityView, Entities, and Components, I can't find it.
I am working on a fully immersive RealityView for visionOS and I need to add light from the sun to my scene. I see that DirectionalLight, PointLight, and SpotLight are not available on visionOS. Does anyone know how to add light to a fully immersive scene on visionOS?
My scene is really dark right now without any additional light.
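For reference, my setup is roughly like this (a simplified sketch; the view and entity names are placeholders). The RealityView is presented in an ImmersiveSpace with the full immersion style, and nothing in it adds any light:

import RealityKit
import SwiftUI

struct SunlitImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Fully virtual scene: just placeholder geometry for now.
            // There is no light here, which is why everything is dark.
            let terrain = ModelEntity(mesh: .generateBox(size: 1),
                                      materials: [SimpleMaterial()])
            content.add(terrain)
        }
    }
}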
Does anyone know how to create a sidebar like Safari's in the visionOS simulator, where the web content does not change size and the sidebar opens by sliding out from behind the content? Everything I have tried opens the sidebar in the same window and makes the content smaller. I tried making the sidebar an ornament (see the sketch at the end of this post), but that didn't feel like the right solution.
You can see in this screenshot that the sidebar is coming out from behind the content. It is not an ornament on the side of the content. It is also at the same z level as the content.
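Here is roughly what the ornament attempt looked like (a sketch; the view names are placeholders):

import SwiftUI

// Placeholder views for this sketch.
struct BrowserContentView: View { var body: some View { Color.clear } }
struct SidebarListView: View { var body: some View { Color.clear } }

struct BrowserWindow: View {
    var body: some View {
        BrowserContentView()
            .ornament(attachmentAnchor: .scene(.leading)) {
                // This hangs the sidebar off the leading edge of the window,
                // but it floats beside the content instead of sliding out
                // from behind it at the same z level like Safari's sidebar.
                SidebarListView()
                    .glassBackgroundEffect()
            }
    }
}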
I have a SceneKit fragment shader modifier where I need to make some decisions about the alpha value based on a value interpolated between the vertices. Each vertex has a value, and I need to know the interpolated value at the position the fragment shader is shading. Is there a way to do this?
This is what my shader modifier looks like.
#pragma arguments
float myInterpolatedValue;
#pragma transparent
#pragma body
// Sample logic. The actual logic has more to it.
if (myInterpolatedValue > 1.0) {
    _output.color.a = 0.0;
}
I want to set a value at each vertex and have myInterpolatedValue in the fragment shader receive that value interpolated from the vertices for the current fragment.
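One workaround I am considering (just a sketch, and it assumes the diffuse texture-coordinate channel is otherwise unused) is to pack the per-vertex value into a texture-coordinate geometry source so SceneKit interpolates it for me:

import SceneKit

// Sketch: store one float per vertex in the x component of a 2D
// texture coordinate. SceneKit interpolates texture coordinates
// across each triangle, so the fragment stage sees a per-fragment value.
// Assumes `vertexValues` has one entry per vertex and the diffuse
// texcoord channel is not needed for an actual texture.
func valueSource(for vertexValues: [Float]) -> SCNGeometrySource {
    var texcoords = vertexValues.map { SIMD2<Float>($0, 0) }
    let data = Data(bytes: &texcoords,
                    count: texcoords.count * MemoryLayout<SIMD2<Float>>.stride)
    return SCNGeometrySource(data: data,
                             semantic: .texcoord,
                             vectorCount: texcoords.count,
                             usesFloatComponents: true,
                             componentsPerVector: 2,
                             bytesPerComponent: MemoryLayout<Float>.size,
                             dataOffset: 0,
                             dataStride: MemoryLayout<SIMD2<Float>>.stride)
}

The fragment modifier would then read the interpolated value from _surface.diffuseTexcoord.x instead of from a #pragma arguments value (arguments are uniforms set on the material, so they are not interpolated per vertex). I haven't verified whether SceneKit wires that channel through when the material has no diffuse texture, so I would still prefer a proper way to declare my own interpolated value.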