You can film regular stereoscopic left/right content and encode it into MV-HEVC using Halocline. I am affiliated with the project, so please reach out if you have any questions or concerns.
Stereoscopic videos which are encoded into MV-HEVC format can be opened up in the native player as well. You can see an example of a left/right stereoscopic video encoded into MV-HEVC here:
https://x.com/zacharyhandshoe/status/1713745034233749926?s=20
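For reference, a minimal sketch of playing such a file with the system player on visionOS: MV-HEVC content is handled by AVFoundation directly, so a plain AVPlayer inside SwiftUI's VideoPlayer is enough. The URL below is a placeholder, not a real asset.

```swift
import AVKit
import SwiftUI

// Minimal sketch: the native player handles MV-HEVC files directly.
// The URL is a placeholder; point it at your own encoded asset.
struct SpatialVideoPlayerView: View {
    @State private var player = AVPlayer(
        url: URL(string: "https://example.com/spatial.mov")!
    )

    var body: some View {
        VideoPlayer(player: player)
            .onAppear { player.play() }
    }
}
```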
You can encode a left/right stereoscopic video into MV-HEVC on Halocline, https://haloclinetech.com. I am directly involved in its production, so I would be happy to help out with any of your needs.
Did you figure this out?
Have you found a solution?
Disclosure: I'm affiliated with Halocline.
This involves a few short steps:
Storing the assets on a remote server: First, host your assets on a remote server. Halocline (https://www.haloclinetech.com/) offers a dedicated platform for exactly this, and it's free. Given my experience with it, I'd recommend checking it out.
Fetching the assets in your app: Once you have a URL for your asset, you can use the script provided by Halocline (https://www.haloclinetech.com/learn/), which stores files from remote URLs in the user's temporary directory.
Using the assets in your app: After fetching the assets, you can incorporate them into your application as if they were locally stored.
// HaloclineHelper and loadEntity(from:) come from Halocline's provided script
let halocline = HaloclineHelper()
let url = URL(string: "your-halocline-url")! // replace with your asset's URL
let modelEntity = try await halocline.loadEntity(from: url)
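If you'd rather not use the helper, here is a hedged sketch of what the fetch step amounts to with plain Foundation: download the bytes over HTTPS and write them into the user's temporary directory. This is my own illustration of the general technique, not Halocline's actual script, which may differ.

```swift
import Foundation

// Sketch: fetch a remote asset and store it in the temporary directory,
// returning the local file URL so it can be used like a bundled asset.
func downloadToTemporaryDirectory(from remoteURL: URL) async throws -> URL {
    let (data, _) = try await URLSession.shared.data(from: remoteURL)
    let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent(remoteURL.lastPathComponent)
    try data.write(to: destination)
    return destination
}
```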
How are you currently displaying the skybox? Are you using it in a RealityView?
This would be helpful. When calling the Swift openURL() method, Safari opens at the origin (below the camera) instead of in front of the user.
If I open Safari via the Home View, it opens in front of the user. Seems like a bug in visionOS.
I was able to accomplish this with a DragGesture() instead, as I was having difficulties with RotateGesture3D:
// Requires these properties on the containing view:
// @State private var rotation: Angle = .zero
// @State private var rotationAxis: (x: CGFloat, y: CGFloat, z: CGFloat) = (x: 0, y: 1, z: 0)

Model3D(named: "Purse", bundle: realityKitContentBundle) { model in
    model
        .resizable()
        .aspectRatio(contentMode: .fit)
} placeholder: {
    ProgressView()
}
.rotation3DEffect(rotation, axis: rotationAxis)
.gesture(
    DragGesture()
        .onChanged { value in
            // Rotation angle is proportional to the drag distance
            let angle = sqrt(pow(value.translation.width, 2) + pow(value.translation.height, 2))
            guard angle != 0 else { return } // avoid dividing by zero on a zero-length drag
            rotation = Angle(degrees: Double(angle))
            // Rotate about the axis perpendicular to the drag direction
            let axisX = -value.translation.height / CGFloat(angle)
            let axisY = value.translation.width / CGFloat(angle)
            rotationAxis = (x: axisX, y: axisY, z: 0)
        }
)
This is a significant issue with the current documentation and developer tools. I have found basic SwiftUI tutorials more informative; starting with those would be your best bet.
Am I correct that XCTest is only usable for code testing? If so, how exactly would we simulate this during development?
While models from Reality Composer Pro do show up, they display nearly identically to the Mixed immersion style.
How does one build a skybox or other objects which replace the user's room? Apologies for so many questions. The documentation seems to lack nuance on how to build much beyond making a model show up on a table.
ImmersiveSpace(id: "ImmersiveSpace") {
    ImmersiveView()
}
.immersionStyle(selection: .constant(.full), in: .full)
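For the skybox itself, a common approach in a full ImmersiveSpace is a large inverted sphere textured with an equirectangular image. Here is a minimal sketch of an ImmersiveView along those lines; the texture name "skybox" is a hypothetical asset in your bundle.

```swift
import RealityKit
import SwiftUI

// Sketch: a skybox built from a large sphere whose texture faces inward,
// fully surrounding the user in a .full immersion space.
struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // "skybox" is assumed to be an equirectangular image in the bundle
            guard let texture = try? TextureResource.load(named: "skybox") else { return }
            var material = UnlitMaterial()
            material.color = .init(texture: .init(texture))

            let skybox = Entity()
            skybox.components.set(ModelComponent(
                mesh: .generateSphere(radius: 1000),
                materials: [material]
            ))
            // Negative x-scale inverts the sphere so the texture faces the user
            skybox.scale = .init(x: -1, y: 1, z: 1)
            content.add(skybox)
        }
    }
}
```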