Using a 72 MP 360° image taken with an Insta360 X3, I would like to add it to my Vision Pro and see it surrounding me completely, as you would expect from a 360° image. I was able to do this by following the steps described in a tutorial.
The problem is the quality. In a 2D window the image looks great.
Here is the code anyway:
import SwiftUI
import RealityKit

struct ImmersiveView: View {
    @Environment(AppModel.self) var appModel

    var body: some View {
        RealityView { content in
            content.add(createImmersivePicture(imageName: appModel.activeSpace))
        }
    }

    func createImmersivePicture(imageName: String) -> Entity {
        let sphereRadius: Float = 1000
        let modelEntity = Entity()
        let texture = try? TextureResource.load(named: imageName, options: .init(semantic: .raw, compression: .none))
        var material = UnlitMaterial()
        material.color = .init(texture: .init(texture!))
        modelEntity.components.set(
            ModelComponent(
                mesh: .generateSphere(radius: sphereRadius),
                materials: [material]
            )
        )
        // The negative x scale turns the sphere inside out so the texture faces the viewer.
        modelEntity.scale = .init(x: -1, y: 1, z: 1)
        modelEntity.transform.translation += SIMD3<Float>(0.0, 10.0, 0.0)
        return modelEntity
    }
}
Since the quality is a problem, I thought about reducing the radius of the sphere or decreasing the scale. In both cases, nothing changes.
I have tried: modelEntity.scale = .init(x: -0.5, y: 0.5, z: 0.5)
And also let sphereRadius: Float = 2000 and let sphereRadius: Float = 500, but nothing changes.
I also get the warning:
IOSurface creation failed: e00002c2 parentID: 00000000 properties: {
IOSurfaceAddress = 4651830624;
IOSurfaceAllocSize = 35478941;
IOSurfaceCacheMode = 0;
IOSurfaceMapCacheAttribute = 1;
IOSurfaceName = CMPhoto;
IOSurfacePixelFormat = 1246774599;
}
IOSurface creation failed: e00002c2 parentID: 00000000 property: IOSurfaceCacheMode
IOSurface creation failed: e00002c2 parentID: 00000000 property: IOSurfacePixelFormat
IOSurface creation failed: e00002c2 parentID: 00000000 property: IOSurfaceMapCacheAttribute
IOSurface creation failed: e00002c2 parentID: 00000000 property: IOSurfaceAddress
IOSurface creation failed: e00002c2 parentID: 00000000 property: IOSurfaceAllocSize
IOSurface creation failed: e00002c2 parentID: 00000000 property: IOSurfaceName
Is there anything I can do to reduce the radius or just to improve the quality itself?
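For reference, here is a small diagnostic I would try (a sketch; it assumes the texture loads at all): printing the dimensions RealityKit actually loaded, to check whether the 72 MP source is being downscaled at load time.

// Diagnostic sketch: compare the loaded texture's size with the source
// image's pixel dimensions to see whether it was reduced on load.
if let texture = try? TextureResource.load(named: imageName, options: .init(semantic: .raw, compression: .none)) {
    print("Loaded texture is \(texture.width) x \(texture.height) pixels")
}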
I am having problems when the app loads for the first time. The time it takes for the items to sync from CloudKit to my local Core Data store is too long.
Code
I have the model below defined in my Core Data model.
public extension Item {
    @nonobjc class func fetchRequest() -> NSFetchRequest<Item> {
        NSFetchRequest<Item>(entityName: "Item")
    }

    @NSManaged var createdAt: Date?
    @NSManaged var id: UUID?
    @NSManaged var image: Data?
    @NSManaged var usdz: Data?
    @NSManaged var characteristics: NSSet?
    @NSManaged var parent: SomeParent?
}
The image and usdz attributes are both marked as Binary Data, with "Allows External Storage" selected.
I ran a few tests loading the data when the app is downloaded for the first time. I am loading it in my view using the code below:
@FetchRequest(
    sortDescriptors: [NSSortDescriptor(keyPath: \Item.createdAt, ascending: true)]
)
private var items: FetchedResults<Item>

var body: some View {
    VStack {
        ScrollView(.vertical, showsIndicators: false) {
            LazyVGrid(columns: columns, spacing: 40) {
                ForEach(items, id: \.self) { item in
                    Text(item.id?.uuidString ?? "") // id is optional, so unwrap for Text
                }
            }
        }
    }
}
Test 1 - Just load everything
When my CloudKit database holds a total of 100 MB of images and usdz data, it takes around 140 seconds before any data shows in my view (and not all items were synced by then; that takes much longer).
Test 2 - Try fetching only 10 items at a time
This takes about the same amount of time as the first test. I added the following to my class and removed the @FetchRequest:
@State private var items: [Item] = [] // CK
@State private var isLoading = false

@MainActor
func loadMoreData() {
    guard !isLoading else { return }
    isLoading = true

    let fetchRequest = NSFetchRequest<Item>(entityName: "Item")
    fetchRequest.fetchLimit = 10
    fetchRequest.fetchOffset = items.count
    fetchRequest.predicate = getPredicate() // e.g. "title != nil AND title != ''"
    fetchRequest.sortDescriptors = [NSSortDescriptor(keyPath: \Item.createdAt, ascending: true)]

    do {
        let newItems = try viewContext.fetch(fetchRequest)
        DispatchQueue.main.async {
            items.append(contentsOf: newItems)
            isLoading = false
        }
    } catch {
        print("Fetch failed: \(error)")
    }
}
Test 3 - Remove all images and usdz from CloudKit (set them all to null)
With every item's binary data set to null, it takes around 8 seconds to show the list.
So as we can see, all the solutions I found are bad. I just want to go to my CloudKit and fetch the data through Core Data, and if possible NOT fetch all of it, because that will not scale (imagine the future with 10 or 20 GB of data). What is the solution to this loading problem? What do I need to do or fix in order to load, say, 10 items first and the rest later, and give the user a seamless experience?
Questions
What solutions do I have for when the user first loads the app?
How can I force Core Data to query CloudKit directly?
Does Core Data + CloudKit + NSPersistentCloudKitContainer download the whole CloudKit database locally, and is that good?
Storing images as Binary Data with "Allows External Storage" does not seem to work well, because the image is downloaded even when it is not needed yet. How should I store the binary data or images in this case?
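For what it's worth, one pattern I am considering (a sketch only; the ItemAsset entity and its relationship are assumptions, not part of my current model) is moving the blobs into a separate entity, so that fetching Item rows never pulls the binary data until the relationship is actually traversed:

// Hypothetical split model: Item keeps only light metadata, and the heavy
// blobs live in a related entity that stays a fault until accessed.
public class ItemAsset: NSManagedObject {
    @NSManaged public var image: Data?   // Binary Data, Allows External Storage
    @NSManaged public var usdz: Data?    // Binary Data, Allows External Storage
    @NSManaged public var item: Item?    // inverse of Item.asset
}

public extension Item {
    @NSManaged var asset: ItemAsset?     // to-one; Core Data faults it in lazily
}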
I am trying to count the rows of a database table from inside some of my classes.
I am trying to do it as shown below (my problem is that count1 is working, but count2 is not):
class AppState {
    private(set) var context: ModelContext?
    ....

    func setModelContext(_ context: ModelContext) {
        self.context = context
    }

    @MainActor
    func count() async {
        do {
            let container1 = try ModelContainer(for: Item.self)
            let descriptor = FetchDescriptor<Item>()
            let count1 = try container1.mainContext.fetchCount(descriptor)
            let count2 = try context?.fetchCount(descriptor) ?? 0 // context is optional, hence the fallback
            print("WORKING COUNT: \(count1)")
            print("NOTWORKING COUNT: \(count2) -> always 0")
        } catch {
            print("Count failed: \(error)")
        }
    }
}
I am passing the context like this:
...
@main
@MainActor
struct myApp: App {
    @State private var appState = AppState()
    @Environment(\.modelContext) private var modelContext

    var body: some Scene {
        WindowGroup {
            ItemView(appState: appState)
                .task {
                    appState.setModelContext(modelContext)
                }
        }
        .windowStyle(.plain)
        .windowResizability(.contentSize)
        .modelContainer(for: [Item.self, Category.self]) { result in
            ...
        }
    }
}
Can I get some guidance on why this is happening?
Which one is better to use?
If I should use count2, how can I fix it?
Is this the correct way to search inside an application using SwiftData?
I don't want to search from the view with @Query, because this operation is going to happen in the background of the app.
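In case it clarifies what I am after, this is the kind of helper I would like to end up with (a sketch; it assumes I can hand the app's ModelContainer to AppState, and uses ModelContext(container) to create a fresh context on the same store):

// Sketch: count on a context created from the same container the UI uses,
// instead of storing a context captured from the environment.
@MainActor
func count(in container: ModelContainer) throws -> Int {
    let descriptor = FetchDescriptor<Item>()
    return try ModelContext(container).fetchCount(descriptor)
}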
I am trying to store usdz files with SwiftData for now.
I am converting the usdz file to Data, then storing it with SwiftData.
My model
import Foundation
import SwiftData
import SwiftUI

@Model
class Item {
    var name: String
    @Attribute(.externalStorage)
    var usdz: Data? = nil
    var id: String

    init(name: String, usdz: Data? = nil) {
        self.id = UUID().uuidString
        self.name = name
        self.usdz = usdz
    }
}
My function to convert the usdz to data. I am currently using a local usdz file just to test whether it is going to work:
func usdzData() -> Data? {
    do {
        guard let usdzURL = Bundle.main.url(forResource: "tv_retro", withExtension: "usdz") else {
            fatalError("Unable to find USDZ file in the bundle.")
        }
        let usdzData = try Data(contentsOf: usdzURL)
        return usdzData
    } catch {
        print("Error loading USDZ file: \(error)")
    }
    return nil
}
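And then, roughly, saving it (a sketch; it assumes a modelContext is available in scope, e.g. via @Environment(\.modelContext)):

// Sketch: build an Item from the bundled file's bytes and insert it.
if let data = usdzData() {
    let item = Item(name: "tv_retro", usdz: data)
    modelContext.insert(item)
}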
Loading the items
@Query private var items: [Item]
...

var body: some View {
    ...
    ForEach(items) { item in
        HStack {
            Model3D(?????) { model in
                model
                    .resizable()
                    .scaledToFit()
            } placeholder: {
                ProgressView()
            }
        }
    }
    ...
}
How can I load the Model3D?
I have tried:
Model3D(data: item.usdz)
Gives me the errors:
Cannot convert value of type '[Item]' to expected argument type 'Binding<C>'
Generic parameter 'C' could not be inferred
Both errors are raised in the ForEach.
I am able to print the content inside item:
ForEach(items) { item in
    HStack {
        Text("\(item.name)")
        Text("\(item.usdz)")
    }
}
The above works fine for me.
The item.usdz prints something like Optional(10954341 bytes).
I would like to know 2 things:
Is this the correct way to save usdz files with SwiftData, or should I use FileManager? If so, how should I do that?
Also, how can I get the usdz out of storage (SwiftData) into my code and use it with Model3D?
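One workaround I am considering (a sketch, untested): since Model3D can load from a URL, write the stored bytes to a temporary file and pass that file's URL instead.

// Sketch: materialize the SwiftData blob as a temporary .usdz file so a
// URL-based loader can read it.
func temporaryUSDZURL(for item: Item) -> URL? {
    guard let data = item.usdz else { return nil }
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent(item.id)
        .appendingPathExtension("usdz")
    do {
        try data.write(to: url)
        return url
    } catch {
        print("Failed to write usdz to disk: \(error)")
        return nil
    }
}

The idea would be to use something like Model3D(url: url) { model in ... } placeholder: { ... } in place of the ????? above, but I don't know whether this is the intended approach.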
I have some usdz files saved, and I would like to make 2D thumbnails for them. I was checking "Creating Quick Look Thumbnails to Preview Files in Your App", but it says "Augmented reality objects using the USDZ file format (iOS and iPadOS only)". I would like to have the same functionality in my visionOS app. How can I do that?
I thought about using some API to convert the 3D asset into a 2D asset, but it would be better if I could do that inside the Swift environment.
Basically I want to do Image(uiImage: "my_usdz_file").
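For context, this is the shape of what I was hoping for (a sketch of the iOS-style QuickLookThumbnailing call; whether the usdz representation works on visionOS is exactly what I am unsure about):

import QuickLookThumbnailing

// Sketch: ask Quick Look for a thumbnail of a file URL. On iOS this handles
// usdz; on visionOS the usdz case may be unsupported, per the docs above.
func thumbnail(for url: URL, side: CGFloat = 256) async throws -> UIImage {
    let request = QLThumbnailGenerator.Request(
        fileAt: url,
        size: CGSize(width: side, height: side),
        scale: 1,
        representationTypes: .thumbnail
    )
    let representation = try await QLThumbnailGenerator.shared
        .generateBestRepresentation(for: request)
    return representation.uiImage
}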
I was executing some code from "Incorporating real-world surroundings in an immersive experience":
func processReconstructionUpdates() async {
    for await update in sceneReconstruction.anchorUpdates {
        let meshAnchor = update.anchor
        guard let shape = try? await ShapeResource.generateStaticMesh(from: meshAnchor) else { continue }
        switch update.event {
        case .added:
            let entity = ModelEntity()
            entity.transform = Transform(matrix: meshAnchor.originFromAnchorTransform)
            entity.collision = CollisionComponent(shapes: [shape], isStatic: true)
            entity.components.set(InputTargetComponent())
            entity.physicsBody = PhysicsBodyComponent(mode: .static)
            meshEntities[meshAnchor.id] = entity
            contentEntity.addChild(entity)
        case .updated:
            guard let entity = meshEntities[meshAnchor.id] else { continue }
            entity.transform = Transform(matrix: meshAnchor.originFromAnchorTransform)
            entity.collision?.shapes = [shape]
        case .removed:
            meshEntities[meshAnchor.id]?.removeFromParent()
            meshEntities.removeValue(forKey: meshAnchor.id)
        }
    }
}
I would like to toggle the occlusion mesh that the developer tools offer, but programmatically: a button that would activate and deactivate it.
I was checking .showSceneUnderstanding, but it does not seem to work on visionOS: I get the error 'ARView' is unavailable in visionOS when I try what is shown in "Visualizing and Interacting with a Reconstructed Scene".
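To show the kind of toggle I mean (a sketch only; it assumes each stored mesh entity was given a ModelComponent built from the anchor's geometry when it was added, which the sample above does not do):

// Sketch: flip the reconstructed-mesh entities between an invisible
// OcclusionMaterial and a visible wireframe-style UnlitMaterial.
var showOcclusionMesh = false

func toggleOcclusionMesh() {
    showOcclusionMesh.toggle()
    for entity in meshEntities.values {
        guard let mesh = entity.model?.mesh else { continue }
        if showOcclusionMesh {
            var material = UnlitMaterial(color: .white)
            material.triangleFillMode = .lines // render triangle edges only
            entity.model = ModelComponent(mesh: mesh, materials: [material])
        } else {
            entity.model = ModelComponent(mesh: mesh, materials: [OcclusionMaterial()])
        }
    }
}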