I have a background thread that updates a SwiftData model Item using a ModelActor. The background task processes an Item and updates the Item's status field. I notice that if I have a view like this:
struct ItemListView: View {
    @Query private var items: [Item]

    var body: some View {
        VStack {
            ForEach(items) { item in
                ItemDetailView(item: item)
            }
        }
    }
}
struct ItemDetailView: View {
    var item: Item

    var body: some View {
        // expected: item.status automatically updates when the background thread updates the `Item`'s `status`.
        Text(item.status)
        // actual: this text never changes
    }
}
then background updates to the Item's status in SwiftData are not reflected in ItemDetailView. However, if I inline ItemDetailView's contents in ItemListView like this:
struct ItemListView: View {
    @Query private var items: [Item]

    var body: some View {
        VStack {
            ForEach(items) { item in
                // Put the contents of ItemDetailView directly in ItemListView
                Text(item.status)
                // result: item.status correctly updates when the background thread updates the item.
            }
        }
    }
}
then the item's status text updates in the UI as expected. I suspect ItemDetailView does not update properly because it just takes an Item as input; it may need additional awareness of SwiftData, such as a ModelContext.
Is there a way to keep ItemDetailView as a separate view showing the Item's status and still have the UI reflect status updates made from the background thread?
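For reference, one variant I have been considering but have not confirmed works (a sketch; the @Bindable usage is my assumption, based on @Model classes conforming to Observable):

```swift
import SwiftData
import SwiftUI

@Model
final class Item {
    var status: String
    init(status: String) { self.status = status }
}

struct ItemDetailView: View {
    // Assumption: since @Model classes conform to Observable, @Bindable
    // should make SwiftUI track reads of item.status in this body and
    // re-render when the background actor changes it.
    @Bindable var item: Item

    var body: some View {
        Text(item.status)
    }
}
```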
In case details about my background thread help solve the problem: the task is kicked off from another view's controller like this:
@Observable
class ItemCreateController {
    func queueProcessingTask() {
        Task {
            let itemActor = ItemActor(modelContainer: modelContainer)
            await itemActor.setItem(item)
            await itemActor.process()
        }
    }
}
@ModelActor
actor ItemActor {
    var item: Item?

    func setItem(_ item: Item) {
        self.item = modelContext.model(for: item.id) as? Item
    }

    func process() async {
        // Runs processing on the Item and updates the Item's status as it goes.
    }
}
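For completeness, the shape of process() is roughly this (a simplified sketch; the status values and the save-after-each-update pattern are illustrative, not my exact code):

```swift
func process() async {
    guard let item else { return }

    // Update the status as work progresses, persisting each change
    // so other contexts can observe it.
    item.status = "processing"
    try? modelContext.save()

    // ... long-running work on the item ...

    item.status = "done"
    try? modelContext.save()
}
```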
I would like to save the depth map from ARDepthData as a .tiff, but I notice the distances in my output TIFF are incorrect. Objects that are close are reported to be slightly farther away, and walls that are around 4 meters away from me have a recorded value of 2 meters. I am using this code to write the TIFF:
import CoreImage
import UIKit

// Save method
extension CVPixelBuffer {
    func saveDepthMapToTIFF(to path: URL) {
        let ciImage = CIImage(cvPixelBuffer: self)
        let context = CIContext()
        do {
            try context.writeTIFFRepresentation(
                of: ciImage,
                to: path,
                format: .Lf,
                colorSpace: CGColorSpaceCreateDeviceGray()
            )
        } catch {
            print("Failed to write TIFF: \(error)")
        }
    }
}
// Calling the save
arFrame.sceneDepth?.depthMap.saveDepthMapToTIFF(to: depthMapPath)
I am reading the file like this in Python:
import tifffile
import matplotlib.pyplot as plt

depth_map = tifffile.imread("test.tiff")
plt.imshow(depth_map)
plt.colorbar()
which creates this image:
The farthest parts of the room should be around 4 meters, not 2. The dark blue spot on the lower right is closer than half a meter away.
Notably, the depth map contains distances from the camera plane to each region, not distances from the camera sensor to the region. Even after correcting for this, though, the depth map remains about the same.
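Concretely, the plane-depth to ray-distance correction I mean looks like this (a sketch; fx, fy, cx, cy would come from ARCamera.intrinsics, scaled to the depth map's resolution):

```swift
import Foundation

// Convert a camera-plane depth (z along the optical axis) at pixel (u, v)
// into the euclidean distance from the camera to that point.
func radialDistance(planeDepth: Float, u: Float, v: Float,
                    fx: Float, fy: Float, cx: Float, cy: Float) -> Float {
    let x = (u - cx) / fx   // normalized ray direction, x component
    let y = (v - cy) / fy   // normalized ray direction, y component
    return planeDepth * (1 + x * x + y * y).squareRoot()
}
```

At the principal point the two distances coincide, and the correction only grows toward the image corners, so it cannot explain a wall reading 2 m instead of 4 m.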
Is there an issue with how I am saving the depth image? Is there a scale factor or format error?
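To rule out the TIFF-writing path, one sanity check I could do is read a Float32 value straight out of the pixel buffer before saving (a sketch; it assumes the buffer is kCVPixelFormatType_DepthFloat32, which sceneDepth uses):

```swift
import CoreVideo

extension CVPixelBuffer {
    // Read the raw depth value (in meters) at a given row/column,
    // bypassing CIImage and the TIFF encoder entirely.
    func depthValue(row: Int, column: Int) -> Float32? {
        guard CVPixelBufferGetPixelFormatType(self) == kCVPixelFormatType_DepthFloat32 else {
            return nil
        }
        CVPixelBufferLockBaseAddress(self, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(self, .readOnly) }
        guard let base = CVPixelBufferGetBaseAddress(self) else { return nil }
        let bytesPerRow = CVPixelBufferGetBytesPerRow(self)
        let rowPointer = base.advanced(by: row * bytesPerRow)
        return rowPointer.assumingMemoryBound(to: Float32.self)[column]
    }
}
```

If the value read here already disagrees with a tape measure, the problem is upstream of the save; if it matches, the .Lf/TIFF path is the suspect.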
I have a model like this:
@Model
final class MyModel {
    var urlToUnderlyingAsset: URL
}
I'd like to delete the file at the URL whenever a MyModel gets deleted. Is that possible?
My use case is that I'm trying to keep a registry of SCNScenes, but SCNScene is not Codable, so instead I'm storing URLs to USDZ files. If there's an easier way to keep SCNScenes in a database I'm also open to that.
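In case there is no built-in cascade-to-file behavior, one manual approach I've been considering (a sketch; it assumes every deletion of a MyModel is funneled through this hypothetical helper):

```swift
import Foundation
import SwiftData

// Delete the model and its backing USDZ file together.
// Assumption: all deletions go through this helper, so no file is orphaned.
func deleteModelAndAsset(_ model: MyModel, in context: ModelContext) throws {
    let url = model.urlToUnderlyingAsset
    context.delete(model)
    try context.save()
    // Only remove the file once the delete has been persisted.
    if FileManager.default.fileExists(atPath: url.path) {
        try FileManager.default.removeItem(at: url)
    }
}
```

The drawback is that a delete issued anywhere else (e.g. directly on a ModelContext) would leave the file behind, which is why I'm asking whether SwiftData offers a hook for this.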