Yup, same experience - even with a brand new Xcode SwiftUI template project.
I can confirm that "Debug process as root" fixes it for now...
I should add this is with Xcode 13 beta 3.
I've worked out that I need to change init and encode to async functions:
@MainActor final class MyClass: Codable {
    var value: Int

    enum CodingKeys: String, CodingKey {
        case value
    }

    init(from decoder: Decoder) async throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        self.value = try container.decode(Int.self, forKey: .value)
    }

    func encode(to encoder: Encoder) async throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(value, forKey: .value)
    }
}
I've worked out a solution - it's taken me way too long: I feel daft looking back on my first clumsy attempt.
Two of my key mistakes were to put my RecordsViewModel object (which needed to conform to ObservableObject) on the MainActor, and to create an actor type, RecordsModel, to hold the property that needed to be isolated.
Here's the original attempt:
actor RecordsModel: Decodable {
    var records: [Record] = []

    enum CodingKeys: String, CodingKey { case records }

    init() {}

    init(from decoder: Decoder) async throws { ... }

    // Unable to conform to Encodable at present with this implementation
    func encode(to encoder: Encoder) throws { ... }

    func addRecord() -> [Record] {
        self.records.append(Record(value: Int.random(in: 0...10))) // Assume it takes a long time to compute `value`
        return self.records
    }
}
@MainActor
class RecordsViewModel: ObservableObject {
    @Published var records: [Record]
    private let recordsModel: RecordsModel

    init() {
        self.records = []
        self.recordsModel = RecordsModel()
    }

    init(fromRecordsModel recordsModel: RecordsModel) async {
        self.records = await recordsModel.records
        self.recordsModel = recordsModel
    }

    func addRecord() {
        // Given addRecord takes time to complete, we run it in the background
        Task {
            self.records = await recordsModel.addRecord()
        }
    }
}
My new approach doesn't create an actor, but puts a property, isolatedRecords, in the view model, isolated with a global actor. This is complemented by a non-isolated published version, which is updated on the MainActor after any updates to its isolated twin.
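For reference, the global actor itself isn't shown below; a minimal sketch of one, assuming MyActor needs no state of its own, is:

@globalActor
actor MyActor {
    static let shared = MyActor()
}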
Here's the new view model class:
final class Records: ObservableObject, Codable {
    @Published var records: [Record]
    @MyActor private var isolatedRecords: [Record]

    init() {
        self.records = []
        self.isolatedRecords = []
    }

    enum CodingKeys: String, CodingKey { case records }

    init(from decoder: Decoder) throws { ... }

    func encode(to encoder: Encoder) throws { ... }

    @MyActor func append(_ value: Int) -> [Record] {
        self.isolatedRecords.append(Record(value: value))
        return isolatedRecords
    }

    func addRecord() {
        Task {
            let newNumber = Int.random(in: 0...10) // Assume lots of processing here, hence we run it as a Task
            let newRecords = await self.append(newNumber)
            await MainActor.run { self.records = newRecords }
        }
    }
}
This has succeeded in ensuring the code is free of race conditions, whilst keeping the processing and update synchronisation of the records array off the main thread, and it has removed the other issues I encountered, such as trying to conform the actor to Encodable, putting the Document on the MainActor, and more. The code is also more elegant.
I've left the full updated project on GitHub
One thing to note here - tying your whole model to the main actor has consequences: not being able to conform it to Encodable, for one.
A more nuanced approach is to create a global actor, and annotate only the properties that need to be isolated in the view model. I discuss this more here: SwiftUI macOS document app architecture in a concurrent world
Quinn - that's super helpful. Thank you!
Having looked at this and thought about it carefully, I have found that adding Task.yield() solves the issue, by proactively awaiting:
final class NewFileCounter: ObservableObject {
    @Published var fileCount = 0

    func findImagesInFolder(_ folderURL: URL) {
        let fileManager = FileManager.default
        Task.detached {
            var foundFileCount = 0
            let options: FileManager.DirectoryEnumerationOptions = [.skipsHiddenFiles, .skipsPackageDescendants]
            if let enumerator = fileManager.enumerator(at: folderURL, includingPropertiesForKeys: [], options: options) {
                while let _ = enumerator.nextObject() as? URL {
                    foundFileCount += 1
                    await Task.yield()
                    if foundFileCount % 10_000 == 0 {
                        let fileCount = foundFileCount
                        await MainActor.run { self.fileCount = fileCount }
                    }
                }
                let fileCount = foundFileCount
                await MainActor.run { self.fileCount = fileCount }
            }
        }
    }
}
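In case it's useful, this is roughly how I drive the counter from SwiftUI. A minimal sketch; FileCountView and the hard-coded folder URL are just illustrative, not from my project:

import SwiftUI

struct FileCountView: View {
    @StateObject private var counter = NewFileCounter()

    var body: some View {
        Text("Files found: \(counter.fileCount)")
            .onAppear {
                // Illustrative folder URL - substitute your own
                counter.findImagesInFolder(URL(fileURLWithPath: "/tmp", isDirectory: true))
            }
    }
}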
I've resorted to prefixing any console output with a unique string, say peggers, and then filtering on that in the console window. The issue, though, is that I might miss important output from somewhere outside my code that would draw my eye to a problem. But such output is swamped anyway at the moment.
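An alternative I've been considering (a sketch of my own, not something covered in this thread): routing output through os.Logger with a custom subsystem, so Console.app can filter on the subsystem rather than a magic string. The subsystem and category names here are illustrative:

import os

let logger = Logger(subsystem: "com.example.myapp", category: "enumeration")
let foundFileCount = 12_345
logger.debug("Found \(foundFileCount) files so far") // Filter on the subsystem in Console.app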
Is this the same problem that's impacting the following code snippet:
let urls: [URL] = [url1, url2, url3, ... ] // Imagine we have 1,000s of URLs here
await withTaskGroup(of: CGImageSource?.self, returning: [CGImageSource].self) { taskGroup in
    for url in urls {
        // CGImageSourceCreateWithURL returns an optional, hence the optional child-task result
        taskGroup.addTask { CGImageSourceCreateWithURL(url as CFURL, nil) }
    }
    var results = [CGImageSource]()
    for await result in taskGroup {
        if let result = result { results.append(result) }
    }
    return results
}
I don't have a way to await the call to CGImageSourceCreateWithURL, and this code blocks up.
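Would bounding the number of in-flight child tasks help? A sketch of what I have in mind, assuming the blockage comes from saturating the cooperative thread pool with synchronous work (the width of 4 is arbitrary):

let maxConcurrent = 4
let images = await withTaskGroup(of: CGImageSource?.self, returning: [CGImageSource].self) { taskGroup in
    var results = [CGImageSource]()
    var iterator = urls.makeIterator()
    // Seed the group with a fixed number of child tasks
    for _ in 0..<maxConcurrent {
        if let url = iterator.next() {
            taskGroup.addTask { CGImageSourceCreateWithURL(url as CFURL, nil) }
        }
    }
    // Each time a child finishes, start another, keeping at most maxConcurrent in flight
    for await result in taskGroup {
        if let source = result { results.append(source) }
        if let url = iterator.next() {
            taskGroup.addTask { CGImageSourceCreateWithURL(url as CFURL, nil) }
        }
    }
    return results
}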
OK - my bad! I set up a separate project, away from my main code base, to test running multiple FileManager enumerations in parallel (using Tasks), and it all worked... I couldn't provoke the issue no matter how hard I prodded it. Then my eyes flicked back to my main code, a small dim light bulb in my head went on, and four hours later I'd tracked down a somewhat subtle bug in my code.
Thank you for your responses!
I have worked out the issue. On iOS one needs to set up an NSMetadataQuery in addition to using an NSFilePresenter. Something like:
let metadataQuery = NSMetadataQuery()
metadataQuery.notificationBatchingInterval = 1
metadataQuery.searchScopes = [NSMetadataQueryUbiquitousDocumentsScope]
metadataQuery.predicate = NSPredicate(format: "%K LIKE %@", NSMetadataItemFSNameKey, "*.txt")
metadataQuery.start()
And then establish a process to listen for changes, for example using Combine:
NotificationCenter.default.publisher(for: .NSMetadataQueryDidUpdate)
    .receive(on: DispatchQueue.main)
    .sink { [weak self] notification in
        guard let self = self else { return }
        // Pause the query while reading its results
        self.metadataQuery.stop()
        for resultIndex in 0..<self.metadataQuery.resultCount {
            if let resultSet = self.metadataQuery.result(at: resultIndex) as? NSMetadataItem {
                /// DO YOUR MAGIC HERE...
            }
        }
        self.metadataQuery.start()
    }
    .store(in: &backgroundProcesses)
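For completeness: backgroundProcesses above is my store of Combine cancellables, declared along the lines of:

private var backgroundProcesses = Set<AnyCancellable>()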
Thank you! I have a few clarifying questions:
When in kinematic mode, is the ModelEntity ignored for collision detection?
When I add an entity to RealityView's content via .add(), what anchor does this attach it to? If I have an anchor with multiple ModelEntities attached, do they each have to be added to the content to be picked up by the physics engine? When I do this, they ignore the anchor's translation and default to the world origin.
And finally, when simulating something like a bullet or arrow, is it better to use a TriggerVolume component for efficiency?
Hey, thank you! Phil
I am seeing exactly the same. I have a 'LazyVStack' enclosed in a 'ScrollView', in which I can be displaying thousands of elements. There's a definite slowdown. I've pulled my code apart and still can't get an improvement.
Thank you for your reply. I've replicated these steps and have the effect working as described.
Could I ask a clarification though...
This approach requires each model to have its shader graph updated in Reality Composer Pro to give it this capability. I understand that I could create a reusable node graph if I wanted a more complex effect (the colouring was just a simple effect for the sake of the question), yet each entity would still need updating in Reality Composer Pro to wire it in.
Thus, just to be sure: there is no way to create a shader that can be applied to any model entity? I see RealityKit provides CustomMaterial, which can be set up with a shader, but this is not available in visionOS at present?