
Core data high memory usage workaround
I'm using Core Data as part of a disk scan to build a map and total sizes for each folder within a search path. The issue I'm seeing is very high memory usage, because every entity remains in memory, and so far I've not found a way to reduce it.

The data model is this:

The search goes depth first, creating an entity for each folder and file; the size of each file is then added to all of its parent and grandparent folders. Because of the parent-child relationship, every entity seems to remain in memory. I need to keep a reference to the parents of the current folder I'm searching so I can add to their sizes, and I think the children relationship of those then holds references to everything underneath, leading to hundreds of thousands of entities in memory during the scan.

Is there a way to just persist the entities so they're not kept in memory, even if that means fetching a parent folder entity to update its size and then releasing it each time? That may slow the scan a lot, but it would be fine as a way around the memory issue. Or would I need a different model to get around this? I'm currently using the children relationship to navigate the data after the scan, but maybe I could use a searchable path string rather than a relationship to get around the issue?
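For illustration, this is roughly the kind of "fetch, update, release" workaround I have in mind (a sketch only — the entity name `FolderEntity`, the `size` attribute, and the `parent` relationship are placeholders for my actual model):

```swift
import CoreData

// Walk up from a folder, adding `bytes` to each ancestor's size,
// then turn each object back into a fault so it can leave memory.
func addSize(_ bytes: Int64,
             toAncestorsOf folderID: NSManagedObjectID,
             in context: NSManagedObjectContext) throws {
    var currentID: NSManagedObjectID? = folderID
    while let id = currentID {
        // existingObject(with:) fetches from the store if not in memory.
        let folder = try context.existingObject(with: id)
        let size = folder.value(forKey: "size") as? Int64 ?? 0
        folder.setValue(size + bytes, forKey: "size")
        // Remember the parent's ID before refaulting this object.
        currentID = (folder.value(forKey: "parent") as? NSManagedObject)?.objectID
    }
    // Persist the pending size updates, then drop all registered
    // objects from the context so memory doesn't accumulate.
    try context.save()
    context.reset()
}
```

The idea is that holding only `NSManagedObjectID`s (rather than the objects themselves) during the scan, and periodically calling `save()` followed by `reset()`, should keep the number of live entities bounded — at the cost of refetching parents each time.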
Replies: 0 · Boosts: 0 · Views: 335 · Jul ’21
Network framework only receiving first chunk of data
Hey, I've run into an issue using Network.framework: I'm only receiving the first chunk of data. I really can't tell if it's how I'm using the Network.framework bits on the client side, or if it's an issue with the server side (I'm using Vapor). I put together a basic example project that hopefully demonstrates the issue. If you run the server target and then the client target, for me I receive 174 bytes in the first receive; receive is then called again, but nothing happens after that. I'm at a bit of a loss as to why. https://github.com/GP89/TestStaticServer Thanks!
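For reference, the client-side receive pattern I'm following is roughly this (a simplified sketch, not the exact code from the repo — a single `receive` call only delivers one chunk, so the completion handler schedules the next one):

```swift
import Network

// Keep reading chunks until the peer signals completion or an error occurs.
func receiveLoop(on connection: NWConnection, accumulated: Data = Data()) {
    connection.receive(minimumIncompleteLength: 1,
                       maximumLength: 64 * 1024) { data, _, isComplete, error in
        var buffer = accumulated
        if let data = data, !data.isEmpty {
            buffer.append(data)
        }
        if isComplete {
            // The sender closed its side; we have the full payload.
            print("Finished: \(buffer.count) bytes total")
            connection.cancel()
        } else if let error = error {
            print("Receive failed: \(error)")
            connection.cancel()
        } else {
            // Schedule the next read; without this only the first
            // chunk would ever be delivered.
            receiveLoop(on: connection, accumulated: buffer)
        }
    }
}
```

My symptom is that the second `receive` fires but its completion handler never delivers more data, `isComplete`, or an error.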
Replies: 2 · Boosts: 1 · Views: 326 · Aug ’22