The limit on a physical iPad varies by device. The highest I'm aware of is 12GB with the entitlement, 5GB without. That's on an iPad Pro with 16GB of physical RAM. Again, there's no official documentation. That's just what others have reported here and on other sites.
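For anyone who finds this thread later: the entitlement being referred to is, as I understand it, Increased Memory Limit, which is declared in the app's .entitlements file. A minimal fragment (key name from Apple's published entitlements list):

```xml
<!-- Requests a higher per-process memory limit on supported devices.
     The actual ceiling still varies by device and is not documented. -->
<key>com.apple.developer.kernel.increased-memory-limit</key>
<true/>
```

Note that adding the key only raises the cap on devices that support it; it doesn't guarantee any particular number.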
This thread isn't really about optimizing my specific app, it's about the system limits apparently being imposed on iPad-on-Mac apps, which I haven't seen discussed anywhere else.
But to answer your other questions briefly:
It isn't really "my" memory, it's the overhead of the NSOperationQueue plus Swift copy-on-write. My "dataset" is quite small; there just happen to be lots of copies of it floating around in memory as each possible permutation gets explored. The 50GB is a peak; actual memory usage varies considerably during the run as the queue drains and refills and ARC releases the intermediate copies (Swift doesn't have a garbage collector as such).
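To illustrate the mechanism (this is a toy sketch, not my actual solver): each branch of the search copies a value-type state, the copy is cheap until it mutates, and the first mutation forces a full allocation. With an operation queue, many branches can be alive simultaneously, each holding its own full copy at peak.

```swift
// Toy model of copy-on-write amplification in a branching search.
// `board` stands in for the small per-puzzle state; names are made up.
let board = Array(repeating: 0, count: 1_000)

func explore(_ state: [Int], depth: Int) -> Int {
    guard depth > 0 else { return 1 }
    var count = 0
    for move in 0..<3 {
        var branch = state       // cheap: shares storage via COW
        branch[move] = depth     // first mutation copies all 1_000 elements
        count += explore(branch, depth: depth - 1)
    }
    return count
}

let leaves = explore(board, depth: 4)   // 3 branches per level, so 3^4 = 81 leaves
print(leaves)
```

Run concurrently on a queue, each in-flight branch pins its own copy of the state, which is how a tiny dataset balloons to tens of gigabytes at peak.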
Not feasible at all. This is a shipping app that needs to pass App Store Review.
Keep in mind that I'm running a stress test that isn't representative of real-world, human-scale puzzles. Actual puzzles from magazines, the Sunday Times, etc. are solved in under a second using 0.1-0.2MB of memory.
I was just hoping that I could run the stress test on the actual, shipping App Store build using my new M2 MBP that I picked up this week. But apparently not.