I am using DuckDB as an external dependency in my project. The package is basically a Swift wrapper around C++ code.
If I run my app in Debug mode, the library is an order of magnitude slower than when I run it in Release mode. Release mode is really fast, but the build times are too long.
I am a complete beginner with Xcode's build system and was wondering whether there is a way to get the best of both worlds, for example by compiling my SwiftUI code without optimizations while linking it against a static, optimized build of the library.
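To make it concrete, here is a sketch of the kind of setup I am imagining, assuming I keep a local copy of the package so I can edit its manifest (the target layout and flag values below are my guesses, not something I have verified against the real DuckDB manifest):

```swift
// swift-tools-version:5.9
// Sketch of a *local* copy of the dependency's manifest (names are illustrative).
// Idea: keep my app target un-optimized for fast Debug builds, but force
// optimization on the heavy library target even in Debug.
import PackageDescription

let package = Package(
    name: "DuckDB",
    products: [
        .library(name: "DuckDB", targets: ["DuckDB"])
    ],
    targets: [
        .target(
            name: "DuckDB",
            swiftSettings: [
                // Optimize the Swift wrapper even in Debug builds.
                .unsafeFlags(["-O"], .when(configuration: .debug))
            ],
            cxxSettings: [
                // Optimize the underlying C++ sources even in Debug builds.
                .unsafeFlags(["-O2"], .when(configuration: .debug))
            ]
        )
    ]
)
```

From what I have read, `unsafeFlags` is only accepted for packages referenced locally or by branch, not through a versioned remote dependency, which is why I mention keeping a local copy.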
I am trying to use the DuckDB library in a SwiftUI app. It does not seem to be able to allocate as much memory as is available to it, and I was wondering whether there is a setting I can tweak. I've also logged the issue in their GitHub repo with full details.
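For reference, this is the kind of setting I have been experimenting with (just a sketch; I am assuming the Swift client lets me run arbitrary SQL through `query`, and the 8GB value is only an example):

```swift
import DuckDB

// Sketch: raising DuckDB's memory budget via its SQL-level setting.
// Assumes the Swift client's Database/Connection/query API as shown in
// the examples I have seen; the limit value is illustrative.
let database = try Database(store: .inMemory)
let connection = try database.connect()

// DuckDB caps its own memory usage; the limit can be changed at runtime.
_ = try connection.query("SET memory_limit = '8GB'")
```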
I am able to create a custom node graph by selecting nodes and then choosing the "Compose Node Graph" option in the context menu. After that, when I select my custom node graph, I see in the top-right panel that it is possible to define inputs and outputs. However, I was not able to figure out how to link those to the inputs and outputs in the underlying nodes.
I am new to the graph editor and have been able to achieve some results. However, my graphs are getting very tangled, confusing, and hard to debug, so I was wondering:
1. Is it possible to define variables that store the result of a computation and refer to them in other parts of the graph, without having to link them graphically? This would help tidy up the tangled mess I have created. In the "Explore materials in Reality Composer Pro" video I saw that it is possible to create "instances", but I am not sure whether that is what I need. For example, does the shader compiler optimize them so that each instance does not need to be recomputed?
2. Is there any functionality to debug the graph, for example by trying inputs and seeing what the numeric outputs would be?
If I create a visionOS app project, Xcode automatically creates a RealityKitContent package. However, if I create a Multiplatform project (for visionOS, macOS, and iOS), the package is not added. Is it possible to add it manually? How? (I am using Xcode 16.1 beta 2.)
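For context, this is roughly what I would try to recreate by hand, based on what the auto-generated package looks like in a visionOS-only project (a sketch from memory; the platform versions are my guesses):

```swift
// swift-tools-version:6.0
// Sketch of a manually created RealityKitContent package manifest
// (reconstructed from memory; platform versions are assumptions).
import PackageDescription

let package = Package(
    name: "RealityKitContent",
    platforms: [
        .visionOS(.v2),
        .macOS(.v15),
        .iOS(.v18)
    ],
    products: [
        .library(name: "RealityKitContent", targets: ["RealityKitContent"])
    ],
    targets: [
        .target(name: "RealityKitContent")
    ]
)
```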
I would like to code some RealityViews to run on my Mac first (and then incorporate them in a visionOS project) so that my code/test loop is faster, but I have not been able to find a simple example that supports Mac.
Is it possible to have volumes on a Mac? Is there support for using a game controller to move around the RealityView, like in the visionOS simulator?
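This is the simplest thing I have been trying (a minimal sketch, assuming RealityView is available on macOS 15; the view name is arbitrary and the virtual-camera line is my assumption for getting a freely movable camera):

```swift
import SwiftUI
import RealityKit

// Minimal sketch of a RealityView intended to run on macOS.
struct SpinningSphereView: View {
    var body: some View {
        RealityView { content in
            // On macOS/iOS the content is camera-based rather than volumetric;
            // I am assuming a virtual camera is the right choice here.
            content.camera = .virtual

            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)]
            )
            content.add(sphere)
        }
    }
}
```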