My error here was in how I created the Signposter. To get an interval to show up in the Points of Interest track in the profiler, you need to create it with the preset category .pointsOfInterest. For example, instead of:
signposter = OSSignposter(logger: logger)
I should have created it with:
signposter = OSSignposter(subsystem: "MySubsystem", category: .pointsOfInterest)
(Thank you, James Dempsey), who also pointed out that I could add the separate os_signpost instrument to see the signposts I'd created with the original method - they just didn't show up in the Points of Interest track of the Time Profiler.
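For anyone landing here later, here's a minimal sketch of the working setup (the subsystem string and the loadData function are placeholders, not from my actual project):

import os

// Created with the built-in Points of Interest category, so intervals show up
// in the Points of Interest track in Instruments.
let signposter = OSSignposter(subsystem: "MySubsystem", category: .pointsOfInterest)

func loadData() {
    // Mark an interval around the work you want to see in the profile.
    let state = signposter.beginInterval("loadData")
    defer { signposter.endInterval("loadData", state) }
    // ... the work being measured ...
}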
I was just hitting the same issue recently, found this, and noted a possible solution.
Using Xcode 15.4, when passing the -derivedDataPath argument, don't specify the value with an = sign.
For example:
mkdir -p myDD
xcodebuild -scheme MyApp -derivedDataPath ./myDD test
worked as I hoped it might
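For contrast, the form with the value attached by an = (which is what I'd guess was tripping things up) would be:

xcodebuild -scheme MyApp -derivedDataPath=./myDD test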
It seems to be more related to a regression or change in Xcode 16 beta 5 vs Xcode 15.4 than that. In Xcode 15.4, it works perfectly, regardless of the name of the folder.
When I rename the file and compile in the Xcode 16 beta, using either the new-format Xcode project or the old one, it doesn't appear to load correctly.
I've updated the sample project to change the name of the directory, but it doesn't appear to make a difference, and the product built by Xcode 16 still fails to include the resource and present it.
There appear to be multiple issues at play. After renaming the directory to prairie.skybox for this example, when the older Xcode project is built and run from Xcode 16, a different error is reported - although the resource is correctly embedded:
Unable to load resource: Reality File version 11 is not supported. (Latest supported version is 9.)
It is all working - regardless of the name of the skybox directory - when built and run from Xcode 15.4.
After putting together the project, and thinking "Oh, I should try the released Xcode in case this is a regression...", I found that it is, in fact, a regression in Xcode.
I've updated the sample project - which now has a separate project built with Xcode 15.4, alongside the original which I started with Xcode 16b5 (and doesn't open in 15.4 because of that). When the same code and setup is built and run with Xcode 15.4, it works perfectly.
The project created in Xcode 15.4 compiles and runs correctly when built with Xcode 16 beta 5, but the project created in Xcode 16 beta 5 doesn't set up the correct structure or something, and doesn't embed the resource. (I've submitted feedback with the details: FB14860256)
Filed FB14532217
Based on what I read in https://developer.apple.com/documentation/RealityKit/validating-usd-files, the APIs that Apple provides don't (currently?) include any support for writing out USD files, and specifically not for variants - the tooling of the various renderers lets you select between variants, but it expects the variants to already be defined within the USD file.
There is an Entity.write(to: URL) capability that is new this year, but it creates a proprietary .reality format that isn't a stock USD(a/z) file. Based on my digging, the .reality file is possibly a superset of USD, because when it was created, the analogs to capabilities needed in RealityKit didn't exist in the USD spec - which may (I suspect) still be the case today.
I haven't yet pulled down the betas to explore the new RealityView for macOS and iOS, but I previously created https://github.com/heckj/CameraControlARView for pretty much this kind of purpose. The views have been exposed on macOS to let you render things, but YOU have to set up and control the camera.
Even my little library doesn't have the game controller support built into it, but if you dig through the code a smidge, you'll see where it could be added - so I fully expect that the newer RealityView setup would allow the same. You'll need to provide the game controller inputs to control the camera positioning, rotation, etc.
(https://github.com/heckj/CameraControlARView was knocked together initially to render RealityKit scenes on macOS so that I could rotate and take image snapshots of the rotation to build into simple animated gifs, in order to add that into some online documentation)
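To give a flavor of what I mean by feeding game controller input into the camera, here's a rough sketch - none of this is in the library, and the cameraEntity and the sensitivity values are just illustrative:

import GameController
import RealityKit
import simd

// Rough sketch: pan and rotate a camera entity from thumbstick input.
func wireUpController(to cameraEntity: Entity) {
    NotificationCenter.default.addObserver(
        forName: .GCControllerDidConnect, object: nil, queue: .main
    ) { note in
        guard let controller = note.object as? GCController,
              let gamepad = controller.extendedGamepad else { return }
        // Left thumbstick pans the camera in the X/Z plane.
        gamepad.leftThumbstick.valueChangedHandler = { _, x, y in
            cameraEntity.position += SIMD3<Float>(x, 0, -y) * 0.05
        }
        // Right thumbstick rotates the camera around the Y axis.
        gamepad.rightThumbstick.valueChangedHandler = { _, x, _ in
            cameraEntity.orientation *= simd_quatf(angle: -x * 0.05, axis: [0, 1, 0])
        }
    }
}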
SceneKit hasn't been deprecated, so I'm sure the official response would be along the lines of "it's up to you", but RealityKit is (from what I'm watching) where the updates are coming in and what's getting used. RealityView is now also available on macOS (although I haven't downloaded the new beta to try it out) - setting up a RealityKit view on macOS was previously a bit of a challenge. That said, this year brings even more options for programmatic (or procedural) generation of meshes and textures that should make visualizations like you describe far more accessible.
Look into LowLevelMesh for the raw mesh generation types. There are equivalent bits for textures.
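I haven't dug into LowLevelMesh far enough to be confident in the exact shape of its API, but for a sense of what procedural generation looks like, here's the older MeshDescriptor path that's been around for a few releases - building a one-triangle mesh from raw positions and indices:

import RealityKit

// Build a one-triangle mesh procedurally with MeshDescriptor.
var descriptor = MeshDescriptor(name: "triangle")
descriptor.positions = MeshBuffers.Positions([
    [0, 0, 0], [1, 0, 0], [0, 1, 0]
])
descriptor.primitives = .triangles([0, 1, 2])

let mesh = try MeshResource.generate(from: [descriptor])
let entity = ModelEntity(mesh: mesh, materials: [SimpleMaterial()])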
Your question about the sample code kicked me in the right direction. The error message is a complete red herring. I was also doing some fiddling to make my peer-to-peer connection instance async/await, and in the listener setup I did something that didn't fully enable the state update handler, so I never got the message that it had come live - and I'd based a bunch of work off that. I reworked the code back to using a closure for capturing and reacting to that state, and that sorted the issue entirely.
I also learned that it's apparently acceptable to queue a single call to receive() even before the state is ready; that callback just won't be invoked until the state is ready. I wasn't 100% sure of the constraints Network.NWConnection expects between registering send() and/or receive() callbacks and the state reported by the framework. After I backed off the idea of "wait until it's ready before doing ANYTHING" with async calls, the whole thing worked a lot more smoothly.
(I had been setting up an AsyncStream with a continuation to handle the state update callbacks, but it ended up being a lot more ceremony without a whole lot of value. I do still want to wrangle something in there that's async-aware at a higher level to allow initiated connections to retry on failure, but I'm in a way better space for that now.)
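For what it's worth, the closure-based shape I went back to looks roughly like this - a minimal sketch, not my actual project code:

import Network

let listener = try NWListener(using: .tcp)
// React to listener state through the closure rather than awaiting it.
listener.stateUpdateHandler = { state in
    switch state {
    case .ready:
        print("listener ready on port \(String(describing: listener.port))")
    case .failed(let error):
        print("listener failed: \(error)")
    default:
        break
    }
}
listener.newConnectionHandler = { connection in
    connection.start(queue: .main)
    // Queuing a receive before .ready is fine; the callback just waits for readiness.
    connection.receive(minimumIncompleteLength: 1, maximumLength: 65536) { data, _, _, _ in
        print("received \(data?.count ?? 0) bytes")
    }
}
listener.start(queue: .main)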
Thanks Quinn,
Yeah, I killed the watchOS companion bits and ran it without issue. I'll go through with a fine-toothed comb and see if I can spot the (any) difference(s) in how it's set up vs anything I might have incorrectly changed.
The logs from the app also show that same error - so I'm beginning to think the message Unable to extract cached certificates from the SSL_SESSION object is a red herring:
boringssl_session_set_peer_verification_state_from_session(448) [C1.1.1.1:2][0x106208510] Unable to extract cached certificates from the SSL_SESSION object
[C1 connected joe._tictactoe._tcp.local. tcp, tls, attribution: developer, path satisfied (Path is satisfied), viable, interface: en8] established
The effect is similar because the two types are closely related. RealityView is a top-level view that renders 3D content (using the RealityKit renderer) and provides RealityViewContent as "context" - a means to manipulate any of the entities: adding, removing, updating them, etc. If you use RealityView by itself, you're responsible for loading models, situating the entities so they work as you expect, and so on.
Model3D, by comparison, reads as though it's meant to be the "fast path" to getting a model loaded and visible, without providing all the open context to manipulate the RealityKit renderer and entities. Model3D doesn't expose the context and doesn't provide a path to adjust, manipulate, and so on - any of the models you're loading and displaying.
The article Adding 3D content to your app covers both of them with snippets that might make it easier to see in context.
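A minimal sketch of the difference, on a platform where both views are available (the "toy_robot" asset name is just a placeholder):

import SwiftUI
import RealityKit

// Model3D: the "fast path" - load a named model and show it, no entity access.
struct QuickLook: View {
    var body: some View {
        Model3D(named: "toy_robot") { model in
            model
                .resizable()
                .aspectRatio(contentMode: .fit)
        } placeholder: {
            ProgressView()
        }
    }
}

// RealityView: you load, position, and manage the entities through the content yourself.
struct FullScene: View {
    var body: some View {
        RealityView { content in
            if let robot = try? await Entity(named: "toy_robot") {
                robot.position = [0, 0, -0.5]
                content.add(robot)
            }
        }
    }
}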
RealityKit "runs" on macOS, but it doesn't use the camera for any input - you have to create your own virtual camera and arrange all the positioning yourself. I made a SwiftUI view wrapped (for macOS) that lets me see RealityKit models and representations that you're welcome to dig through and take what you'd like from it: https://github.com/heckj/CameraControlARView - although if you use it as a package, make sure to use the release version, because I've got it somewhat torn apart working on adding some different camera motion modes to the original creation.
I've been hitting the same questions and quandaries - and like @ThiloJaeggi, the only consistent pattern I've managed is recreating shader effects in Reality Composer Pro and applying them there. Annoying AF after the effort expended in Blender's shader nodes, but I do get the complications with export.
Related to that, there is work pending to export Blender shader nodes (or more specifically, some subset of them) as MaterialX, but as far as I can see it's stalled in discussions of how to handle this inside Blender when it comes to their internal renderers (Eevee and Cycles), which currently don't have MaterialX support.
I’m just starting to slowly increment through nodes to experiment with what can and can’t be exported, as I’d really prefer to use Blender as my “DCC” tool of choice.
I found a solution, although I had to go tweaking my original data to enable it.
I switched the percentiles reported out of my data structure so that they represent 1.0 - the_original_percentile, to get numbers into a log range that could be represented. Following that, the key was figuring out chartXScale and chartXAxis with a closure to tweak the values presented as labels using AxisValueLabel.
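The inversion itself is trivial - the Converter type used in the snippet below isn't shown in full, but the transform it does boils down to something like this (treating the histogram as an array of (percentile, value) tuples, which is an assumption on my part):

// Hypothetical sketch: each (percentile, value) pair becomes (1 - percentile, value),
// so 0.99 becomes 0.01, 0.999 becomes 0.001, and the tail spreads out on a log scale.
enum Converter {
    static func invertedPercentileArray(_ histogram: [(Double, Double)]) -> [(Double, Double)] {
        histogram.map { (1.0 - $0.0, $0.1) }
    }
}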
For anyone else trying to make a service latency review chart and wanting to view the outliers at wacky high percentiles, here's the gist:
Chart {
    ForEach(Converter.invertedPercentileArray(histogram), id: \.0) { stat in
        LineMark(
            x: .value("percentile", stat.0),
            y: .value("value", stat.1)
        )
        // Use curved line to join points
        .interpolationMethod(.monotone)
    }
}
.chartXScale(
    domain: .automatic(includesZero: false, reversed: true),
    type: .log
)
.chartXAxis {
    AxisMarks(values: Converter.invertedPercentileArray(histogram).map { $0.0 }) { value in
        AxisGridLine()
        AxisValueLabel(centered: true, anchor: .top) {
            if let invertedPercentile = value.as(Double.self) {
                Text("\(((1 - invertedPercentile) * 100.0).formatted(.number.precision(.significantDigits(1...5))))")
            }
        }
    }
}
Of note: at least in Xcode Version 14.3 beta 2 (14E5207e), using reversed: true in the ScaleDomain caused the simulator to crash (filed as FB12035575). In a macOS app, there wasn't any issue.
Resulting chart:
Version 22.08 of the USD tools dropped with explicit native M1 support, although there are some quirks if you're installing it over a version of Python that was set up and installed using Conda.
I wrote a blog post with the details at https://rhonabwy.com/2022/08/16/native-support-for-usd-tools-on-an-m1-mac/, but the core of the issue is that an additional build argument is needed - PXR_PY_UNDEFINED_DYNAMIC_LOOKUP=ON - which you can pass through their build tooling:
python build_scripts/build_usd.py /opt/local/USD --build-args USD,"-DPXR_PY_UNDEFINED_DYNAMIC_LOOKUP=ON"
The details, and a very helpful explanation of why this is required, are covered in the USD GitHub issue https://github.com/PixarAnimationStudios/USD/issues/1996