Posts

0 Replies · 117 Views
I have an app that needs to be installed as a folder along with a few support files. So I've defined a package component that installs itself somewhat like so:

/Applications/MyApp/MyApp.app
/Applications/MyApp/Foo
/Applications/MyApp/Bar

The application requires those other files to be in place to operate. To make the installer requirement hurt a bit less in a world of drag-and-drop, single-bundle Mac apps, I'd like to support relocation so the user can move this folder wherever they like. However, if I enable relocation, only the application bundle is relocated; the sidecar files continue installing to /Applications/MyApp. Is there any way to get the companion files to relocate along with the application bundle?
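
For what it's worth, the app-side lookup isn't the issue. A minimal sketch, assuming the app resolves the support files as siblings of its own bundle (Foo and Bar stand in for the real files):

```swift
import Foundation

// Resolve the support files relative to the app bundle's parent
// directory, so lookups survive the user moving the enclosing folder.
// "Foo" and "Bar" are placeholders for the actual support files.
let containerURL = Bundle.main.bundleURL.deletingLastPathComponent()
let fooURL = containerURL.appendingPathComponent("Foo")
let barURL = containerURL.appendingPathComponent("Bar")
```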

0 Replies · 99 Views
I'm loading USD content through RealityKit on visionOS. I'm computing the bounds of the model and increasing the view and window size to match the USD's size, but larger models are getting cut off. Through experimentation, it seems like volumetric windows have a maximum size of 2 meters in any dimension. This seems reasonable, but is it expected behavior? If so, is it documented anywhere so I can guard against this case? I just want to make sure this is indeed true so I can stop troubleshooting my view hierarchy.
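
For context, the sizing logic looks roughly like this. This is a minimal sketch: the asset name "Model" is a placeholder, and the 2-meter cap is just my observed limit encoded as a constant, not a documented API.

```swift
import SwiftUI
import RealityKit

@main
struct USDViewerApp: App {
    var body: some Scene {
        WindowGroup(id: "usdVolume") {
            USDVolumeView()
        }
        .windowStyle(.volumetric)
        // Requesting anything larger than ~2 m per dimension appears
        // to get silently clamped, cutting off larger models.
        .defaultSize(width: 2.0, height: 2.0, depth: 2.0, in: .meters)
    }
}

struct USDVolumeView: View {
    // Observed (undocumented) per-dimension ceiling for volumetric windows.
    private let maxVolumeExtent: Float = 2.0

    var body: some View {
        RealityView { content in
            // "Model" is a placeholder for the actual USD asset.
            guard let entity = try? await Entity(named: "Model") else { return }

            // Measure the model and scale it down if any dimension
            // would exceed the apparent window size cap.
            let extents = entity.visualBounds(relativeTo: nil).extents
            let largest = max(extents.x, extents.y, extents.z)
            if largest > maxVolumeExtent {
                entity.scale *= SIMD3<Float>(repeating: maxVolumeExtent / largest)
            }
            content.add(entity)
        }
    }
}
```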

1 Reply · 1.3k Views
Currently we have our own home-grown Xcode continuous integration farm with Intel and Apple Silicon Mac minis. We've been piloting Xcode Cloud but have found much slower build times than we expected. We understand cloud services won't be as responsive as our dedicated Macs, but the times are much higher than expected. Most of our projects are Swift, but we have one React Native project that takes 40 minutes to build on Xcode Cloud. Dependencies (like CocoaPods and npm) seem to be a build-time issue for that project specifically, and we're worried about how that could affect even our pure Swift projects with SPM dependencies. Build times in general also seem slow.

The real issue, besides the speed, is that we're paying per hour for Xcode Cloud. Even if we decided to accept the slower build times as a trade-off for not having to maintain our own Macs, it seems like we'd quickly run out of time even on the highest-tier plan. Is there any plan to add more machines or bandwidth to the Xcode Cloud backend? Even our React Native app (which seems to be the worst case) takes less than 10 minutes to build end to end locally.

0 Replies · 393 Views
According to the Metal Best Practices Guide, on Mac: "If the texture is initialized once by the CPU and accessed frequently by the GPU, initialize a source texture with a Managed mode and then blit its data into a destination texture with a Private mode. This is a common case for static textures, such as diffuse maps."

This would seem to be the best practice for things like material textures in a game that are loaded once and then used exclusively by the GPU. However, the guide isn't specific about how Apple Silicon should be treated. It does say: "Some macOS devices feature integrated GPUs. In these devices, the driver optimizes the underlying architecture to support a discrete memory model. macOS Metal apps should always target a discrete memory model."

The iOS guide still mentions private textures, but it does not go into detail and uses less prescriptive language. I'm basically unsure whether Apple Silicon Macs should follow the iOS performance guide or the macOS one. The guide was also last updated in 2017, implying that it may not have been revised for Apple Silicon.

I'm not quite sure what the best path is for static texture resources in games. Apple Silicon has a single address space, which I assume would reduce the penalty for using a shared resource. However, private resources can be optimized for GPU use during a blit; it's been hinted that things like textures might be compressed in ways they typically couldn't be if CPU access needed to be maintained.

It's very possible that the guide hasn't been updated because the guidance is unchanged. But I wanted to check in, since I'd assume iOS and macOS on Apple Silicon should be similar.
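
For concreteness, the pattern I'm asking about is the guide's managed-to-private blit, roughly like this (a minimal sketch; the pixel format, data layout, and helper name are just illustrative):

```swift
import Metal

// Populate a managed (CPU-visible) staging texture once, then blit it
// into a private (GPU-only) texture for long-lived use. Whether this
// still pays off on Apple Silicon is exactly the open question.
func makePrivateTexture(device: MTLDevice,
                        queue: MTLCommandQueue,
                        pixels: [UInt8],
                        width: Int,
                        height: Int) -> MTLTexture? {
    let descriptor = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .rgba8Unorm, width: width, height: height, mipmapped: false)

    // CPU-initialized staging texture (managed is macOS-only).
    descriptor.storageMode = .managed
    guard let staging = device.makeTexture(descriptor: descriptor) else { return nil }
    pixels.withUnsafeBytes { bytes in
        staging.replace(region: MTLRegionMake2D(0, 0, width, height),
                        mipmapLevel: 0,
                        withBytes: bytes.baseAddress!,
                        bytesPerRow: width * 4)
    }

    // GPU-only destination texture, filled by a blit pass.
    descriptor.storageMode = .private
    guard let destination = device.makeTexture(descriptor: descriptor),
          let commandBuffer = queue.makeCommandBuffer(),
          let blit = commandBuffer.makeBlitCommandEncoder() else { return nil }
    blit.copy(from: staging, to: destination)
    blit.endEncoding()
    commandBuffer.commit()
    return destination
}
```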

0 Replies · 721 Views
I'm working on a ReplayKit Upload Broadcast Extension on macOS 11. I've added the Broadcast Extension to the host Mac app, but I can't get the system or the host app to recognize that the Broadcast Extension exists. The picker menu just says "No Broadcast Extension Installed." I've tried a few things that have normally kickstarted plug-ins in the past, like copying the app to the Applications folder or double-clicking on the plug-in itself, but nothing is working. Is there a trick to getting the system to register a Broadcast Extension? Maybe a command-line tool, like I've seen with Spotlight extensions? Thanks!
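
For reference, here's roughly how I'm presenting the picker (a minimal sketch; the extension's bundle identifier is a placeholder for my real one):

```swift
import AppKit
import ReplayKit

// Show the system broadcast picker from the host app's window.
// This is where "No Broadcast Extension Installed" appears.
func showBroadcastPicker(in window: NSWindow) {
    RPBroadcastActivityController.showBroadcastPicker(
        at: CGPoint(x: 0, y: 0),
        from: window,
        preferredExtension: "com.example.MyApp.BroadcastExtension") { controller, error in
            if let error = error {
                print("Broadcast picker failed: \(error)")
            }
        }
}
```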