Posts

Post not yet marked as solved
1 Reply
222 Views
I'm in Europe, where Vision Pro isn't available yet. I'm a developer / designer, and I want to find out whether it's worthwhile to try to sell the idea of investing in a bunch of Vision Pro devices, plus app development for them, to the people overseeing the budget for a project I'm part of. The project is broadly in an "industry" setting where several constraints apply, most of them around security and safety. So far, all the Vision Pro discussion I've seen is about consumer-level media consumption and tippy-tappy-app stuff for a broad user base. Now, the hardware, the OS features, and the SDK definitely look like professional niche use cases are possible. But some features, such as SharePlay, will for example require an Apple ID and an internet connection (I guess?). That, for example, is a strict nope in my case, for security reasons. I'd like to start a discussion of what works and what doesn't work, outside the realm of watching Disney+ in your condo.

Potentially, this device ticks several boxes with regard to incredibly useful features in general:

- very good indoor tracking
- pass-through with good fidelity
- hands-free operation

The first point especially is kind of a really big deal, and for me, the biggest open question. I have multiple make-or-break questions with regard to this. (These features are not available in the simulator.)

For the sake of argument, let's say the app I'm building is Cave Mapper. It's meant to be used by archeologists inside a cave system where we have no internet, no reliable compass, and no GPS. We do have a local network that we can carry around, though, and we can also bring lights. One feature of the app is to build out a catalog of cave paintings and store them in a database. The archeologist wants to walk around, look at a cave painting, and tap on it to capture its position relative to the cave entrance.

The next day, another archeologist may work inside the same cave, and they would want synchronised access to the same spatial data from the day before. For that:

- How good, precise, reliable, and stable is the indoor tracking really? Hyped reviewers said it's rock solid; others have said it can drift.
- How well do the persistent WorldAnchor objects work? How well do they work when you're in a concrete bunker or a cave, without GPS?
- Can I somehow share a world anchor with another user? Is it possible to sync the ARKit map that one device has built with another device?
- Any other showstoppers?
- In case you cannot share your mapped world or world anchors: how solid is the tracking of an ImageAnchor (which we could physically nail to the cave entrance to use as a shared positional / rotational reference)?

Other, practical stuff:

- Can you wear Vision Pro with a safety helmet?
- Does it work with gloves?
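To make the ImageAnchor fallback concrete, here's a rough sketch of what I have in mind, assuming visionOS ARKit's ImageTrackingProvider works the way the docs suggest. I can't verify any of this without hardware, and names like "CaveMarkers" are made up:

```swift
import ARKit
import simd

// Idea: nail a printed marker to the cave entrance, detect it as an
// ImageAnchor, and store every painting's pose *relative to the marker*
// instead of relative to the session's arbitrary world origin. Any device
// that sees the marker could then reconstruct the shared coordinate frame.
let session = ARKitSession()
let imageTracking = ImageTrackingProvider(
    referenceImages: ReferenceImage.loadReferenceImages(inGroupNamed: "CaveMarkers"))

// marker_T_painting = (world_T_marker)⁻¹ * world_T_painting
func paintingPose(relativeTo marker: ImageAnchor,
                  worldTransform: simd_float4x4) -> simd_float4x4 {
    marker.originFromAnchorTransform.inverse * worldTransform
}
```

The marker-relative 4×4 transforms would be plain data, so they could be synced over our local network without any Apple ID involved — provided the tracking of the marker itself is solid, which is exactly my question.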
Posted by jpenca. Last updated.
Post not yet marked as solved
12 Replies
1.1k Views
hello, when I do Metal drawing into an MTKView in full screen, there's an issue with frame scheduling, it seems. There is visible stutter, and the Metal HUD shows the frame rate jittering about. This happens in Ventura and Sonoma b2 on my MacBook Pro. Here's a really minimal example — not even actively drawing anything, just presenting the drawable:

```objc
#import "ViewController.h"
#import <Metal/Metal.h>
#import <MetalKit/MetalKit.h>

@interface ViewController () <MTKViewDelegate>
@property (nonatomic, weak) IBOutlet MTKView *mtkView;
@property (nonatomic) id<MTLDevice> device;
@property (nonatomic) id<MTLCommandQueue> commandQueue;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    _device = MTLCreateSystemDefaultDevice();
    _mtkView.device = _device;
    _mtkView.delegate = self;
    _commandQueue = [_device newCommandQueue];
}

- (void)drawInMTKView:(MTKView *)view {
    MTLRenderPassDescriptor *viewRPD = view.currentRenderPassDescriptor;
    if (viewRPD) {
        id<MTLCommandBuffer> commandBuffer = [_commandQueue commandBuffer];
        [commandBuffer presentDrawable:view.currentDrawable];
        [commandBuffer commit];
    }
}

- (void)mtkView:(MTKView *)view drawableSizeWillChange:(CGSize)size {
    NSLog(@"%@", NSStringFromSize(size));
}

@end
```

It looks like there's some collision between the display and render timers, or something. What gives? I would like to be able to render stutter-free on this very nice machine — how would I go about that?
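One thing I've been meaning to try (a sketch only, not a confirmed fix): pacing presentation explicitly with the minimum-duration variant of presentDrawable, so the drawable isn't handed to the display as early as possible. In Swift, roughly:

```swift
import MetalKit

let device = MTLCreateSystemDefaultDevice()!
let commandQueue = device.makeCommandQueue()!

// Sketch: present each drawable with a minimum on-screen duration derived
// from the view's preferred frame rate, instead of presenting immediately.
// This is supposed to smooth out scheduling jitter; untested on my side.
func draw(in view: MTKView) {
    guard view.currentRenderPassDescriptor != nil,
          let drawable = view.currentDrawable,
          let commandBuffer = commandQueue.makeCommandBuffer() else { return }
    // … encode render passes here …
    commandBuffer.present(drawable,
                          afterMinimumDuration: 1.0 / Double(view.preferredFramesPerSecond))
    commandBuffer.commit()
}
```

No idea yet whether this actually helps with the fullscreen case, so take it as an experiment, not an answer.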
Posted by jpenca. Last updated.
Post not yet marked as solved
0 Replies
1.1k Views
I'm trying to use the new Object Capture API with my Intel machine that has a big fat external GPU. Unfortunately, the API does not allow GPU selection. Now, you used to be able to hit CMD+I on an app and check "Prefer external GPU" — this seems to have been a working workaround for PeterKolski for this exact case: https://developer.apple.com/forums/thread/681861?answerId=677919022 (scroll down to PeterKolski's comment — this site does not allow linking to that answer directly, for reasons). Now, what happened to that "Prefer external GPU" checkbox? I don't see it. It was there in Big Sur, but not in Monterey. How do I tell an app to prefer the external GPU in macOS 12? I can't find anything about this change in the beta release notes. Thanks
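This doesn't help with Object Capture itself (which takes no device parameter), but for code paths that do accept an MTLDevice, picking the eGPU programmatically should be possible — a sketch, assuming isRemovable still flags external GPUs:

```swift
import Metal

// MTLCopyAllDevices() lists every GPU on macOS; `isRemovable` is supposed
// to identify eGPUs. Fall back to any discrete (non-low-power) device,
// then to the system default.
let devices = MTLCopyAllDevices()
let preferred = devices.first { $0.isRemovable }
    ?? devices.first { !$0.isLowPower }
    ?? MTLCreateSystemDefaultDevice()
print(preferred?.name ?? "no Metal device found")
```

That still leaves my actual question open: how to steer an app you don't control, now that the checkbox is gone.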
Posted by jpenca. Last updated.
Post marked as solved
1 Reply
2.1k Views
Hi,

Xcode's Metal shader debugger hasn't been working for me for a while now. I think it started around the Catalina beta, then it worked for a day or two at some point, and then again not. It is simply dysfunctional and I have no clue what's up with that.

Xcode can capture frames, show texture data, command buffer graphs etc., but when you proceed to debug a shader (click the bug icon somewhere) it pops an error:

Error DYPShaderDebuggerErrorDomain:1: Unable to find library source. Library source required. Under the target's Build Settings, ensure the Metal Compiler Build Options produces debugging information and includes source code. If building with the 'metal' command line tool, add the options '-gline-tables-only' and '-MO' to your compilation step.

(can't attach a screenshot here, does this forum run on iOS 13?)

The error popup suggests making sure the Metal debugging flags are enabled in the target's build settings; of course these are enabled (they are enabled by default). It doesn't matter which project I open, including Metal sample code from Apple.

Is this maybe just my machine? Some stray config files somewhere, breaking things? Does the Metal shader debugger work for you? What gives?

Mac mini 2018, latest macOS / Xcode etc.
Posted by jpenca. Last updated.
Post not yet marked as solved
1 Reply
800 Views
Hi,

Since a recent beta build (currently Xcode 11A1027 and macOS 19A578c), Xcode is not able to start Metal debugger sessions. The error says it cannot find the Metal library source files:

Error DYPShaderDebuggerErrorDomain:1: Unable to find library source

Is this just me? I've tried with a trivial project which has only one .metal file, and I'm also seeing it with Apple demo code (hello triangle). I've tried adding the suggested '-gline-tables-only' and '-MO' Metal compiler flags, to no effect.
Posted by jpenca. Last updated.