This is still broken even in macOS 14.5.
The initial read of batteryLevel seems fine, but when it's read from a notification handler it fails and returns 0, so our app then shows the low-battery indicator. I'm running the iOS app on macOS on an M2 Max.
Error retrieving battery status: result=-536870207 percent=-2093318143 hasExternalConnected=1 isCharging=5 isFullyCharged=184
p [[UIDevice currentDevice] batteryLevel]
(float) 0
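For anyone hitting the same thing, here's a minimal sketch of the guard we're considering, assuming the level is read inside a UIDeviceBatteryLevelDidChangeNotification handler. The BatteryMonitor class and lastGoodLevel cache are hypothetical names, not API.

#import <UIKit/UIKit.h>

// Hypothetical helper: caches the last plausible batteryLevel so a bogus 0
// from the notification path doesn't trip the low-battery UI.
@interface BatteryMonitor : NSObject
@property (nonatomic) float lastGoodLevel; // 0..1, or -1 when unknown
@end

@implementation BatteryMonitor

- (instancetype)init {
    if ((self = [super init])) {
        _lastGoodLevel = -1.0f;
        [UIDevice currentDevice].batteryMonitoringEnabled = YES;
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(batteryLevelDidChange:)
                                                     name:UIDeviceBatteryLevelDidChangeNotification
                                                   object:nil];
    }
    return self;
}

- (void)batteryLevelDidChange:(NSNotification *)note {
    float level = [UIDevice currentDevice].batteryLevel;
    // -1 means "unknown"; 0 from this path looks like the same failure when
    // running the iOS app on macOS, so keep the previous known-good value.
    if (level > 0.0f) {
        self.lastGoodLevel = level;
    }
}

@end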
iPad apps, even if you change the size of the window, just rescale to the window (still at full resolution). You have to require iOS 13 and jump through some hoops to allow the application window to actually resize (and not always render at full resolution). But that's iPadOS on macOS, not iPhone.
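Roughly, the hoops look like this. A sketch only, assuming the UIScene lifecycle (iOS 13+) with UIRequiresFullScreen turned off; the size values are placeholders.

#import <UIKit/UIKit.h>

@interface SceneDelegate : UIResponder <UIWindowSceneDelegate>
@property (nonatomic, strong) UIWindow *window;
@end

@implementation SceneDelegate

- (void)scene:(UIScene *)scene
    willConnectToSession:(UISceneSession *)session
                 options:(UISceneConnectionOptions *)connectionOptions {
    UIWindowScene *windowScene = (UIWindowScene *)scene;

    // sizeRestrictions is non-nil where window resizing is supported
    // (iPad multitasking, iPad apps running on the Mac); nil on iPhone.
    windowScene.sizeRestrictions.minimumSize = CGSizeMake(1024, 768);   // placeholder
    windowScene.sizeRestrictions.maximumSize = CGSizeMake(4096, 4096);  // placeholder

    self.window = [[UIWindow alloc] initWithWindowScene:windowScene];
    // ... set a rootViewController and call makeKeyAndVisible as usual ...
}

@end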
This is confusing, since the Metal Feature Set table conveniently leaves A13 off the list of Metal 3 devices. So which is it? A13 has Tier 2 argument buffers (indexing), but not barycentrics, ray tracing, mesh shaders, etc.
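The only way I've found to settle it per device is to query at runtime instead of trusting the table. A sketch, assuming the macOS 13 / iOS 16 SDK for the MTLGPUFamilyMetal3 constant:

#import <Metal/Metal.h>

static void LogMetalCapabilities(id<MTLDevice> device) {
    BOOL metal3 = NO;
    if (@available(iOS 16.0, macOS 13.0, *)) {
        metal3 = [device supportsFamily:MTLGPUFamilyMetal3];
    }
    BOOL tier2ArgBuffers = (device.argumentBuffersSupport == MTLArgumentBuffersTier2);
    BOOL raytracing = device.supportsRaytracing; // macOS 11 / iOS 14+; A13 reports NO

    NSLog(@"Metal3=%d tier2ArgumentBuffers=%d raytracing=%d",
          metal3, tier2ArgBuffers, raytracing);
}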
This is still completely broken in macOS 12, 13, and 14, and maybe even as far back as 11.0. It's more than just an error message: it breaks ProMotion in Rosetta apps, along with any chance of reaching 120 Hz.
CurrentVBLDelta returned 200,000 for display 1 -- ignoring unreasonable value
[0x7f85fd07ba20] Bad CurrentVBLDelta for display 1 is zero. defaulting to 60Hz. <-
0 fps is supposed to mean run at the maximum refresh rate, not divide by zero and compute an invalid vertical blank delta (e.g. 20000).
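Our interim workaround idea is to stop relying on 0 meaning "maximum" and instead ask the screen for its real maximum. A sketch, assuming MTKView drives the render loop (maximumFramesPerSecond needs macOS 12+):

#import <MetalKit/MetalKit.h>
#import <AppKit/AppKit.h>

static void ConfigureViewRefresh(MTKView *view) {
    NSInteger maxFPS = 60;
    if (@available(macOS 12.0, *)) {
        // 120 on ProMotion panels; 0 if the view isn't on a window yet.
        maxFPS = view.window.screen.maximumFramesPerSecond;
    }
    view.preferredFramesPerSecond = MAX(maxFPS, 60);
}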
So can we get a resizable window with iOS on macOS? I'm confused by this statement. Currently we require fullscreen and only support landscape. But if I turn off UIRequiresFullScreen and state that we support all orientations, I can get a resizable window. Some elements scale and others do not. Does that mean I need to respond to the UIWindowScene scaling, or is the main render always the fullscreen size?
"When the window is resized, e.g. by using the Window->Zoom menu item, or by taking the window full screen, your fixed-size UIWindowScene is scaled up or down by macOS automatically, and your app does not find out about this."
Not for indirect, but for directly setting textures.
https://developer.apple.com/documentation/metal/mtlrendercommandencoder/1515842-setvertextexture?language=objc
Just don't put it in the struct; output it directly from the vertex shader. I think that will fix it.
It seems that disabling "Settings -> General -> Show Live Issues" gets rid of the stale error reports in the Issue list. We build with makefiles, but can this feature be fixed? It really makes development in Xcode needlessly difficult.
I mean zero of the Metal sample apps even set a colorspace. So what do we reference here? macOS and iOS are supposed to be the ultimate "color managed" platforms, although iOS couldn't afford the pixel ops on the first phones/iPads.
Am I tagging my pixels, which are linear, as sRGB? Do I need a transfer function of gamma 1.0 or gamma 2.2? What is kCGColorSpaceSRGB vs. kCGColorSpaceLinearSRGB? These are both sRGB colorimetry, but am I stating that my content has the sRGB gamma curve in the former and is linear in the latter? And how do I get the cheapest pass-through? I can set my display color space to sRGB (709) or P3, but how do I get the cheapest, fastest, highest-quality pixel output to the screen? Before all the color management, on iOS I had to use BGRA8 instead of RGBA8 to avoid a swizzle blit. These are the kinds of details that are sorely lacking even in the WWDC presentations.
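For what it's worth, here's how I currently read the two tags, as a sketch rather than a confirmed recipe: the primaries are the same, only the declared transfer function differs.

#import <QuartzCore/CAMetalLayer.h>
#import <CoreGraphics/CoreGraphics.h>
#import <Metal/Metal.h>

static void ConfigureLayerForSRGBEncodedContent(CAMetalLayer *layer) {
    // Content already has the sRGB curve baked in (e.g. written through an
    // *_sRGB render target): tag it as sRGB so the compositor can pass it through.
    layer.pixelFormat = MTLPixelFormatBGRA8Unorm_sRGB;
    CGColorSpaceRef cs = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
    layer.colorspace = cs;
    CGColorSpaceRelease(cs);
}

static void ConfigureLayerForLinearContent(CAMetalLayer *layer) {
    // Content is linear (gamma 1.0) in a float target: tag it as linear sRGB
    // and let the system apply the display transform.
    layer.pixelFormat = MTLPixelFormatRGBA16Float;
    CGColorSpaceRef cs = CGColorSpaceCreateWithName(kCGColorSpaceLinearSRGB);
    layer.colorspace = cs;
    CGColorSpaceRelease(cs);
}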
There is also True Tone, and the user adjusting the maximum brightness of the display, so those also need to be responded to. And what should we do when NSApplicationDidChangeScreenParametersNotification fires and says screen.maximumExtendedDynamicRangeColorComponentValue changed?
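On the AppKit side we're sketching something like this to track the headroom; the EDRHeadroomTracker name is ours, not API.

#import <AppKit/AppKit.h>

// Tracks the current EDR headroom so the tone-mapping pass can be rescaled
// when brightness / True Tone changes arrive.
@interface EDRHeadroomTracker : NSObject
@property (nonatomic) CGFloat headroom; // 1.0 == SDR only
@end

@implementation EDRHeadroomTracker

- (instancetype)init {
    if ((self = [super init])) {
        _headroom = 1.0;
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(screenParametersDidChange:)
                                                     name:NSApplicationDidChangeScreenParametersNotification
                                                   object:nil];
        [self screenParametersDidChange:nil];
    }
    return self;
}

- (void)screenParametersDidChange:(NSNotification *)note {
    NSScreen *screen = NSScreen.mainScreen;
    self.headroom = screen.maximumExtendedDynamicRangeColorComponentValue;
    // Feed self.headroom into the tone-map / exposure uniforms on the next frame.
}

@end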
Yes, we've got all that. But what are the correct settings for iOS and macOS? I've listed what I've tried so far above, along with the failures. What have you tried that worked? On SO, someone stated that MTKView can only composite to sRGB spaces (a limitation of iOS and/or macOS?). We also can't use the XR_sRGB format except on the newer Apple silicon Macs, and we're doing development on Intel.
Why do I need to set a color space at all if I'm already passing linear data in RGBA16F? I'm assuming the rest of the UIKit/AppKit composite has some color space as well, not just my MTKView, so all of those need to be in agreement.
So far that agreement seems to be that my content needs to be sRGB (or Display P3 with the similar sRGB curve). Rec2020_PQ, which is HDR10, gets even more complex with EDR metadata on the CAMetalLayer, but that's all that HDR televisions speak. And that's a lot of per-pixel matrix ops, and a different compositing path for the sRGB8 UI and the RGBA16F render.
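A hedged sketch of what that CAMetalLayer HDR10 path seems to require (CAEDRMetadata is macOS 10.15+, the ITUR_2100_PQ name needs macOS 11+, and the luminance numbers are placeholders):

#import <QuartzCore/QuartzCore.h>
#import <Metal/Metal.h>

static void ConfigureLayerForHDR10(CAMetalLayer *layer) {
    // Float target + extended range so values above 1.0 survive compositing.
    layer.pixelFormat = MTLPixelFormatRGBA16Float;
    layer.wantsExtendedDynamicRangeContent = YES;

    // Declare PQ (HDR10) encoding with Rec.2020 primaries.
    CGColorSpaceRef pq = CGColorSpaceCreateWithName(kCGColorSpaceITUR_2100_PQ);
    layer.colorspace = pq;
    CGColorSpaceRelease(pq);

    // Mastering metadata -- placeholder luminance values.
    layer.EDRMetadata = [CAEDRMetadata HDR10MetadataWithMinLuminance:0.005f
                                                        maxLuminance:1000.0f
                                                  opticalOutputScale:100.0f];
}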
We can create textures that are sRGB or linear, and the GPU hardware will convert the sRGB back to linear for us. But I've got linear data in the render texture that doesn't look correct with most of these color spaces, and I haven't changed the data. It's incorrect even in EDR (0 to 1), let alone HDR (<0, >1).
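And a tiny sketch of what I mean by the hardware conversion: the same storage read through an *_sRGB view is decoded to linear on sample, while a plain Unorm view returns the raw values.

#import <Metal/Metal.h>

static void MakeSRGBAndRawViews(id<MTLDevice> device) {
    MTLTextureDescriptor *desc =
        [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:MTLPixelFormatRGBA8Unorm_sRGB
                                                            width:256
                                                           height:256
                                                        mipmapped:NO];
    id<MTLTexture> srgbTex = [device newTextureWithDescriptor:desc];

    // Same bytes, but no sRGB decode when sampled through this view.
    id<MTLTexture> rawView = [srgbTex newTextureViewWithPixelFormat:MTLPixelFormatRGBA8Unorm];
    (void)rawView;
}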
Thanks, that works. I so rarely change these once they are set, and there's no help text. Contrast that with other IDEs that allow chording; this UI is unchanged from Xcode 1.0.
Thanks, I've used Xcode for a long time, so I'm familiar with the breakpoint UI. It's just that breakpoints shift as the code changes, and then I like to delete them.
How do you delete a conflicting Xcode keyboard shortcut? I constantly have this issue. Cmd+\ doesn't work, and there are no articles online about it. Pressing Delete just enters the Delete key as the keyboard shortcut. This shouldn't have to be this hard. The keyboard shortcut menu even has a red X, but you click it and it doesn't delete the conflict.
I think I figured it out. The natural thing to do to remove a keyboard shortcut is to hit Delete, but that instead inserts the Delete key as the key equivalent, which then breaks all Delete functionality throughout Xcode. And there's no assistance saying how to erase a shortcut (does Cmd+Delete still work after that?). So now I have to redo all my shortcuts from scratch to get rid of the Delete key override.
Are there really that many people who need to override the Delete key specifically?
This is still happening even in Xcode 14.1. The filter isn't a solution to this. The errors that stay displayed even when the build succeeds just cause confusion for the non-engineers trying to build our title.
I found the issue. Apparently Apple thought it would be fun to add, and enable by default, "Private Wi-Fi Address" in the iOS Wi-Fi settings. This replaces the DHCP address that our device needs in order to connect back to macOS, which has to be on the same IP range. Ugh.