Not exactly. I thought you might know of a new API to talk to the Xcode console.
I’m just trying to pass a dso handle from existing log macros to a central, existing C++ logger and get the console to print file/line. It would seem simple enough, except these calls don’t take file/line/func explicitly. Apparently they use the return address to find the caller, and don’t allow that to be passed down.
That’s why the calls have to be injected at the call site. I’m not sure what the dso_handle is for, but it must be there to quickly filter out libraries that Apple wants to hide logs from. It’s just a mach_header, which identifies the loaded image and isn’t file- or line-specific.
This log system was always built for Apple to use internally, and it’s pretty much impractical for anyone else to use. But it’s all I have access to, so I’ll file something.
Hey Quinn,
That’s not the way apps/reality work. Now my log is 4K characters, and os_log truncates at 1K. Or I need to route errors/warnings to a FILE*. Or I need to filter out specific messages. Or I use FMOD or any number of commercial game libraries, and they just supply a callback hook with level/file/line/func/message. Or I need to print the thread name and suppress the os_log console output that only shows TID:PID, neither of which is human friendly.
We can’t just inject os_log macros throughout all this code. Every system already has its own logging system, and os_log thwarts routing logs. os_log even reports the call location and timestamp, which aren’t passed down.
See android_log_message for a far easier and more flexible log call. A single code point under a mutex can hand it level, group, file, line, and message. We’ll just stick with printf, but is there some way we can feed the console all the log fields we have without going through the severely limited os_log?
Just to clarify: os_log_with_type works from a .cpp, .m, or .mm file. But if you try to pass a dso handle from a .m, .mm, or .cpp file other than the Logger.cpp or Logger.mm that defines my_os_log_with_type, that &__dso_handle seems to have some sort of protection, and no file/line is generated in the new Console.
You might be able to convert glTF to USD using some Apple content tools, AFAIK. ModelIO is what you want for loading or creating models. But ModelIO removed FBX support, so now you have to use tools to convert FBX to USD, which is quite lossy (e.g. losing all textures). Or there are open-source FBX processors.
MTKView uses CADisplayLink/NSDisplayLink, but on visionOS the compositor is doing the driving. There's little point anyway, since you need the drawables from visionOS (side-by-side or layered).
Are you recording to a valid format that is hardware supported (HEVC or ProRes)? I also couldn't find any way to record HDR content with macOS screen recording.
That link is not sufficient. It doesn't deal with resizing the view. I can disable UIRequiresFullscreen to turn the app into a multitasking app. Then I get window resizing running on an M2 Mac. The app just draws at the same resolution, and the grid does resize to the window boundary, but it's just downsampling the grid and it looks terrible. We need a sample showing how to handle resizing to the actual window boundaries, so an iOS-app-on-macOS operates like a real macOS app. There are still bugs with HDR support always reporting 1.0, but it's simpler than dealing with Catalyst.
None of the threads mentioned above help. We just need sample code that shows an iOS-on-macOS app resizing; there is none. Even the sample showing the UISupportsTrueScreenSizeOnMac and UILaunchToFullScreenByDefaultOnMac keys sets UIRequiresFullscreen. Turning that off and making sure the orientations are all specified allows the window to resize, but the MTKView in a game needs to respond to that. UIWindowScene requires iOS 13, the isFullscreen flag requires iOS 16, and isiOSAppOnMac requires iOS 14. So this is a minefield of trying to get everything right. This needs a sample app, not references to pages that lack details.
Having an iOS app switch from fullscreen to windowed and keep the same resolution regardless of window size is not a good experience. The UI becomes distorted and too small. Dragging from a landscape monitor to a portrait monitor results in letterboxing instead of filling the display. Proper resizing needs to be the default behavior for iOS-on-macOS apps.
Android has phased out ES in favor of Vulkan. Nintendo uses Vulkan or lower-level APIs. Windows only ships GL 1.1 support unless you rely on IHV driver installs. Linux runs DX on top of Vulkan with Proton. Metal and DX11 are similar in complexity and power, with Metal being a very straightforward way to use tile-based GPUs. OpenGL is dead even if there are stragglers. Let me know when there's a GL 4.7 or an ES 3.3 if you think what I'm saying is nonsense.
OpenGL's global state, and having to pepper code with glGetError() in WebGL/GL, is tedious. There have been many OpenGL implementations on consoles, and no one uses them due to the hidden state.
You have to do your own frame pacing on the present call. See the WWDC 21 presentation all about this. Also the display link fps settings and the currentRenderPassDescriptor/currentDrawable calls can all stall.
The fullscreen requirement for iOS on macOS applies if an app doesn't support all orientations and multitasking. It's a little weird to have windowed and fullscreen render at the same dimensions, since it means text is unreadable in our app when windowed and the pixels get downscaled. So I'll see if we can disable that so we get a resizable window. But all the remaining issues above still need solutions.
My iPad Air 3 reports 2x for UIScreen.potentialEDRHeadroom. So then I activate HDR mode. I'd like to do the same for the app running atop an M2.
I don't understand; if you are doing ray tracing, then there are BLAS and TLAS BVH structures that wrap rigid models and allow a ray to quickly hit triangles and then resolve attributes. The only limitation there is that all positions must be in a single vertex buffer.
Metal has always had the equivalent of VK_EXT_shader_object; it's Vulkan that locked everything down into rigid PSOs. I'd assume, like ray tracing, it's the ability to splice in bits of code.
For lack of anything better, I'll assume detecting ProMotion on iOS is UIScreen.maximumFramesPerSecond > 80. But I'd like to know I have ProMotion even when it's throttled to 60 Hz.
Also it seems there is no way to detect the "fullscreen" hack of running an iOS app on macOS. The app starts up at the full resolution of the display and stays at that resolution even when you drop out of fullscreen mode (fn+F). Windowed mode can't really be resized, so this must be the tradeoff for this mode, and it beats adopting Mac Catalyst.
So we're rendering many more pixels than needed in windowed mode, and can't tell fullscreen from windowed by comparing the UIScreen rect and UIView size. Maybe I can test the UIWindow size instead, since that seems smaller than the view resolution in this case.
I'll probably have to submit this as a precious ticket to get any response to these issues.