
Xcode 16's clang generates incorrect inlined optimized code
I'm encountering a code generation bug in Xcode 16's clang. It generates incorrect optimized code (at -O2) when functions are inlined and then optimized. This behavior didn't exist in Xcode 15, and it also doesn't happen in open-source clang (I tested open-source clang 17/18/19 on my M1 Max MacBook Pro).

The full code snippet is slightly too long to post here, so I'm including a link to the godbolt compiler explorer instead: https://godbolt.org/z/KhG3x7E1d . The code attempts to find a sequence of illegal UTF-8 characters and report the index of the character in a string. Note that on godbolt, the program works fine and finishes correctly. When I test it in Xcode 16 (with -O2), it doesn't: utf_find_illegal returns 4 instead of 3, which is incorrect. Digging through the disassembly, the compiler seems to be doing some complicated optimizations, inlining both utf_ptr2len and utf_ptr2char together, but the result is wrong and jumps to the wrong place when the illegal character is found.

I did try to see if there is some undefined behavior that caused the optimizer to go to town with the code. Funnily enough, when I use UBSan (by compiling with -O2 -fsanitize=undefined) the code works just fine and the bug doesn't happen. I also scrubbed the code for anything that could cause the compiler to behave incorrectly, but haven't had any luck there either. I have also tested the Xcode 16.1 beta and it doesn't seem to help.

Has anyone seen similar issues? Code generation bugs seem really dangerous considering that people rely on the compiler to… work.

Note: Yes, I know I'm supposed to use Feedback Assistant, but I have never received any responses on it even when filing legit bugs; their statuses are still open, with the bugs unfixed. Pardon me for not trusting it too much.
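For context, here is a heavily simplified sketch of the *shape* of the code involved. This is not the actual snippet from the godbolt link; the helpers below only loosely mirror Vim-style utf_ptr2len/utf_ptr2char and are just here to illustrate what gets inlined together:

```c
#include <stddef.h>

/* Length in bytes of the UTF-8 sequence starting at p (1 for an invalid lead byte). */
static int utf_ptr2len(const unsigned char *p)
{
    if (*p < 0x80)           return 1;
    if ((*p & 0xE0) == 0xC0) return 2;
    if ((*p & 0xF0) == 0xE0) return 3;
    if ((*p & 0xF8) == 0xF0) return 4;
    return 1; /* invalid lead byte */
}

/* Decode the sequence at p; return -1 if a continuation byte is malformed. */
static int utf_ptr2char(const unsigned char *p)
{
    static const int lead_mask[] = { 0, 0x7F, 0x1F, 0x0F, 0x07 };
    int len = utf_ptr2len(p);
    int c = p[0] & lead_mask[len];
    for (int i = 1; i < len; i++) {
        if ((p[i] & 0xC0) != 0x80)
            return -1;
        c = (c << 6) | (p[i] & 0x3F);
    }
    return c;
}

/* Return the byte index of the first illegal UTF-8 character in s, or -1 if none.
 * Both helpers above get inlined into this loop at -O2, which is where the
 * real code in the godbolt link goes wrong under Xcode 16's clang. */
static ptrdiff_t utf_find_illegal(const unsigned char *s)
{
    for (const unsigned char *p = s; *p != '\0'; p += utf_ptr2len(p)) {
        int len = utf_ptr2len(p);
        if (len == 1 && *p >= 0x80)
            return p - s; /* stray continuation byte or invalid lead byte */
        if (len > 1 && utf_ptr2char(p) < 0)
            return p - s; /* truncated or malformed sequence */
    }
    return -1;
}
```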
Replies: 3 · Boosts: 1 · Views: 490 · Oct ’24
How to query APFS file name size limit? (NAME_MAX is wrong)
In APFS, the file name size limit seems to be 255 UTF-8 characters. This is what Wikipedia says (https://en.wikipedia.org/wiki/Comparison_of_file_systems#Limits), and it matches my own tests. However, I cannot seem to find a concrete way to query that in a C POSIX program, or even find official documentation that says "an APFS file name is 255 UTF-8 characters"; the Wikipedia article did not actually cite its source.

There is a syslimits.h, which defines NAME_MAX as follows:

```c
#define NAME_MAX 255 /* max bytes in a file name */
```

This is extremely misleading (it's basically wrong), as APFS can handle UTF-8 characters, which probably means up to 4 * 255 = 1020 bytes. Is there an official API or compile-time constant that we can refer to? It's quite non-ideal to manually do NAME_MAX * 4, especially when you are working on a cross-platform program that just queries NAME_MAX for a byte count on, say, Linux. The current naive implementation of char file_name[NAME_MAX + 1] will lead to a buffer overflow when the user has non-ASCII file names. (Edit: Obviously either way you should program defensively and check your buffer sizes regardless of whether NAME_MAX is correct.)
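For completeness, the standard POSIX runtime query looks like the sketch below. As far as I can tell it doesn't resolve the bytes-vs-characters ambiguity either, since it presumably reports the same 255 for an APFS volume:

```c
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    /* Query the per-filesystem file name limit for the volume holding "/". */
    long name_max = pathconf("/", _PC_NAME_MAX);
    if (name_max == -1)
        perror("pathconf"); /* error, or the limit is indeterminate */
    else
        printf("_PC_NAME_MAX for /: %ld\n", name_max);
    return 0;
}
```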
Replies: 9 · Boosts: 0 · Views: 1.6k · Mar ’23
How to get NSColorPanel to use a specific color space?
NSColorPanel is the only OS-native color picker I have seen that supports color spaces natively, and that's great. The only issue is that I can't seem to find any real way to control the color space used by the color picker, at all.

The issue here is that I'm trying to present a color picker that starts off with an sRGB color, with all controls in sRGB mode. This way, an unsavvy developer (which is the majority of them) who doesn't understand color spaces won't get surprised when the color they picked produces a CSS hex code (CSS is essentially sRGB-only if you develop across different web browsers, sigh) that looks different from their in-color-picker color.

The only control I can see is to pre-set the color with something like panel.color = [NSColor colorWithSRGBRed:1.0 green:0.5 blue:0.2 alpha:1.0];. This pre-populates the color panel with an sRGB color, and in the panel's "RGB" mode it will actually be correct. However, if I go to the "wheel" mode, the wheel is still in Display P3 mode, and any color I select there will automatically be in P3 instead of sRGB (you can see and change the wheel's current color space by right-clicking on it).

This seems like pretty poor behavior, as it makes it impossible to control the user experience, and it's not always desirable to get back colors in other color spaces. I can always manually convert the color when I retrieve it for my use, but that can lead to unintuitive behavior, especially if the user goes back and forth between the wheel and RGB modes (unintuitive because the hex value they see will be different from the converted sRGB hex values).

When playing around with Safari's handling of the HTML color input type (example: https://www.w3schools.com/jsref/tryit.asp?filename=tryjsref_color_get), you can actually see this behavior happening. If you go to the wheel, it starts selecting colors in Display P3, and then converts the color to sRGB when providing it to the "color" input in the web page. It's theoretically correct, but the user will see different hex values in the color picker from what the final HTML input sees. Interestingly, Firefox handles it poorly by just using the selected color component values instead of doing a color conversion, meaning that both #aabbcc in P3 and #aabbcc in sRGB will end up producing #aabbcc in the final input value, which is wrong for P3 (because CSS assumes sRGB). Chrome is not affected as it does not use the built-in OS color picker, presumably to avoid oddities like this and to have a simplified UI.

Anyone have any input? It really seems like NSColorPanel needs to provide a little more control over how color spaces are handled. It's good that it's color-space aware, but not good that we seemingly have no way to control the behavior.
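For reference, a sketch of the manual workaround mentioned above: pre-seed the panel with an sRGB color, then normalize whatever comes back to sRGB before generating a CSS hex string. (This doesn't fix the wheel's P3 UI; it only makes the retrieved value consistent. The helper name is my own, purely illustrative.)

```objc
#import <AppKit/AppKit.h>
#include <math.h>

// Hypothetical helper: read the panel's current color as a CSS sRGB hex string.
static NSString *CSSHexFromPanel(NSColorPanel *panel)
{
    NSColor *srgb = [panel.color colorUsingColorSpace:[NSColorSpace sRGBColorSpace]];
    if (srgb == nil)
        return nil; // conversion can fail for e.g. catalog or pattern colors
    return [NSString stringWithFormat:@"#%02X%02X%02X",
            (int)lround(srgb.redComponent   * 255.0),
            (int)lround(srgb.greenComponent * 255.0),
            (int)lround(srgb.blueComponent  * 255.0)];
}

// Setup: pre-populate the shared panel with an sRGB color.
// NSColorPanel *panel = [NSColorPanel sharedColorPanel];
// panel.color = [NSColor colorWithSRGBRed:1.0 green:0.5 blue:0.2 alpha:1.0];
```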
Replies: 1 · Boosts: 0 · Views: 793 · Oct ’22
Xcode 14 and supporting macOS 10.9 - 10.12
So, in the Xcode 14 betas, Apple removed support for deploying to macOS 10.9 - 10.12. You couldn't select those in the dropdown for minimum OS version, and if you attempted to force it by setting MACOSX_DEPLOYMENT_TARGET=10.9 on the command line, you would get a warning that says this:

```
warning: The macOS deployment target 'MACOSX_DEPLOYMENT_TARGET' is set to 10.9, but the range of supported deployment target versions is 10.13 to 13.1.
```

Apple even mentioned this in their release notes (under Build System → Deprecations, tagged 92834476) and updated the table at https://developer.apple.com/support/xcode/ to say that Xcode 14 only supports 10.13 and above.

Now, Xcode 14 is finally out of beta and released. I downloaded it, and guess what: 10.9 - 10.12 are all back in the "Minimum Deployments" dropdown, and building with MACOSX_DEPLOYMENT_TARGET=10.9 no longer complains.

Anyone know what the deal is? Did Apple change their mind and let us deploy to older versions again? Did "unsupported" literally just mean a scary warning and a UI change in Xcode to scare us away from deploying to those targets?
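As an aside, if anyone wants to double-check which deployment target the toolchain actually applied, here's a quick compile-time sanity check (assuming clang's -mmacosx-version-min flag, which is what MACOSX_DEPLOYMENT_TARGET maps to):

```c
/* Build with e.g.: clang -mmacosx-version-min=10.9 check.c */
#include <AvailabilityMacros.h>
#include <stdio.h>

int main(void)
{
    /* Prints 1090 when the deployment target is 10.9
     * (versions before 10.10 use the four-digit encoding). */
    printf("MAC_OS_X_VERSION_MIN_REQUIRED = %d\n",
           MAC_OS_X_VERSION_MIN_REQUIRED);
    return 0;
}
```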
Replies: 3 · Boosts: 1 · Views: 2.1k · Sep ’22