I have a question for Apple staff from a Java platform software engineer. Java is a huge platform for server software development, and many of us engineers enjoy using Macs to code for the Java platform.
That being said, what is the future of the JDK on the Apple SoC? Will there be a native version? Is the current x86_64 Mac JDK fully compatible with the Apple SoC?
Thanks!
Hello,
I've been using the property CIContextOption.name for some time now, up to and including Xcode 12 beta 6. Now my code that references it won't compile with the Xcode 12 GM: the compiler complains that it can't find that member. I'll file this as a bug, but I wanted to throw it out here on the forums, too.
Cheers.
Michael
I encountered a very similar bug and I found a workaround. Can you try context.heifRepresentation(of: image.settingAlphaOne(in: image.extent), …) and see if that works?
That worked, thanks, though in my particular case I was working on macOS 11.3.
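For anyone else who lands here, the full call I ended up with looks roughly like this (the format, color space, and empty options dictionary are my own choices, not from the original reply):

```swift
import CoreImage

// Sketch of the suggested workaround: force the alpha channel to 1
// before encoding, which sidesteps the HEIF-encoding failure.
func encodeHEIF(_ image: CIImage, context: CIContext) -> Data? {
    // displayP3 is an assumption; use whatever color space fits your pipeline.
    let colorSpace = CGColorSpace(name: CGColorSpace.displayP3)!
    let opaque = image.settingAlphaOne(in: image.extent)
    return context.heifRepresentation(of: opaque,
                                      format: .RGBA8,
                                      colorSpace: colorSpace,
                                      options: [:])
}
```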
I used to be able to see source diffs in the editor with Xcode. Now the "Enable Code Review" button doesn't do anything when I click it. I'm running Xcode 13 beta 13A5154h and macOS 11.4 20F71. Can someone help?
Hi,
According to the API documentation, the activeKeys property of the CIRAWFilterOption struct is deprecated. What is its replacement functionality?
Cheers.
Michael
I'm running into hard crashes (EXC_BAD_ACCESS) when calling CIContext.writeHEIF10Representation or CIContext.heif10Representation from multiple threads. By contrast, concurrent access to writeHEIFRepresentation works fine.
Does anyone know any other way to write a CIImage to 10-bit HEIF? I've tried several alternatives using CVPixelBuffer and MTLTexture without success.
While I've filed a report through Feedback Assistant, I'm looking for a workaround in the meantime. Writing thousands of 10-bit HDR HEIF images on a single thread is an absolute throughput killer, whereas I can write every other image format without concurrency issues.
Thanks!
(Context: macOS Ventura 13.3)
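The best stopgap I've found so far is to serialize just the HEIF10 calls while leaving the rest of the pipeline concurrent. A minimal sketch (the lock wrapper and function name are mine):

```swift
import CoreImage
import Foundation

// Serialize only the crash-prone heif10Representation calls; rendering
// and 8-bit HEIF writes can stay fully concurrent on other threads.
let heif10Lock = NSLock()

func heif10Data(for image: CIImage,
                context: CIContext,
                colorSpace: CGColorSpace) throws -> Data {
    heif10Lock.lock()
    defer { heif10Lock.unlock() }
    return try context.heif10Representation(of: image,
                                            colorSpace: colorSpace,
                                            options: [:])
}
```

This obviously doesn't recover the lost parallelism for HEIF10 itself, but it at least lets the rest of the batch run concurrently without crashing.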
What is the simplest/least resource-intensive way to enable the display of HDR content in an NSImageView in an AppKit application?
I've found that, as the documentation for wantsExtendedDynamicRangeContent suggests, NSImageView faithfully presents HDR images (loaded from HDR formats such as 10-bit HEIC) whenever any on-screen CAMetalLayer has the aforementioned property set to true. This is so even if another app on the same screen has set wantsExtendedDynamicRangeContent to true on one of its CAMetalLayers. Conversely, if no such layer exists on-screen, the NSImageView clips HDR highlights.
For example, suppose you have an app with an NSImageView displaying a 10-bit HEIC image. And suppose you have Affinity Photo 2 with the "Enable EDR by default in 32bit RGB views" option enabled in the AP 2 preferences. If you load an image with Affinity Photo 2 (apparently any image, even an 8-bit JPEG), you will find that the image displayed by your NSImageView will render HDR highlights on an EDR-enabled display. You don't need to change any setting on NSImageView for this to work.
It seems rather awkward and inefficient to add something like a one-pixel CAMetalLayer to my app solely to enable EDR display. Since EDR display is enabled per screen, it seems there should be some kind of property on NSScreen for it. And clearly some magic is happening behind the scenes of NSImageView that makes it display EDR content once EDR is enabled for the screen.
Any advice?
Context: I'm running Xcode 14.3 on macOS 13.3.
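For completeness, here's the awkward workaround I alluded to: a tiny CAMetalLayer whose only job is to flip the EDR switch for the screen. Where you attach it is up to you; this is a sketch, not a recommendation:

```swift
import AppKit
import QuartzCore

// Awkward but working: a 1×1 CAMetalLayer whose only purpose is to set
// wantsExtendedDynamicRangeContent, which enables EDR on the screen and
// lets NSImageView render HDR highlights instead of clipping them.
func addEDREnablerLayer(to view: NSView) {
    view.wantsLayer = true
    let edrLayer = CAMetalLayer()
    edrLayer.frame = CGRect(x: 0, y: 0, width: 1, height: 1)
    edrLayer.wantsExtendedDynamicRangeContent = true
    edrLayer.isOpaque = false
    view.layer?.addSublayer(edrLayer)
}
```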
I was browsing the Metal settings in one of my projects recently and noticed that the "Enable Fast Math" setting is set to "No" under "macOS Default". In fact, if I attempt to set it to "No" in one of my targets, that setting appears to be ignored: when I inspect the build log, it shows the Metal compiler being run with the "-ffast-math" flag. Can someone else verify that they are seeing this as well? If so, it looks like a bug to me. Did this also happen in previous versions of Xcode? I don't have (easy) access to one.
I want to rotate 10 bit/component, 3 component RGB CGImages (or NSImages) by 90 degree angles. The images are loaded from 10 bpc heif files. This is for a Mac app, so I don't have access to UIKit.
My first thought is to use the Accelerate vImage framework. However, vImage does not seem to support 10 bpc images. In fact, I've tried this approach without success.
I know I can do this using the CIImage.oriented() method, but I don't want the substantial overhead of creating a CIImage from a CGImage and then rendering to a CGImage.
Any suggestions? Efficiency/speed are important. Thanks.
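In case it helps the discussion, one avenue I'm evaluating is plain CGContext drawing, which should preserve the source bit depth if the context is created with the image's own parameters. Treat this as a sketch; I haven't verified that CGContext accepts every pixel format a 10-bpc HEIF decodes to, and the guard returns nil for unsupported combinations:

```swift
import CoreGraphics

// Sketch: rotate a CGImage by 90° by drawing into a context created
// with the source image's own bit depth and color space. Swap the
// rotation sign and translation to rotate the other direction.
func rotated90(_ image: CGImage) -> CGImage? {
    guard let context = CGContext(
        data: nil,
        width: image.height,          // dimensions swap under a 90° turn
        height: image.width,
        bitsPerComponent: image.bitsPerComponent,
        bytesPerRow: 0,               // let CG choose the row stride
        space: image.colorSpace ?? CGColorSpace(name: CGColorSpace.sRGB)!,
        bitmapInfo: image.bitmapInfo.rawValue
    ) else { return nil }
    context.translateBy(x: 0, y: CGFloat(image.width))
    context.rotate(by: -.pi / 2)
    context.draw(image, in: CGRect(x: 0, y: 0,
                                   width: image.width, height: image.height))
    return context.makeImage()
}
```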
I haven't found any really thorough documentation or guidance on the use of CIRAWFilter.linearSpaceFilter. The API documentation describes it only as
An optional filter you can apply to the RAW image while it’s in linear space.
Can someone provide insight into what this means and what the linear space filter is useful for? When would we use this linear space filter instead of a filter on the output of CIRAWFilter?
Thank you.
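To make the question concrete, here is how I'm attaching one. The choice of filter here is arbitrary, purely for illustration; my understanding (unverified) is that whatever you assign runs on the scene-referred linear data, before the RAW pipeline's output conversion:

```swift
import CoreImage

// Sketch: attach a filter that runs while the RAW data is still in
// linear space. CIGaussianBlur is a placeholder, not a recommendation.
func makeRAWImage(from url: URL) -> CIImage? {
    guard let rawFilter = CIRAWFilter(imageURL: url) else { return nil }
    let linearFilter = CIFilter(name: "CIGaussianBlur")
    linearFilter?.setValue(2.0, forKey: kCIInputRadiusKey)
    rawFilter.linearSpaceFilter = linearFilter
    return rawFilter.outputImage
}
```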
I have a project that will not build with Xcode 16. I get a bunch of build errors like the following:
:1:9: note: in file included from :1:
#import "Headers/ExtensionFoundation.h"
^
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/ExtensionFoundation.framework/Headers/ExtensionFoundation.h:9:9: error: 'ExtensionFoundation/EXMacros.h' file not found
#import <ExtensionFoundation/EXMacros.h>
^
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks/ExtensionFoundation.framework/Headers/ExtensionFoundation.h:9:9: note: did not find header 'EXMacros.h' in framework 'ExtensionFoundation' (loaded from '/System/Library/PrivateFrameworks')
#import <ExtensionFoundation/EXMacros.h>
^
:0: error: could not build Objective-C module 'ExtensionFoundation'
I'm stumped. Can anyone help?
I'm running macOS Sequoia 15.0.