I'm looking at a case where a handler for NSWindowDidBecomeMainNotification gets the NSWindow* from the notification object and verifies that window.isVisible == YES, window.windowNumber > 0, and window.screen != nil. However, window.windowNumber is missing from the array returned by [NSWindow windowNumbersWithOptions: NSWindowNumberListAllSpaces] and from CGWindowListCopyWindowInfo( kCGWindowListOptionOnScreenOnly, kCGNullWindowID ). How can that be?
The window number is in the array returned by CGWindowListCopyWindowInfo( kCGWindowListOptionAll, kCGNullWindowID ).
I'm seeing this issue in macOS 15, maybe 14, but not 13.
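For reference, the check in the handler looks roughly like this (a simplified sketch, not the exact code):
- (void) windowDidBecomeMain: (NSNotification *) note
{
    NSWindow *window = note.object;
    NSArray<NSNumber *> *appKitNumbers =
        [NSWindow windowNumbersWithOptions: NSWindowNumberListAllSpaces];
    BOOL inAppKitList = [appKitNumbers containsObject: @(window.windowNumber)];
    NSLog(@"visible=%d number=%ld screen=%@ inAppKitList=%d",
        window.isVisible, (long) window.windowNumber, window.screen, inAppKitList);
}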
I'm getting a runtime assertion failure like this:
"<FFRender3DView 0x616000271580> has reached dealloc but still has a super view. Super views strongly reference their children, so this is being over-released, or has been over-released in the past."
Looking at the code, I can't see any strong reference to the view except by its superview, so I can't see how it could be released other than by removal from its superview. My first instinct was to override release and set a breakpoint there, but that's not possible in ARC code.
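The nearest thing I can do under ARC seems to be overriding dealloc just to see when (and with what superview) the view actually goes away. A minimal sketch (ARC code can override dealloc, it just can't call [super dealloc]):
- (void) dealloc
{
    // Log the moment of deallocation; in the over-release case the superview is still non-nil.
    NSLog(@"FFRender3DView %p reached dealloc, superview = %@", self, self.superview);
}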
I have a Mac app with a background-only helper app that needs Accessibility permission in order to use an event tap that can modify events. This has worked fine through Sonoma, but in the Sequoia beta the tap can no longer be created. Here's the C code I use to test whether the event tap can be created:
static CGEventRef _Nullable DummyTap(CGEventTapProxy proxy, CGEventType type, CGEventRef event, void *userInfo)
{
    return NULL;
}

static bool CanFilterEvents( void )
{
    CFMachPortRef thePort = CGEventTapCreate(
        kCGSessionEventTap,
        kCGTailAppendEventTap,
        kCGEventTapOptionDefault, // active filter, not passive listener
        CGEventMaskBit(kCGEventKeyDown),
        DummyTap,
        NULL );
    bool madeTap = (thePort != NULL);
    if (madeTap)
    {
        CFMachPortInvalidate( thePort );
        CFRelease( thePort );
    }
    return madeTap;
}
So, on Sequoia, CanFilterEvents returns false in spite of Accessibility permission being granted in System Settings. CGPreflightPostEventAccess also returns false, but AXIsProcessTrusted returns true.
I tried making a non-background-only test app, and when that has Accessibility permission, CanFilterEvents, CGPreflightPostEventAccess, and AXIsProcessTrusted all return true. Suggestions on what to try next?
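For completeness, here's roughly how I'm logging the three states side by side (a sketch; it assumes the usual Cocoa and ApplicationServices headers are imported):
static void LogPermissionState( void )
{
    NSLog(@"AXIsProcessTrusted = %d", AXIsProcessTrusted());
    NSLog(@"CGPreflightPostEventAccess = %d", CGPreflightPostEventAccess());
    NSLog(@"CanFilterEvents = %d", CanFilterEvents());
}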
I'm trying to do a piecemeal conversion of a big macOS Objective-C++ code base to Automatic Reference Counting (ARC), and started with a fairly complex modal dialog. I converted all the classes involved to use ARC. When the dialog closes, the window itself and some of the controller objects get deallocated as they should, but some do not. When I look at the memory graph debugger in Xcode, I see a bunch of objects with class names of the form NSKVONotifying_MyClassName. Here's an example:
It does not look as though any of my objects have strong references to GRMorphController, so what am I to make of this?
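As I understand it, the NSKVONotifying_ prefix just means the object is (or was) being observed via KVO, which swaps in a dynamically generated subclass. A minimal sketch (the observer and key path are hypothetical, and object_getClass needs <objc/runtime.h>):
GRMorphController *controller = [[GRMorphController alloc] init];
NSLog(@"%@", object_getClass( controller ));   // GRMorphController
[controller addObserver: someObserver          // any NSObject acting as an observer (hypothetical)
             forKeyPath: @"morphValue"         // hypothetical key path
                options: 0
                context: NULL];
NSLog(@"%@", object_getClass( controller ));   // NSKVONotifying_GRMorphController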
It was my understanding that you're supposed to be able to open a .ips crash log in Xcode and see pretty much what you would see if the app had been running in the debugger when it crashed. But the addresses in my app don't get symbolicated. I opened the .ips in the same project and the same version of Xcode that was used to create the app. The .dSYM file is around, and I can use it to symbolicate using the atos tool. What am I missing?
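For reference, an atos invocation along these lines does symbolicate correctly with that .dSYM (the binary name, load address, and frame address below are placeholders):
atos -arch arm64 -o MyApp.app.dSYM/Contents/Resources/DWARF/MyApp -l 0x104e30000 0x104e3f2a4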
I would like to install the beta of macOS 15 on an empty volume, rather than on top of an existing version of macOS. Is it possible? I see that I can download an .ipsw file, but I don't understand what can be done with it.
When making a disk image for software distribution, it used to be possible to make a Finder window automatically open when the disk image is mounted, using a command like
sudo bless --folder dirPath --openfolder dirPath
on a read-write disk image.
However, as of Ventura, attempting to do so produces an error message
bless: The 'openfolder' option is deprecated
and the command fails to do what I want.
Disk images that were set up this way in years past continue to work. I suppose I could duplicate a working writable disk image, remove the old contents and put in new contents, but that seems a little hacky. Is there an alternative?
As an exercise in learning Swift, I rewrote a toy C++ command line tool in Swift. After switching to an UnsafeRawBufferPointer in a critical part of the code, the Release build of the Swift version was a little faster than the Release build of the C++ version. But the Debug build took around 700 times as long. I expect a Debug build to be somewhat slower, but by that much?
Here's the critical part of the code, a function that gets called many thousands of times. The two string parameters are always 5-letter words in plain ASCII (it's related to Wordle). By the way, if I change the loop ranges from 0..<5 to [0,1,2,3,4], it runs about twice as fast in Debug, but only about half as fast in Release.
func Score( trial: String, target: String ) -> Int
{
    var score = 0
    withUnsafeBytes(of: trial.utf8) { rawTrial in
        withUnsafeBytes(of: target.utf8) { rawTarget in
            for i in 0..<5
            {
                let trial_i = rawTrial[i]
                if trial_i == rawTarget[i] // strong hit
                {
                    score += kStrongScore
                }
                else // check for weak hit
                {
                    for j in 0..<5
                    {
                        if j != i
                        {
                            let target_j = rawTarget[j]
                            if (trial_i == target_j) &&
                                (rawTrial[j] != target_j)
                            {
                                score += kWeakScore
                                break
                            }
                        }
                    }
                }
            }
        }
    }
    return score
}
I have a repeating timer installed like this:
_cmdTimer = [NSTimer timerWithTimeInterval: 0.5
                                    target: self
                                  selector: @selector(timedTask:)
                                  userInfo: nil
                                   repeats: YES];
[NSRunLoop.mainRunLoop addTimer: _cmdTimer
                        forMode: NSModalPanelRunLoopMode];
[NSRunLoop.mainRunLoop addTimer: _cmdTimer
                        forMode: NSDefaultRunLoopMode];
The first time the timer fires, it opens a modal dialog. But then the timer does not fire again until the dialog is closed. I don't get that, since I scheduled the timer in NSModalPanelRunLoopMode. To verify that the dialog was running in that mode, just before opening the dialog I said
[self performSelector: @selector(testMe)
           withObject: nil
           afterDelay: 0.7
              inModes: @[NSModalPanelRunLoopMode]];
and the testMe method did get executed while the dialog was open.
Recently, when I open a document in my app, it just adds a blank line to the Open Recent submenu. Attempting to select that line produces an error alert saying "The document “(null)” could not be opened. The file doesn’t exist." However, the document does appear in the global Recent Items menu. I tried rebooting. I'm not subclassing NSDocumentController or doing anything weird about opening files. Ideas?
P.S. I tried logging in to a different account, and tried changing the bundle ID. Neither helped.
I have a shell script that turns a framework into a plain dylib and updates some dependent library paths using install_name_tool. It works, but if the framework was signed, I get warnings like:
install_name_tool: warning: changes being made to the file will invalidate the code signature in: [redacted].dylib (for architecture x86_64)
I thought I could get rid of the warning by adding
codesign --remove-signature dylib-path
to the script before using install_name_tool, but then I get errors like
install_name_tool: fatal error: file not in an order that can be processed (link edit information does not fill the __LINKEDIT segment): [redacted].dylib (for architecture x86_64)
Is there a way to fix this?
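For reference, the relevant part of the script looks roughly like this (the paths and install names are placeholders, not the real ones):
codesign --remove-signature "$DYLIB_PATH"
install_name_tool -id "@rpath/libExample.dylib" "$DYLIB_PATH"
install_name_tool -change "@rpath/Dependency.framework/Versions/A/Dependency" "@rpath/libDependency.dylib" "$DYLIB_PATH"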
I have a view (custom subclass of NSView) that overrides mouseDown: and mouseUp:. If I double-click the view, I expect to see a sequence of events:
mouseDown with clickCount == 1
mouseUp with clickCount == 1
mouseDown with clickCount == 2
mouseUp with clickCount == 2
Usually, that's what happens. But occasionally the second mouse up, or sometimes both of them, never arrives. That's a problem, because it's my understanding that if you want to handle a double-click, you should be looking for the second mouse up. I am certain that the mouse location is always within the bounds of the view. What could cause this? (Testing on macOS 13.6.4.)
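Roughly, the overrides look like this (a simplified sketch of the actual code):
- (void) mouseDown: (NSEvent *) event
{
    NSLog(@"mouseDown, clickCount = %ld", (long) event.clickCount);
}

- (void) mouseUp: (NSEvent *) event
{
    NSLog(@"mouseUp, clickCount = %ld", (long) event.clickCount);
    if (event.clickCount == 2)
    {
        // handle the double-click here
    }
}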
Added: I use a subclass of NSApplication, and override nextEventMatchingMask:untilDate:inMode:dequeue: and sendEvent:. The overrides usually just call through to the superclass method. Logging mouse events from these methods, I see that in the problematic cases, the mouse up events are received from the queue, but never sent.
I want to be able to simulate mouse clicks, moves, and drags within my own app. I can do that using CGEventPost, but that requires Accessibility permission, which seems silly to require when I'm not trying to control another app. An alternative is to create mouse NSEvents and use -[NSApplication postEvent:atStart:]. With that approach, I can click a button and watch it highlight and unhighlight, but the mouse cursor never moves and the pressedMouseButtons class property of NSEvent never changes. Is there a better way to simulate mouse events without requiring Accessibility permission?
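Here's a sketch of the NSEvent approach I described (the target window and coordinates are placeholders):
NSPoint where = NSMakePoint( 100.0, 100.0 );   // window coordinates, placeholder
NSEvent *down = [NSEvent mouseEventWithType: NSEventTypeLeftMouseDown
                                   location: where
                              modifierFlags: 0
                                  timestamp: NSProcessInfo.processInfo.systemUptime
                               windowNumber: targetWindow.windowNumber   // placeholder window
                                    context: nil
                                eventNumber: 0
                                 clickCount: 1
                                   pressure: 1.0];
[NSApp postEvent: down atStart: NO];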
I replied to this thread a couple of hours ago, but the recent post list, sorted by Last Update, does not show it, and my watch list shows an update date based on a previous reply.
I have an overloaded function, one version with a BOOL parameter and one with a bool parameter. It gives me a redefinition error when compiling for Apple Silicon but not when compiling for Intel.
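Here's the sort of pair I mean (the function name is a placeholder). As I understand it, BOOL is a typedef for signed char when targeting Intel, but for the C++ bool when targeting Apple Silicon, which would make the two declarations identical on arm64:
void SetWidgetVisible( BOOL visible );   // BOOL is signed char on x86_64, bool on arm64
void SetWidgetVisible( bool visible );   // same signature as the BOOL version on arm64, hence the redefinition error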