I have been getting crash reports from users of my Mac app on Sonoma 14.0 and 14.1 when they type into an NSTextView subclass. The crash logs I have show involvement of the spell-checking system - NSTextCheckingController, NSSpellChecker, and NSCorrectionPanel. The crash is caused by an exception being thrown; the throwing method is either -[NSString getParagraphStart:end:contentsEnd:forRange:] or -[NSTextStorage ensureAttributesAreFixedInRange:].
I have not yet been able to reproduce the crash myself, even after modifying the reference-finding process to simply link every word (via NSStringEnumerationByWords).
The text view in question recognizes certain patterns in the entered text and adds hyperlinks to them while the user is typing. It re-parses and re-adds the links on every key press (by overriding the didChangeText method), doing the parsing on a background thread.
From user reports, I have learned that:
The crash only occurs on macOS 14.0 and 14.1, not on previous versions
The call stack always involves the spell checker, and sometimes involves adding recognized links to the text storage (the call to DispatchQueue.main.async in the code below)
The crash stops happening if the user turns off the system spell checker via System Settings -> Keyboard -> Edit (on an Input Source) -> Correct Spelling Automatically.
The crash does not happen when there are no links in the text view.
Here is the relevant code:
extension NSMutableAttributedString {
    func batchUpdates(_ updates: () -> ()) {
        self.beginEditing()
        updates()
        self.endEditing()
    }
}

class MyTextView: NSTextView {
    override func didChangeText() {
        super.didChangeText()
        findReferences()
    }

    var parseToken: CancelationToken? = nil
    let parseQueue = DispatchQueue(label: "com.myapp.ref_parser")

    private func findReferences() {
        guard let storage = self.textStorage else { return }
        self.parseToken?.requestCancel()
        let token = CancelationToken()
        self.parseToken = token
        let text = storage.string
        self.parseQueue.async {
            if token.cancelRequested { return }
            let refs = RefParser.findReferences(inText: text, cancelationToken: token)
            DispatchQueue.main.async {
                if !token.cancelRequested {
                    storage.batchUpdates {
                        // Collect the ranges of all existing links...
                        var linkRanges: [NSRange] = []
                        storage.enumerateAttribute(.link, in: NSRange(location: 0, length: storage.length), options: []) { linkValue, linkRange, stop in
                            if linkValue is NSURL {
                                linkRanges.append(linkRange)
                            }
                        }
                        // ...remove them...
                        for rng in linkRanges {
                            storage.removeAttribute(.link, range: rng)
                        }
                        // ...and add the freshly parsed ones.
                        for r in refs {
                            storage.addAttribute(.link, value: r.url, range: r.range)
                        }
                    }
                    self.parseToken = nil
                }
            }
        }
    }
}
I've filed this as FB13306015 in case any Apple engineers see this. Can anyone suggest what might be going wrong here, or a workaround?
When I open the pop-up menu and move the mouse before it has fully opened, mouseEntered and mouseExited events are delivered as the mouse moves.
The view inside the pop-up uses a tracking area with the following options:
NSTrackingInVisibleRect, NSTrackingMouseEnteredAndExited, NSTrackingMouseMoved, NSTrackingActiveInKeyWindow
The locationInWindow of the mouseExited event also seems to be incorrect.
The problem does not occur in the following cases:
The mouse is not moved until the popup is fully opened.
The left mouse button is held down on the pop-up area.
The mouse is moved out of the pop-up area.
This issue occurs on Sonoma (macOS 14.0) and later.
I would like to know whether this is an issue in my code or a bug in the OS.
AppDelegate.h
#import <Cocoa/Cocoa.h>

@interface ViewInPopup : NSView {
    NSString* resultStr;
    NSUInteger enteredCount;
    NSPoint lastEnteredPos;
    NSUInteger exitedCount;
    NSPoint lastExitedPos;
    NSUInteger movedCount;
    NSPoint lastMovedPos;
    NSTrackingArea* trackingArea;
}
@end

@interface AppDelegate : NSObject <NSApplicationDelegate> {
    NSMenu* myMenu;
    ViewInPopup* viewInPopup;
}
- (IBAction)onClickButton:(id)sender;
@end
AppDelegate.mm
#import "AppDelegate.h"

@interface ViewInPopup ()
- (void)showResult:(NSEvent*)event;
@end

@implementation ViewInPopup

- (id)initWithFrame:(NSRect)frameRect
{
    self = [super initWithFrame:frameRect];
    [self setWantsLayer:TRUE];
    [[self layer] setBackgroundColor:[NSColor redColor].CGColor];
    return self;
}

- (void)drawRect:(NSRect)dirtyRect
{
    [super drawRect:dirtyRect];
    [resultStr drawInRect:[self bounds] withAttributes:nil];
}

- (void)updateTrackingAreas
{
    if (trackingArea) {
        [self removeTrackingArea:trackingArea];
    }
    NSTrackingAreaOptions options = NSTrackingInVisibleRect | NSTrackingMouseEnteredAndExited | NSTrackingMouseMoved | NSTrackingActiveInKeyWindow;
    trackingArea = [[NSTrackingArea alloc] initWithRect:[self bounds] options:options owner:self userInfo:nil];
    [self addTrackingArea:trackingArea];
    [super updateTrackingAreas];
}

- (void)mouseEntered:(NSEvent *)event
{
    [self showResult:event];
    [super mouseEntered:event];
}

- (void)mouseExited:(NSEvent *)event
{
    [self showResult:event];
    [super mouseExited:event];
}

- (void)mouseMoved:(NSEvent *)event
{
    [self showResult:event];
    [super mouseMoved:event];
}

- (void)showResult:(NSEvent*)event
{
    NSString* eventTypeStr = @"";
    switch (event.type) {
        case NSEventTypeMouseEntered:
            eventTypeStr = @"Entered";
            [[self layer] setBackgroundColor:[NSColor redColor].CGColor];
            if (enteredCount >= NSUIntegerMax) {
                enteredCount = 0;
            } else {
                enteredCount++;
            }
            lastEnteredPos = event.locationInWindow;
            break;
        case NSEventTypeMouseExited:
            eventTypeStr = @"Exited";
            [[self layer] setBackgroundColor:[NSColor blueColor].CGColor];
            if (exitedCount >= NSUIntegerMax) {
                exitedCount = 0;
            } else {
                exitedCount++;
            }
            lastExitedPos = event.locationInWindow;
            break;
        case NSEventTypeMouseMoved:
            eventTypeStr = @"Moved";
            [[self layer] setBackgroundColor:[NSColor greenColor].CGColor];
            if (movedCount >= NSUIntegerMax) {
                movedCount = 0;
            } else {
                movedCount++;
            }
            lastMovedPos = event.locationInWindow;
            break;
        default:
            return;
    }
    resultStr = [NSString stringWithFormat:@"LastEventType:%@\n\nEnteredCount:%ld\nLastEnteredPosition:(%f, %f)\n\nExitedCount:%ld\nLastExitedPosition:(%f %f)\n\nMovedCount:%ld\nLastMovedPosition:(%f, %f)", eventTypeStr, enteredCount, lastEnteredPos.x, lastEnteredPos.y, exitedCount, lastExitedPos.x, lastExitedPos.y, movedCount, lastMovedPos.x, lastMovedPos.y];
    [self setNeedsDisplay:YES];
}
@end

@interface AppDelegate ()
@property (strong) IBOutlet NSWindow *window;
@end

@implementation AppDelegate

- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
    // Insert code here to initialize your application
    myMenu = [[NSMenu alloc] init];
    NSMenuItem* item = [[NSMenuItem alloc] init];
    [myMenu addItem:item];
    viewInPopup = [[ViewInPopup alloc] initWithFrame:NSMakeRect(0, 0, 300, 300)];
    [item setView:viewInPopup];
}

- (void)applicationWillTerminate:(NSNotification *)aNotification {
    // Insert code here to tear down your application
}

- (BOOL)applicationSupportsSecureRestorableState:(NSApplication *)app {
    return YES;
}

- (IBAction)onClickButton:(id)sender
{
    [myMenu popUpMenuPositioningItem:nil atLocation:NSZeroPoint inView:(NSView*)sender];
}
@end
Hello,
I am trying to simulate a keystroke inside a macOS application.
Here is what I've done:
let src = CGEventSource(stateID: CGEventSourceStateID.hidSystemState)
let cmd_down = CGEvent(keyboardEventSource: src, virtualKey: 0x38, keyDown: true)
let cmd_up = CGEvent(keyboardEventSource: src, virtualKey: 0x38, keyDown: false)
cmd_down?.post(tap: .cghidEventTap)
cmd_up?.post(tap: .cghidEventTap)
macOS asks me to grant the app the Accessibility (TCC) privilege. This is a global privilege and requires admin rights, and I want to avoid that.
Is there an alternative way to simulate a keystroke inside my own application?
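For example, I wondered whether something along these lines could work instead - constructing an NSEvent and dispatching it only within my own app, rather than posting a system-wide CGEvent (untested sketch; the key code, characters and target window are just placeholders):

import AppKit

// Untested sketch: synthesize a key-down for the "a" key (keyCode 0) and send it
// only to this application's own event stream, without touching the system event tap.
if let keyDown = NSEvent.keyEvent(with: .keyDown,
                                  location: .zero,
                                  modifierFlags: [],
                                  timestamp: ProcessInfo.processInfo.systemUptime,
                                  windowNumber: NSApp.keyWindow?.windowNumber ?? 0,
                                  context: nil,
                                  characters: "a",
                                  charactersIgnoringModifiers: "a",
                                  isARepeat: false,
                                  keyCode: 0) {
    NSApp.sendEvent(keyDown)
}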
Thanks
This function on NSTextLayoutManager has the following signature
func enumerateTextSegments(
in textRange: NSTextRange,
type: NSTextLayoutManager.SegmentType,
options: NSTextLayoutManager.SegmentOptions = [],
using block: (NSTextRange?, CGRect, CGFloat, NSTextContainer) -> Bool
)
The documentation here doesn't define what the CGRect and CGFloat passed to block are. However, looking at the sample code Using TextKit 2 to Interact With Text, they seem to be the frame of the text segment and the baseline position, respectively.
But the text segment frame seems to start at origin.x = 5.0 when the text is empty. Is this some default starting offset for text segments? I can't find any mention of it anywhere.
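For context, here is roughly how I am calling it (sketch; textLayoutManager comes from an already configured TextKit 2 stack, and the printed labels are mine):

// Log the frame and baseline reported for each text segment in the document.
textLayoutManager.enumerateTextSegments(in: textLayoutManager.documentRange,
                                        type: .standard,
                                        options: []) { segmentRange, segmentFrame, baselinePosition, textContainer in
    print("segment:", segmentRange as Any,
          "frame:", segmentFrame,        // origin.x is 5.0 here when the text is empty
          "baseline:", baselinePosition)
    return true   // keep enumerating
}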
Using the screencapture CLI on macOS Sonoma 14.0 (23A344) results in a 72dpi image file, no matter if it was captured on a retina display or not.
For example, using
screencapture -i ~/Desktop/test.png in Terminal lets me create a selective screenshot, but the resulting file does not contain any DPI metadata (checked using mdls ~/Desktop/test.png), nor does the image itself have the correct DPI information (should be 144, but it's always 72; checked using Preview.app).
I noticed a (new?) flag option, -r, for which the documentation states:
-r Do not add screen dpi meta data to captured file.
Is that flag somehow set automatically? Setting it myself makes no difference and, as expected, results in a file with no DPI metadata and the wrong DPI in the image.
The only two ways I could get the correct DPI information into the resulting image file were using the default options (forced by -p): screencapture -i -p, and sending the capture to the clipboard: screencapture -i -c. Sadly, I can't use either of those in my case.
Feedback filed: FB13208235
I'd appreciate any pointers,
Matthias
Our macOS application has a single window which is occupied by an NSView-derived view. It has been working for the last ten years or so, but under the Sonoma beta, window updates are badly broken.
We rely on setNeedsDisplayInRect: to redisplay only the portions of the view that need to be redisplayed, but no matter how small a rectangle we specify, the entire window is repainted with the background colour before our drawRect: implementation is called. Our view already overrides isOpaque to return true, and in the past this was effective for suppressing the background fill, but it no longer seems to work (although I can confirm that it is still called along the way).
I've attached an image that shows an example of how a sample window looks after resizing (which is correct) and then what it looks like after using setNeedsDisplayInRect to invalidate the region occupied by the button in the centre. I've explicitly set the NSWindow background colour to blue to make it more obvious:
Is it still possible to inhibit the background fill? Repainting the entire view for every update is not really an option for us, for performance reasons.
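For reference, the pattern we rely on looks roughly like this (simplified Swift sketch of our Objective-C view; names are illustrative):

import AppKit

class CanvasView: NSView {
    // Returning true here used to suppress the window background fill for this view.
    override var isOpaque: Bool { true }

    override func draw(_ dirtyRect: NSRect) {
        // We expect to repaint only dirtyRect, not the whole bounds.
        NSColor.white.setFill()
        dirtyRect.fill()
        // ... draw just the invalidated content here ...
    }
}

// Elsewhere, we invalidate only the small region that changed, e.g.:
// canvasView.setNeedsDisplay(buttonFrame)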
I'm manually placing a subclass of NSView into the parent view using addSubview:positioned:relativeTo:. The dirtyRect passed to drawRect: is wildly incorrect. Can folks attempt to reproduce this and file bugs? This is awfully close to the release of Sonoma, and I feel like folks with bezier curves (or maybe other drawing code) in their NSView subclasses are going to experience problems.
To reproduce, place a view (I'm using an NSImageView) as a subview within a view.
Then, create a subclass of NSView and draw a bezier curve in its drawRect: method. Add an instance of this subclass as a subview of your original view. I'm offsetting the x value for clarity. When I build with Xcode 15 and run on Ventura or earlier, I get the correct result, and likewise if I build with Xcode 14.3 and run on Sonoma. However, when I build with Xcode 15 and run on the RC build of Sonoma, I get a whacky result.
I get something like (origin = (x = -264, y = -146), size = (width = 480, height = 388)) for the dirtyRect in the error case, while the rect is supposed to be (origin = (x = 0, y = 0), size = (width = 48, height = 48)) (I'm basing the frame of the new view on the original image.)
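A minimal version of what I'm doing looks roughly like this (sketch; the view names and coordinates are illustrative):

import AppKit

class CurveView: NSView {
    override func draw(_ dirtyRect: NSRect) {
        // Expected (0, 0, 48, 48); built with Xcode 15 on the Sonoma RC I see a huge negative-origin rect.
        Swift.print("dirtyRect:", dirtyRect)
        NSColor.systemRed.setStroke()
        let path = NSBezierPath()
        path.move(to: NSPoint(x: 0, y: 0))
        path.curve(to: NSPoint(x: 48, y: 48),
                   controlPoint1: NSPoint(x: 12, y: 40),
                   controlPoint2: NSPoint(x: 36, y: 8))
        path.lineWidth = 2
        path.stroke()
    }
}

// In the hosting controller, where imageView is the existing NSImageView subview:
let curveView = CurveView(frame: NSRect(x: imageView.frame.maxX + 10,
                                        y: imageView.frame.minY,
                                        width: 48, height: 48))
containerView.addSubview(curveView, positioned: .above, relativeTo: imageView)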
Thanks!
The Sonoma beta release notes mention that NSMenu was rewritten from scratch using AppKit; however, it seems a lot of behavior was removed along the way, which breaks applications. I've filed several reports using Feedback Assistant, but none of them were fixed in the three following betas:
FB12867496: NSMenu no longer receives keyboard events from GetEventDispatcherTarget (there is a workaround)
FB12867573: NSMenuItem custom view window is nil
FB12887219: NSMenu performSelector highlightItem doesn't highlight menu item
FB12938907: NSMenu not properly updated when adding/removing NSMenuItem
I wonder if anyone else has experienced similar problems and can share workarounds for them.
In Monterey, when a user was at the top of a ScrollView implemented inside an NSHostingController's view (that was itself embedded in a window with an NSToolbar), the window's toolbar background would be hidden until the user scrolled away from the top.
In Ventura, this behavior is different: the toolbar's background is visible all of the time unless a traditional NSScrollView is used (which means no SwiftUI).
Is there any way to change this behavior from within SwiftUI now?
Filed as rdar://FB11975037
When macOS Ventura is run as a guest OS within the virtualization framework, the main menu bar items will not be displayed correctly if VZMacGraphicsDisplayConfiguration defines a large resolution.
The menu bar titles appear to be using the same color as the menu bar itself. When the Appearance is set to Light, the menu bar items are effectively invisible. When the Appearance is set to Dark, the menu bar items are drawn in what looks like a disabled state.
This only affects the menu bar item titles on the left-hand side. The date-time and menu bar icons on the right side are always displayed in the correct color.
This appears to be a regression in macOS Ventura as this issue is not present in macOS 12 running as a guest.
This bug can be easily reproduced using Apple's own Virtualization sample code titled: "Running macOS in a Virtual Machine on Apple Silicon Macs"
Steps to reproduce:
Follow the sample code instructions for building and installing a VM.bundle.
Before running 'macOSVirtualMachineSampleApp', change the VZMacGraphicsDisplayConfiguration to use the following values (see the sketch after these steps):
width = 5120,
height = 2880,
ppi = 144.
Run 'macOSVirtualMachineSampleApp' and notice that the menu bar titles on the left side of the screen are not correctly drawn in the guest instance.
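The change in step 2 looks roughly like this (sketch based on the sample project's graphics configuration; the exact surrounding code differs):

import Virtualization

let graphicsConfiguration = VZMacGraphicsDeviceConfiguration()
graphicsConfiguration.displays = [
    // The large resolution that triggers the menu bar issue in the guest.
    VZMacGraphicsDisplayConfiguration(widthInPixels: 5120,
                                      heightInPixels: 2880,
                                      pixelsPerInch: 144)
]
// virtualMachineConfiguration.graphicsDevices = [graphicsConfiguration]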
This has been tested on:
Host: macOS 13.1
Guest: macOS 13.x (All versions)
Hardware: MBP 14" M1 Pro 32GB/2TB
Is there anything that can be done to resolve this issue?
//
// AppDelegate.swift
// HelloCocoa
//

import Cocoa

@main
class AppDelegate: NSObject, NSApplicationDelegate {

    func applicationDidFinishLaunching(_ aNotification: Notification) {
        let myAlert = NSAlert()
        myAlert.messageText = "Alert Title"
        let messageAttributedString = NSAttributedString(string: "Hello,world",
                                                         attributes: [.font: NSFont.systemFont(ofSize: 12, weight: .bold)])
        let myTextField = NSTextField(labelWithAttributedString: messageAttributedString)
        myTextField.allowsEditingTextAttributes = true
        myTextField.isSelectable = true
        myAlert.accessoryView = myTextField
        myAlert.runModal()
    }

    func applicationWillTerminate(_ aNotification: Notification) {
        // Insert code here to tear down your application
    }

    func applicationSupportsSecureRestorableState(_ app: NSApplication) -> Bool {
        return true
    }
}
The alert appears like this:
but when I click on the text field, the text's color becomes black:
Adding a .foregroundColor key to the attributes dictionary works for me, but I would really like to know why NSTextField behaves this way.
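For reference, the workaround I mentioned is just adding an explicit color to the attributes (sketch):

let messageAttributedString = NSAttributedString(
    string: "Hello,world",
    attributes: [
        .font: NSFont.systemFont(ofSize: 12, weight: .bold),
        .foregroundColor: NSColor.labelColor   // an explicit color stops the text turning black on click
    ])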
Hi,
Inside a Mac Catalyst app, I need to display a popover anchored to an NSToolbarItem contained in the app's toolbar (like the Apple Maps Mac app does, see the image below).
To do that, when I press the button I need to find the toolbar item's view and use it as the popover anchor.
How can I find the view or frame of an NSToolbarItem on Mac Catalyst?
A property that could help me is the NSToolbarItem "view" property (an NSView), but that property has been marked as unavailable in Mac Catalyst.
Any idea?
Thank you
I am hitting major roadblocks in migrating one of my Objective-C Cocoa applications away from -[NSView (un)lockFocus] and -[NSBitmapImageRep initWithFocusedViewRect:].
In a transcript of a presentation on WWDC2018 I read:
With our changes to layer backing, there's a few patterns I want to call out that aren't going to work in macOS 10.14 anymore. If you're using NSView lockFocus and unlockFocus, or trying to access the window's graphics contents directly, there's a better way of doing that. You should just subclass NSView and implement draw rect. ...
Of course, we have all implemented -[NSView drawRect:] for decades now. The big question is: how can we do incremental (additional, event-driven) drawing in our views without redrawing the whole view hierarchy? This is the use case of -(un)lockFocus, especially when drawing the base view is computationally expensive. Who would have thought that people use -(un)lockFocus for regular drawing of the NSView hierarchy.
I tried to get by with CALayer, only to find out after two days of experimenting with it that a sublayer can only be drawn if the (expensive) main layer has been drawn before -> a dead-end road.
Now I am going to implement a context-dependent -[NSView drawRect:]. Based on a respective instance variable, either the (expensive) base presentation of the view or the simple additions are drawn. Is that what Apple meant by "… just subclass NSView and implement draw rect"?
From the point of view of object-oriented programming, using switch() in methods to change the behaviour of an object is ugly, to say the least. Any better options?
Ugly or not, in any case I don't want to redraw the whole view hierarchy only for moving a crosshairs in a diagram.
My actual use case is:
This application draws electrochemical measurement curves, which may consist of a few thousand up to millions of data points, into a custom diagram NSView. The diagram view provides a facility for moving crosshairs and other pointing aids over the displayed curves, by dragging/rolling with the mouse or the trackpad, or by moving them point by point with the cursor keys.
Diagram generation is computationally expensive, and it must not be triggered merely because the crosshairs should move to the next data point.
So for navigating the crosshairs (and other pointing aids), a respective method locks focus on said view, restores the background from a cache, caches the background below the new position of the crosshairs using -[NSBitmapImageRep initWithFocusedViewRect:], draws the crosshairs and finally unlocks focus.
None of this works anymore since 10.14.
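To make the idea concrete, the cached-background, context-dependent drawing I have in mind might look something like this (sketch in Swift for brevity; my app is Objective-C and the names are illustrative):

import AppKit

class DiagramView: NSView {
    private var cachedDiagram: NSImage?            // expensive rendering, regenerated only when the data changes
    var crosshairsPosition: NSPoint = .zero {
        didSet {
            // Invalidate only the old and new crosshairs rects, not the whole diagram.
            setNeedsDisplay(crosshairsRect(at: oldValue))
            setNeedsDisplay(crosshairsRect(at: crosshairsPosition))
        }
    }

    override func draw(_ dirtyRect: NSRect) {
        if cachedDiagram == nil {
            cachedDiagram = renderDiagramImage()   // the expensive part, done once
        }
        cachedDiagram?.draw(in: bounds, from: .zero, operation: .sourceOver, fraction: 1.0)
        drawCrosshairs(at: crosshairsPosition)     // the cheap, event-driven overlay
    }

    private func crosshairsRect(at point: NSPoint) -> NSRect {
        NSRect(x: point.x - 10, y: point.y - 10, width: 20, height: 20)
    }
    private func renderDiagramImage() -> NSImage {
        // ... render the measurement curves into an image ...
        return NSImage(size: bounds.size)
    }
    private func drawCrosshairs(at point: NSPoint) {
        // ... stroke the crosshairs at the current position ...
    }
}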
I'm trying to implement a custom NSTextContentManager and use it with NSTextView; however, it seems that NSTextView expects NSTextContentStorage all the time.
final class MyTextContentManager: NSTextContentManager {
// ...
}
It's added to the layout manager, and the NSTextView instance finds it properly:
let textContentManager = MyTextContentManager()
textContentManager.addTextLayoutManager(textLayoutManager)
However, when I use it, I see errors like this:
[MyTextContentManager textStorage]: unrecognized selector sent to instance 0x600003d84870
The textStorage property is part of NSTextStorageObserving, which is not part of the NSTextContentManager interface.
It looks like NSTextView is not ready to work with a custom NSTextContentManager. What did I miss?
Is it possible to only allow a single window instance on macOS?
WindowGroup/DocumentGroup allow the user to create multiple instances of a window. I'd like to allow only one, for an onboarding sequence.
I've checked the Scene documentation - https://developer.apple.com/documentation/swiftui/scene, and it appears the only types conforming to the Scene protocol are WindowGroup, DocumentGroup and Settings. How can I create a single Window in a SwiftUI App?
An example use case:
struct TutorialScene: Scene {
    var body: some Scene {
        // I don't want to allow multiple windows of this Scene!
        WindowGroup {
            TutorialView()
        }
    }
}
We have a test tool our engineers use to launch various versions of our application during development and verification. Each daily build of our application is stored on a server, and each pushed change also generates a new build that is stored on a server. These are added to a database, and the developer application queries the server via REST to find the desired version, retrieves a server path and launches the application. This tool is valuable for finding pushes that introduced regressions.
The developer application (Runner) uses launchApplicationAtURL:options:configuration:error: (deprecated, I know) to launch the app. Prior to Catalina this worked great. However, as of Catalina, launching takes a VERY long time because the app needs to be "verified": the app seems to be copied to the user's machine and verified. This only occurs on the first launch, but since the users mostly run new push or daily builds, it has made the tool nearly useless. With the new remote work environment it is even worse, as copying over VPN can take forever.
I have switched to using NSTask with a shell script to open the executable inside the bundle. If I add the developer tool (Runner) to Developer Tools in Privacy, this seems to launch the application without the need for verification. However, this just seems wrong. It also provides little feedback about when the application is up and running, which makes for a poor user experience. In addition, many of the systems we use this tool on for verification are VMs that do not have Developer Tools installed.
Is there a way for me to use launchApplicationAtURL:options:configuration:error: (or the new openApplicationAtURL:configuration:completionHandler:) to launch these versions of the application without the lengthy verification process? Adding our application to Developer Tools did not seem to help.
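For reference, the modern call I'm referring to looks roughly like this (Swift sketch; appURL is the server path of the chosen build, and the configuration options shown are only the ones I have been experimenting with):

import AppKit

let appURL = URL(fileURLWithPath: "/Volumes/Builds/MyApp-build-1234.app")   // illustrative path
let configuration = NSWorkspace.OpenConfiguration()
configuration.createsNewApplicationInstance = true   // each requested version runs as its own instance

NSWorkspace.shared.openApplication(at: appURL, configuration: configuration) { runningApp, error in
    if let error = error {
        print("Launch failed:", error)
    } else if let runningApp = runningApp {
        print("Launched pid", runningApp.processIdentifier)
    }
}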