Hi all,
I was wondering if anyone knows a way to change the brightness of a MacBook's built-in screen in Swift, without using an overlay that changes the colours.
I want the same effect as pressing the F1 and F2 brightness keys, but without System Events/AppleScript popping up windows on the screen.
UIScreen.brightness is close to what I want, but there is no equivalent on NSScreen, and I can't figure out a way to do it with IOKit either.
Tools like ddcctl don't work because the screen is not an external monitor.
If there is a solution using Swift or terminal commands any help is much appreciated.
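For reference, the kind of IOKit approach I mean is the old IODisplayConnect route, roughly sketched below. It is only a sketch of the deprecated API, not a working solution: the display-parameter API has been deprecated for years, and the IODisplayConnect service doesn't appear to exist for the built-in display on Apple Silicon Macs.

import IOKit
import IOKit.graphics

// Sketch of the legacy IOKit brightness call (deprecated). Assumes an
// IODisplayConnect service is published, which is not the case on Apple Silicon,
// so treat this as a reference for the old API only.
func setBuiltInDisplayBrightness(_ level: Float) -> Bool {
    // kIOMainPortDefault requires macOS 12+; use kIOMasterPortDefault on older systems.
    let service = IOServiceGetMatchingService(kIOMainPortDefault,
                                              IOServiceMatching("IODisplayConnect"))
    guard service != 0 else { return false }
    defer { IOObjectRelease(service) }

    let key = "brightness" as CFString   // the value of kIODisplayBrightnessKey
    return IODisplaySetFloatParameter(service, 0, key, level) == 0   // 0 == kIOReturnSuccess
}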
Thanks,
James
I am trying to get the list of printers using NSPrinter.printerNames, but it doesn't return any items.
static func getPrinterList() -> [(name: String, isAvailable: Bool)] {
    let printerNames = NSPrinter.printerNames
    return printerNames.map { name in
        let isAvailable = NSPrinter(name: name) != nil
        return (name: name, isAvailable: isAvailable)
    }
}
printerNames comes back as an empty string array.
I checked the printer settings (Printers & Scanners), and there is a printer device listed.
Do I need to set something else?
All the threads contain only system calls. The crashed thread contains a single call into my app's code, at main.swift:12.
What could cause such a crash?
Crash report
I'm building a macOS application where the UI has a blurry, semi-transparent background, somewhat see-through like what Apple uses for the macOS Control Center, and I'm using NSVisualEffectView from AppKit to do so. My question is: does the Xcode simulator accurately portray translucent UI views? In other words, is it "what you see is what you get", or will the vibrancy be more pronounced when the app is run on my machine itself?
import SwiftUI
import AppKit

struct EffectView: NSViewRepresentable {
    var material: NSVisualEffectView.Material = .contentBackground // blurry background to cover the whole UI
    var blendUI: NSVisualEffectView.BlendingMode = .behindWindow // applying the blurry background to the UI window itself
    var allowsVibrancy: Bool = true

    func makeNSView(context: Context) -> NSVisualEffectView {
        let blurryEffectView = NSVisualEffectView()
        blurryEffectView.material = material
        blurryEffectView.blendingMode = blendUI
        return blurryEffectView // return the configured view, not a fresh NSVisualEffectView()
    }

    func updateNSView(_ nsView: NSVisualEffectView, context: Context) { // _ added, no argument label needed
        nsView.material = .contentBackground
        nsView.blendingMode = blendUI
    }
}
struct ContentView: View {
    var body: some View {
        ZStack {
            EffectView(material: NSVisualEffectView.Material.contentBackground, blendUI: NSVisualEffectView.BlendingMode.behindWindow)
        }
    }
}
If anyone here is familiar with Cocoa, can you please help with the following issue: how can I make an NSWindow resize based on its contentView? Here is a video of the problem: https://drive.google.com/file/d/19LN98xdF9OLcqRZhMJsGGSa0dgMvj_za/view?usp=sharing
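One possible direction, as a minimal sketch assuming the content view's size is fully driven by Auto Layout constraints (not a confirmed fix for what the video shows):

import AppKit

// Resize the window so its content area matches the content view's fitting size.
func sizeWindowToFitContent(_ window: NSWindow) {
    guard let contentView = window.contentView else { return }
    contentView.layoutSubtreeIfNeeded()              // resolve constraints first
    window.setContentSize(contentView.fittingSize)   // fittingSize is derived from the constraints
}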
Thanks!
I'm trying to make macOS VoiceOver read some text in the parent tab view when the child tab view changes tabs. VoiceOver always reads the first text entry in the child sub tab, ignoring my attempts to switch where the focus is.
I've tried these things (in the examples, textItem is a member of the parent tab view class):
.setAccessibilityApplicationFocusedUIElement(textItem)
.setAccessibilityFocused(true)
Each sub tab is a view controller loaded from a storyboard and I've added code in viewDidAppear to set the accessibility focus. I've also tried using a notification to the parent tab view to set the accessibility focus at the end of the sub tab's viewDidAppear.
Nothing seems to work. Is there a way to actually change the currently focused accessibility UI element programmatically? This needs to work on macOS 13 and later.
Here is a rough layout of what I'm trying to accomplish. When the user selects "sub tab 2", I want the text "Text to read first" to get the focus and have VoiceOver read it. What actually happens is that VoiceOver reads the contents of the sub tab "Feature Name".
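One workaround worth trying (a sketch only, not a confirmed fix): instead of moving accessibility focus, post an announcement from the sub tab's viewDidAppear so VoiceOver speaks the text directly. The helper below is mine; textItem is the parent tab view's text element mentioned above.

import AppKit

// Sketch: ask VoiceOver to announce a string directly rather than changing focus.
func announceForVoiceOver(_ text: String, from element: Any) {
    NSAccessibility.post(element: element,
                         notification: .announcementRequested,
                         userInfo: [.announcement: text,
                                    .priority: NSAccessibilityPriorityLevel.high.rawValue])
}

// e.g. from the sub tab's viewDidAppear:
// announceForVoiceOver("Text to read first", from: textItem)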
In an app I bound an NSArrayController's content array to an NSUserDefaultsController like this:
[_arrayController bind:NSContentArrayBinding toObject:_userDefaultsController withKeyPath:@"values.anArrayOfDictionaries" options:nil];
This works as expected. If I add objects to the array controller or remove them, the user defaults are updated accordingly. But if I modify the content array in any other way, this seems not to trigger KVO and the user defaults are not updated. I tried various combinations of:
[self.arrayController willChangeValueForKey:...];
...
[self.arrayController didChangeValueForKey:...];
but it did not work. So I ended up doing the following:
[_userDefaults setObject:[_arrayController content] forKey:anArrayOfDictionaries];
This works but I am pretty sure there must be a better way to get the changes I made to the array controller's content array into the user defaults.
Thanks in advance for your help.
Regards,
Marc
Hi there,
I have to achieve the following scenario:
Track system events on macOS for shutdown and restart, and update a plist with that event via a LaunchAgent.
I have tried the following code in the LaunchAgent:
import AppKit
import os.log

class MyAgent: NSObject { // NSObject subclass so the #selector target works

    // Assumed logger; the original snippet referenced an undefined 'log'.
    private let log = OSLog(subsystem: "com.example.agent", category: "power")

    override init() {
        super.init()
        let notificationCenter = NSWorkspace.shared.notificationCenter
        // Register for system shutdown notification
        notificationCenter.addObserver(self,
                                       selector: #selector(handleNotification(_:)),
                                       name: NSWorkspace.willPowerOffNotification,
                                       object: nil)
        RunLoop.current.run() // keep the agent alive to receive notifications
    }

    @objc func handleNotification(_ notification: Notification) {
        var logMessage = ""
        switch notification.name {
        case NSWorkspace.willPowerOffNotification:
            os_log("System is going to shut down", log: log, type: .default)
            updatePlistFile(event: "shut down")
            let fileName = "example.txt"
            let content = "shut down"
            createAndWriteFile(fileName: fileName, content: content)
            logMessage = "System is going to shut down at \(Date())\n"
        default:
            break
        }
    }
}
I loaded the agent and tried to restart the device, but I can't see handleNotification being called.
The same code works fine from a sample application, but not from the LaunchAgent.
Is there any restriction on NSWorkspace? If so, how can I track shutdown/restart events from a LaunchAgent or LaunchDaemon?
Any help will be appreciated.
From time to time I see the error 'FAILED TO REGISTER PROCESS WITH CPS/CoreGraphics in WindowServer' when CGImageForProposedRect is called immediately after launch:
https://developer.apple.com/documentation/appkit/nsimage/1519861-cgimageforproposedrect
_RegisterApplication(), FAILED TO REGISTER PROCESS WITH CPS/CoreGraphics in WindowServer, err=-50
This error appears only once, during startup. What does it mean?
Thank you in advance!
I have a text field in the accessory view of an NSSavePanel. For user convenience there are default actions supported natively by macOS (such as pressing Enter as a key equivalent). However, this doesn't work for Enter under Sonoma; the Escape key works. Is an Enter keypress considered dangerous for malicious actors, so it's not supported? I have a workaround below, but I am not confident that I'm not violating the sandbox (future-proofing).
Original code demonstrating the issue:
class ViewController: NSViewController, NSTextFieldDelegate, NSControlTextEditingDelegate {
    let savePanel = NSSavePanel()

    override func viewDidLoad() {
        super.viewDidLoad()
        let customView = NSView()
        let textField = NSTextField(string: "11111111")
        textField.delegate = self // to get focus using tab keypress
        savePanel.accessoryView = textField
    }

    override func viewWillAppear() {
        savePanel.runModal()
    }
}
Workaround:
// variable set to true in the delegate method controlTextDidEndEditing
var didUseTextFieldWithEnterPressed = false

override func performKeyEquivalent(with event: NSEvent) -> Bool {
    if #unavailable(macOS 14) {
        return super.performKeyEquivalent(with: event)
    }
    // kVK_Return and kVK_ANSI_KeypadEnter come from Carbon.HIToolbox
    guard let panel, didUseTextFieldWithEnterPressed == true, event.type == .keyDown &&
            (event.keyCode == UInt16(kVK_Return) || event.keyCode == UInt16(kVK_ANSI_KeypadEnter)) else {
        return super.performKeyEquivalent(with: event)
    }
    return panel.performKeyEquivalent(with: event)
}
I need to get the icon image of running applications in a daemon.
I have found the method iconForFile.
[[NSWorkspace sharedWorkspace] iconForFile: bundlePath];
However, as far as I know, the AppKit framework is not daemon-safe:
https://developer.apple.com/library/archive/technotes/tn2083/_index.html
So the only way I can see is to get the icon file path by parsing Info.plist.
However, the icon is not defined for some system apps, e.g.:
/System/Applications/Calendar.app
/System/Applications/System Settings.app
Is there any way to get the icons of system applications in daemon code?
Is it safe to use NSBundle in daemon code?
Thank you in advance.
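For what it's worth, here is a minimal sketch of the Info.plist route described above, using Foundation only (so it should be daemon-safe). As noted, many system apps keep their icon in Assets.car, in which case CFBundleIconFile is simply absent and this returns nil.

import Foundation

// Sketch: resolve the .icns path from an app bundle's Info.plist.
func iconPath(forAppAt bundlePath: String) -> String? {
    guard let bundle = Bundle(path: bundlePath),
          let iconFile = bundle.object(forInfoDictionaryKey: "CFBundleIconFile") as? String else {
        return nil // no standalone icon declared (e.g. icon lives in Assets.car)
    }
    let iconName = iconFile.hasSuffix(".icns") ? iconFile : iconFile + ".icns"
    return bundle.url(forResource: iconName, withExtension: nil)?.path
}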
I just noticed that NSCursor.currentSystemCursor is deprecated starting in macOS 15. What method should I now use to get information about the current system cursor, including its image?
Hello,
Currently my macOS application registers itself as a login item in the AppDelegate's applicationDidFinishLaunching method (see code below).
However, I'm running into a problem: if the user is auto-upgraded (an internal third-party implementation), the .pkg postinstall script runs, and its last step launches the GUI application. Because of this, if a user unselects our app as a login item, it will add itself back when it is relaunched. I have checked the SMAppService statuses (.enabled, .notRegistered, .notFound) and discovered that when a user disables the app as a login item, the status is returned as .notFound. I'm trying to find a way to detect whether the user previously removed our app from login items and, in that case, not register it again, while still registering it the first time the user opens the app. Would checking whether the status is .notRegistered work for a first-time install? What should I do differently?
func applicationDidFinishLaunching(_ aNotification: Notification) {
    ...
    guard !Runtime.isDebug else {
        self.logger.debug("Detected Xcode host; Skipping installation of helper components.")
        return
    }
    self.logger.info("Setting UI login item")
    if mainApp.status != .enabled { // old code, incorrect. What should go here?
        do {
            try mainApp.register()
        } catch {
            logger.error("Failed to initialize UI login item: \(error.localizedDescription)")
        }
    }
}
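A sketch of one possible check, based only on the statuses described above (this is an assumption, not documented behavior): register only when the status is .notRegistered, so a first launch opts the app in, but a login item the user removed (which I observed as .notFound) is left alone.

switch mainApp.status {
case .notRegistered:
    // First launch on this machine: register the login item.
    do {
        try mainApp.register()
    } catch {
        logger.error("Failed to initialize UI login item: \(error.localizedDescription)")
    }
case .notFound:
    // Observed when the user removed the app from Login Items; respect that choice.
    break
case .enabled, .requiresApproval:
    break
@unknown default:
    break
}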
For macOS I wrote a Terminal command-line tool to help me do number-crunching calculations; there is no need for a UI. This works great as-is, but there are opportunities to make the work more interesting by giving it some graphical representation, like a heat map of the data. To that end, I want the tool, given an optional command-line parameter, to launch SwiftUI code that opens a window displaying this visual representation of the data. It would also be nice to interact with that window in some ways using SwiftUI.
I originally thought it would be as easy as creating a SwiftUI View file, throwing what normally appears under @main into a launch function, and then calling that function from my main.swift file:
import SwiftUI

struct SwiftUIView: View {
    var body: some View {
        Text("Hello, World!")
    }
}

#Preview {
    SwiftUIView()
}

func LaunchSwiftUIApp() {
    //@main
    struct SwiftUIApp: App {
        var body: some Scene {
            WindowGroup {
                SwiftUIView()
            }
        }
    }
}
Of course this doesn't work.
I already have the command-line code spit out various PNG files, so I'm looking to make those visualizations a little more interactive or readable/organized by coding some SwiftUI around the current visualizations.
Looking around, this doesn't seem to be something people normally do. I'm not sure how to set up the Terminal command-line tool code that I wrote so it can optionally launch into SwiftUI code.
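One direction that seems plausible (a sketch under my own assumptions, not a known-good recipe): skip the App/@main scene machinery entirely, create an NSApplication from main.swift when the visualization flag is passed, and host the SwiftUI view in an NSWindow via NSHostingView. The function name and window title below are hypothetical.

import AppKit
import SwiftUI

// Sketch: show the SwiftUIView defined above in a window from a command-line tool.
// Call this from main.swift when the optional visualization flag is present.
func launchVisualizationWindow() {
    let app = NSApplication.shared
    app.setActivationPolicy(.regular)          // give the tool a Dock icon and key-window ability

    let window = NSWindow(contentRect: NSRect(x: 0, y: 0, width: 600, height: 400),
                          styleMask: [.titled, .closable, .resizable],
                          backing: .buffered,
                          defer: false)
    window.title = "Heat Map"
    window.contentView = NSHostingView(rootView: SwiftUIView())
    window.center()
    window.makeKeyAndOrderFront(nil)

    app.activate(ignoringOtherApps: true)
    app.run()                                   // blocks until the app is terminated
}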
Here is the project I am researching: https://developer.apple.com/documentation/hiddriverkit/handling_stylus_input_from_a_human_interface_device
I have a touch screen and I want to:
1. Control this screen. I can enable the extension and control the touch screen successfully.
2. Use a pen on this screen. To do this, I need to send X,Y to my StylusApp (AppKit), but I cannot send X,Y to the Swift StylusApp.
Even though I can log X,Y with the default code, I don't know how to send it to the AppKit app for the pen.
I have researched communication between a driver and a client app: https://developer.apple.com/documentation/driverkit/communicating_between_a_driverkit_extension_and_a_client_app#4324306 but it seems HID drivers don't support inheriting from IOUserClient.
So with SKStoreReviewController now deprecated... I'm wondering what API is recommended for UIKit apps?
In the recent macOS Sequoia beta 3, an NSTextAttachment initialized with custom data and an image cell as its attachmentCell is shown as a file icon instead of the image cell.
I am creating an NSAttributedString and showing it in an NSTextView like this:
NSTextAttachment *attachment = [[NSTextAttachment alloc] initWithData:[text dataUsingEncoding:NSUTF8StringEncoding] ofType:nil];
attachment.attachmentCell = [[NSTextAttachmentCell alloc] initImageCell:img];
NSMutableAttributedString *res = [[NSMutableAttributedString alloc] initWithAttributedString:[NSAttributedString attributedStringWithAttachment:attachment]];
...
I have a bundled macOS application. This is a non-interactive application where I'm performing some task on a worker thread while the main thread waits for this task to be completed. Sometimes this task can be time-consuming.
I have observed that when I run the application from the bundle (double-click or the open command), the OS marks my application as not responding (this is evident as the app icon appears in the Dock and is then flagged as not responding).
However, if I run the Unix executable inside the bundle directly, the app runs and I never see the not-responding status.
I wanted to understand: is this happening because my main thread is in a waiting state? If so, what could I do to resolve it, given that my application logic requires the main thread to wait for the worker thread to complete its task? Is there some way to use an event loop like GCD?
Note: I cannot use the AppKit delegate event loop because my application will run in a non-GUI context.
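A minimal sketch of the GCD direction mentioned above (assuming the work can be started asynchronously; performLongRunningTask() is a placeholder for the app's real work, the queue label is hypothetical, and this is not a verified fix for the not-responding status):

import Foundation

// Sketch: run the long task off the main thread and keep the main run loop alive
// instead of blocking the main thread with a wait.
let workQueue = DispatchQueue(label: "com.example.worker", qos: .userInitiated)

workQueue.async {
    performLongRunningTask()      // placeholder for the real work
    DispatchQueue.main.async {
        exit(0)                    // finish up on the main thread and exit
    }
}

RunLoop.main.run()                 // main thread services its run loop while waiting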
When using the new addSymbolEffect method on NSImageView with the .rotate.byLayer parameter and an applicable SF Symbols 6 symbol, the resulting animation is not entirely as expected, to put it mildly.
This is the code line I use:
imageView.addSymbolEffect(.rotate.byLayer, options: .repeat(.continuous), animated: true)
The correct layer rotates around the correct anchor point, but the whole image is moving up and down.
The same code with the same symbol in iOS 18 beta runs perfectly.
Does anyone know how to get this new rotate API correctly working in macOS 15 beta?
In case an Apple engineer reads this:
FB13916874 contains example projects for macOS (wobbling rotation) and iOS (perfect rotation), and a screen recording what I see in macOS 15 beta.
In one of my apps, I use NSTableViewDiffableDataSource in tandem with NSFetchedResultsController, which provides the necessary snapshots. This works great in macOS 14.5.
However, in the latest macOS 15 betas, NSTableViewDiffableDataSource no longer calls the 'cellProvider' completion handler when the data of a visible cell changes. When data of a visible cell changes, the didChangeContentWith method of NSFetchedResultsController is called correctly, but applying the provided snapshot doesn't result in the 'cellProvider' handler being called. This looks like a rollback to the early state of this API in 2020.
It concerns this piece of code:
dataSource = NSTableViewDiffableDataSource(tableView: tableView, cellProvider: { (tableView, tableColumn, row, item) -> NSView in
    // Return a cell view with data
})
Does anyone know a solution or workaround to get animated updates of visible cells working in macOS 15 beta?
Yes, applying the snapshot without animation works, but that's not what NSTableViewDiffableDataSource is designed for.
In case an Apple engineer reads this:
Looking at the sample code associated with FB13931189, is there anything wrongly implemented that prevents calling the 'cellProvider' method for visible cells?
Is this perhaps actually a bug of NSFetchedResultsController? I’m asking this because NSCollectionViewDiffableDataSource does have a very similar problem (FB13943853).
PS
Yes, this post looks very similar to https://developer.apple.com/forums/thread/759381#759381021, because the problem is identical, except that that thread concerns NSCollectionViewDiffableDataSource.