For macOS I wrote a Terminal command-line tool to help me do number-crunching calculations. No need for a UI. This works great as-is, but there are opportunities to make the work more interesting by giving it some graphical representation, like a heat map of the data. To that end, I want the tool, given an optional command-line parameter, to launch SwiftUI code that opens a new window displaying this visual representation of the data. It would also be nice to interact with that visualization in some ways using SwiftUI.
I originally thought it would be as easy as creating a SwiftUI View file, throwing what normally appears under @main into a launch function, and then calling that launch function from my main.swift file:
import SwiftUI

struct SwiftUIView: View {
    var body: some View {
        Text("Hello, World!")
    }
}

#Preview {
    SwiftUIView()
}

func LaunchSwiftUIApp() {
    //@main
    struct SwiftUIApp: App {
        var body: some Scene {
            WindowGroup {
                SwiftUIView()
            }
        }
    }
}
Of course this doesn't work.
I already have the command-line code spit out various PNG files, so I'm looking to make those visualizations a little more interactive and readable/organized by coding some SwiftUI around what I'm currently producing. Anyway.
Looking around, this doesn't seem to be a thing people normally do. I'm not sure how to set up the Terminal command-line tool code that I wrote so that it can optionally launch into SwiftUI code.
Looking around this doesn't seem to be a thing people normally do.
Right. It just doesn’t fit well into the Mac app lifecycle. Ignoring the command-line tool aspect of this, it’s possible to start a Mac app from Terminal but the end result behaves weirdly. For example, if you quit Terminal that closes the window running the shell running your app, and your app ends up being terminated.
A more standard model is to have an app and a tool. The tool does tool-y things and then, if the user requests UI work, it passes that request to the app via some sort of IPC mechanism. A good example of this is the BBEdit app and its corresponding bbedit command-line tool.
The easiest way to implement this is to create an app with a custom document type. Then have your tool create such a document and use NSWorkspace to open it in the app.
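As a rough sketch of the tool side, assuming the viewer app has registered a custom document type for its data files (the file path, .heatmap extension, and payload below are made-up placeholders):

import AppKit
import Foundation

// Write the crunched results somewhere the viewer app can read them.
// ".heatmap" is a hypothetical extension owned by the viewer app.
let documentURL = URL(fileURLWithPath: "/tmp/results.heatmap")
let payload = Data("placeholder results".utf8)        // stand-in for the real data
do {
    try payload.write(to: documentURL)
} catch {
    print("Could not write results: \(error)")
    exit(EXIT_FAILURE)
}

// Ask Launch Services to open the document in whatever app owns that type.
if !NSWorkspace.shared.open(documentURL) {
    print("Could not open \(documentURL.path) in the viewer app")
}

Because the document type belongs to the app, Launch Services launches the app if it isn't already running, so the tool never has to manage the app's lifecycle itself.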
Notwithstanding the above, it is possible to create a single executable that functions as an app and a tool, but it’s tricky to get right:
- You have to put the executable in an app structure, so that when the UI frameworks come up they know what to do.
- Using the SwiftUI app lifecycle is probably not going to work. Rather, stick with the AppKit lifecycle. You can still use SwiftUI for all your views.
- And don't use @main on your app delegate but instead have a main function that, when it wants to run the app, invokes NSApplicationMain (there's a sketch of this after the list).
- Use a command-line argument to tell the executable to run in tool mode.
- When running in tool mode, don't touch any UI frameworks because they can cause your process to connect to the window server, which causes it to show up in the Dock O-:
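To make those points concrete, here's a hedged sketch of what main.swift might look like in the single-executable case. The --tool flag, the crunchNumbers() routine, and HeatMapView are all placeholders, and the executable would still need to live inside a proper app bundle:

import AppKit
import SwiftUI

// SwiftUI is still fine for the views; only the app lifecycle is AppKit.
struct HeatMapView: View {                            // stand-in for the real visualization
    var body: some View {
        Text("Heat map goes here")
            .frame(minWidth: 400, minHeight: 300)
    }
}

final class AppDelegate: NSObject, NSApplicationDelegate {
    private var window: NSWindow?

    func applicationDidFinishLaunching(_ notification: Notification) {
        // Host the SwiftUI view in an AppKit window.
        let window = NSWindow(contentViewController: NSHostingController(rootView: HeatMapView()))
        window.title = "Heat Map"
        window.makeKeyAndOrderFront(nil)
        self.window = window
        NSApp.activate(ignoringOtherApps: true)
    }
}

// Top-level code in main.swift: pick a mode before touching NSApplication.
if CommandLine.arguments.contains("--tool") {
    // Tool mode: run the existing number crunching and exit without
    // ever instantiating anything from the UI frameworks.
    // crunchNumbers()                               // assumed existing entry point
    exit(EXIT_SUCCESS)
} else {
    // App mode: install the delegate and hand control to AppKit.
    // NSApplicationMain never returns.
    let delegate = AppDelegate()
    NSApplication.shared.delegate = delegate
    _ = NSApplicationMain(CommandLine.argc, CommandLine.unsafeArgv)
}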
This complexity, and the resulting app lifecycle weirdness, explains why folks typically use the model I outlined above.
Share and Enjoy
—
Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"