I come from a C/C++, Python, and JavaScript background and recently started learning macOS development using Swift.
I want simple IPC (Inter-Process Communication) between two processes; let's call them the server process and the client process.
I could not find any simple example showing the use of XPC for IPC, so let me lay out my thinking and what I am trying to build.
Suppose I have server_process.swift
// server_process.swift: this is the XPC server
import Foundation

func run_server_loop() {
    // I guess I have to use NSXPCListener here, e.g. an XPC service
}

func on_message(_ message: String) {
    // I have received a message from client_process.swift;
    // do whatever can be done with this message and
    // send a reply back to client_process.swift
    if message == "ping" {
        send("pong")
    } else {
        send("command not supported")
    }
}

// run the loop to start the listener
run_server_loop()
I would need client_process.swift
// client_process.swift: this is the XPC client
import Foundation

func some_way_to_connect() {
    // some way to connect to the server process
}

// connect to the server that is running in server_process.swift
let client = some_way_to_connect()

// send a message to the server
client.send("ping")
I have put my thoughts in pseudocode above.
How can I achieve this kind of communication using XPC in Swift?
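Is something like the sketch below the right direction? PingProtocol, PingService, and the Mach service name com.example.pingserver are placeholder names I made up, and I understand the server side would probably also need to be registered with launchd (or shipped as an XPC service bundle) for the client to reach it, so please correct me if this is off.

import Foundation

// Shared protocol: both processes must see this exact @objc declaration.
@objc protocol PingProtocol {
    func send(_ message: String, reply: @escaping (String) -> Void)
}

// ---------- server_process.swift ----------

class PingService: NSObject, PingProtocol {
    func send(_ message: String, reply: @escaping (String) -> Void) {
        // Answer "ping" with "pong", reject anything else.
        reply(message == "ping" ? "pong" : "command not supported")
    }
}

class ServiceDelegate: NSObject, NSXPCListenerDelegate {
    func listener(_ listener: NSXPCListener,
                  shouldAcceptNewConnection newConnection: NSXPCConnection) -> Bool {
        // Export a PingService object over every incoming connection.
        newConnection.exportedInterface = NSXPCInterface(with: PingProtocol.self)
        newConnection.exportedObject = PingService()
        newConnection.resume()
        return true
    }
}

let delegate = ServiceDelegate()
let listener = NSXPCListener(machServiceName: "com.example.pingserver") // placeholder name
listener.delegate = delegate
listener.resume()
RunLoop.main.run()

// ---------- client_process.swift ----------

let connection = NSXPCConnection(machServiceName: "com.example.pingserver", options: [])
connection.remoteObjectInterface = NSXPCInterface(with: PingProtocol.self)
connection.resume()

if let proxy = connection.remoteObjectProxy as? PingProtocol {
    proxy.send("ping") { answer in
        print("server replied: \(answer)")
    }
}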
In my application I am capturing a window using CGWindowListCreateImage:
import CoreGraphics

let windowID = 12345
let windowImage = CGWindowListCreateImage(.null, .optionIncludingWindow, CGWindowID(windowID), [.bestResolution, .boundsIgnoreFraming])
This is working nicely.
How can I also capture the cursor in the window image using this approach?
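The only workaround I can think of is to composite the cursor onto the captured image myself. Below is a rough sketch of that idea; compositeCursor is a helper name I made up, and the coordinate conversion between NSEvent.mouseLocation and the window bounds is an assumption on my part that probably needs adjustment.

import AppKit

// Rough idea: draw the current system cursor on top of the captured window image.
// `windowBounds` is assumed to be the window's frame in bottom-left-origin screen coordinates.
func compositeCursor(onto windowImage: CGImage, windowBounds: CGRect) -> NSImage {
    let size = windowBounds.size
    let result = NSImage(size: size)
    result.lockFocus()

    // Draw the captured window first.
    NSImage(cgImage: windowImage, size: size)
        .draw(in: NSRect(origin: .zero, size: size))

    // Overlay the cursor image at the current mouse location.
    if let cursor = NSCursor.currentSystem {
        let mouse = NSEvent.mouseLocation   // global, bottom-left origin
        let origin = NSPoint(
            x: mouse.x - windowBounds.origin.x - cursor.hotSpot.x,
            y: mouse.y - windowBounds.origin.y - (cursor.image.size.height - cursor.hotSpot.y)
        )
        cursor.image.draw(at: origin, from: .zero, operation: .sourceOver, fraction: 1.0)
    }

    result.unlockFocus()
    return result
}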
I have a CoreMediaIO-based DAL plugin written in Swift which currently polls a website to get a string that it needs, which is not a good approach.
I want to send that string to the DAL plugin via operating-system-supported IPC (Inter-Process Communication).
But there are many ways to do IPC on macOS, such as:
Apple Events
Distributed Notifications in Cocoa
BSD Notifications
Transferring Raw Data With CFMessagePort
Communicating With BSD Sockets
Communicating With BSD Pipes
In my case I just want one-way communication from an application to the DAL plugin.
I am new to macOS development, so I am not sure which approach would be efficient and best for my case of one-way communication from the application to the DAL plugin.
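From the list above, distributed notifications look like the simplest fit to me for a fire-and-forget, one-way send, since both sides are only a few lines of Swift. Here is a rough sketch of what I have in mind; the notification name com.example.dalplugin.streamURL is a placeholder, and I am carrying the small payload in the object string rather than a userInfo dictionary.

import Foundation

// Placeholder notification name; both the app and the plugin must agree on it.
let stringNote = Notification.Name("com.example.dalplugin.streamURL")

// ---- Application side: one-way send ----
func sendString(_ value: String) {
    DistributedNotificationCenter.default().postNotificationName(
        stringNote,
        object: value,              // small payload carried in the object string
        userInfo: nil,
        deliverImmediately: true
    )
}

// ---- Plugin side: observe and pick the string up ----
// The returned token must be kept alive for as long as the plugin wants updates.
let observerToken = DistributedNotificationCenter.default().addObserver(
    forName: stringNote,
    object: nil,
    queue: .main
) { note in
    if let value = note.object as? String {
        // Use the received string instead of polling the website.
        print("DAL plugin received: \(value)")
    }
}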
The CoreMediaIO Device Abstraction Layer (DAL) is analogous to CoreAudio’s Hardware Abstraction Layer (HAL). Just as the HAL deals with audio streams from audio hardware, the DAL handles video (and muxed) streams from video devices.
DAL plugins reside at /Library/CoreMediaIO/Plug-Ins/DAL/.
What is the life cycle of these DAL plugins?
When do they start running?
When do they get stopped?
When do they get paused?
Where can I see their logs?
What happens when they are not in use?
How can I check their performance to see whether they are running efficiently or not?
One famous example of a CoreMediaIO DAL plugin is the OBS Virtual Camera, in case someone is not familiar with them.
Note: this question should not be marked as too broad. I am not asking multiple questions; it is a single question about the life cycle of CoreMediaIO DAL plugins.
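On the logging question specifically, the only approach I know of so far is to add my own logging inside the plugin with os.Logger and then filter it in Console.app or with the log command (log stream --predicate 'subsystem == "com.example.dalplugin"'). A minimal sketch, where the subsystem name and pluginDidLoad are placeholders of mine:

import os

// Placeholder subsystem/category for the plugin's own logging.
let log = Logger(subsystem: "com.example.dalplugin", category: "lifecycle")

// Called from one of the plugin's entry points to trace when a host loads it.
func pluginDidLoad() {
    log.info("DAL plugin loaded by host: \(ProcessInfo.processInfo.processName)")
}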