Hi all, I am trying to write an NWProtocolFramerImplementation that will run on top of WebSockets. I would like to achieve two goals with this:
1. Handle the application-layer authentication handshake in-protocol, so my external application code can ignore it.
2. Automatically send pings periodically, so my application can ignore keepalive.
I am running into trouble because NWProtocolWebSocket parses the WebSocket frame metadata into messages, and I don't see how to get at that metadata at the NWProtocolFramerImplementation level.
Here's what I have (see the comments for my questions):
import Network

class CoolProtocol: NWProtocolFramerImplementation {
    static let label = "Cool"
    static let definition = NWProtocolFramer.Definition(implementation: CoolProtocol.self)

    private var tempStatusCode: Int?

    required init(framer: NWProtocolFramer.Instance) {}

    func start(framer: NWProtocolFramer.Instance) -> NWProtocolFramer.StartResult { return .willMarkReady }
    func wakeup(framer: NWProtocolFramer.Instance) {}
    func stop(framer: NWProtocolFramer.Instance) -> Bool { return true }
    func cleanup(framer: NWProtocolFramer.Instance) {}

    func handleOutput(framer: NWProtocolFramer.Instance, message: NWProtocolFramer.Message, messageLength: Int, isComplete: Bool) {
        // How do I write a "message" onto the next protocol handler? I don't want to just write plain data.
        // How do I tell the WebSocket framer below me that this is a ping/pong/text/binary frame?
    }

    func handleInput(framer: NWProtocolFramer.Instance) -> Int {
        // How do I get the input from WebSockets in message form? I don't want to just get Data;
        // I would like to know whether that data is a ping, pong, text, binary, ...
        return 0
    }
}
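For what it's worth, the most I have figured out how to do in handleInput is to parse raw bytes and deliver them upward tagged with my own framer message, along these lines (a rough sketch; the 4-byte big-endian length header is just a placeholder for whatever framing my protocol would actually use):

func handleInput(framer: NWProtocolFramer.Instance) -> Int {
    while true {
        // Hypothetical 4-byte big-endian length prefix, only to illustrate the shape.
        let headerSize = 4
        var payloadLength: Int?
        let parsed = framer.parseInput(minimumIncompleteLength: headerSize,
                                       maximumLength: headerSize) { buffer, _ in
            guard let buffer = buffer, buffer.count >= headerSize else { return 0 }
            payloadLength = buffer.prefix(headerSize).reduce(0) { ($0 << 8) | Int($1) }
            return headerSize
        }
        guard parsed, let length = payloadLength else {
            return headerSize // ask the stack for more bytes
        }

        // The only message I can attach here is my own framer's message.
        // I see no way to say "this was a WebSocket ping/pong/text/binary frame".
        let message = NWProtocolFramer.Message(definition: CoolProtocol.definition)
        if !framer.deliverInputNoCopy(length: length, message: message, isComplete: true) {
            return 0
        }
    }
}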
If I were implementing this protocol at the application layer instead, here's how I would send WebSocket messages:
class Client {
    ...

    func send(string: String) async throws {
        guard let data = string.data(using: .utf8) else {
            return
        }
        let metadata = NWProtocolWebSocket.Metadata(opcode: .text)
        let context = NWConnection.ContentContext(
            identifier: "textContext",
            metadata: [metadata]
        )
        self.connection.send(
            content: data,
            contentContext: context,
            isComplete: true,
            completion: .contentProcessed({ [weak self] error in
                ...
            })
        )
    }
}
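Keepalive (goal 2) works much the same way at this layer today: a ping is just another message with its own metadata and a pong handler, driven by a timer in my application code. Roughly like this (a sketch; the identifier and payload are arbitrary, and the timer plumbing is elided):

func sendPing() {
    let metadata = NWProtocolWebSocket.Metadata(opcode: .ping)
    metadata.setPongHandler(.main) { error in
        // Track liveness here; tear down the connection on errors/timeouts.
    }
    let context = NWConnection.ContentContext(
        identifier: "pingContext",
        metadata: [metadata]
    )
    self.connection.send(
        content: "keepalive".data(using: .utf8),
        contentContext: context,
        isComplete: true,
        completion: .contentProcessed({ _ in })
    )
}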
You can see that at the application layer I have access to this context object and can read the NWProtocolMetadata on both the input and output sides, but on NWProtocolFramer.Instance I only see
final func writeOutput(data: Data)
which doesn't seem to include context anywhere.
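The same applies on the receive side at the application layer: the WebSocket metadata comes back on the content context, so the app can tell frame types apart, e.g. something like this (receiveNext is just an illustrative helper):

func receiveNext() {
    self.connection.receiveMessage { data, context, isComplete, error in
        if let metadata = context?.protocolMetadata(definition: NWProtocolWebSocket.definition)
            as? NWProtocolWebSocket.Metadata {
            switch metadata.opcode {
            case .text, .binary:
                break // hand the payload to the application
            case .ping, .pong:
                break // keepalive traffic I'd like the framer to absorb
            default:
                break
            }
        }
        // ... handle data/error, then call receiveNext() again
    }
}

But none of that context seems to be reachable from inside a framer implementation.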
Is this possible? If not, how would you recommend I handle this? I know I could rewrite the entire WebSocket protocol framer myself, but it feels like I shouldn't have to if framers are supposed to be able to stack.
I have an application that incorporates an RTCP client to communicate with a local networked device. The device's RTCP implementation requires that the local RTCP port be 6970, even for the initial outbound connection. I have an implementation that works just fine on macOS and iOS, but when I port it to visionOS, the connection ignores my .requiredLocalEndpoint specification and chooses a random port (I can see this happening in Wireshark when running in the visionOS simulator).
let remoteRtcpEndpoint = NWEndpoint.hostPort(host: NWEndpoint.Host(remoteRTCPAddress), port: NWEndpoint.Port(rawValue: remoteRTCPPort)!)
let rtcpParameters = NWParameters.udp
let localEndpoint = NWEndpoint.hostPort(host: NWEndpoint.Host("0.0.0.0"), port: NWEndpoint.Port(rawValue: localRTPPort)!)
Self.logger.info("Starting rtcp with local port \(localRTPPort), remote address \(remoteRTCPAddress), endpoint \(String(describing: localEndpoint))")
rtcpParameters.requiredLocalEndpoint = localEndpoint
rtcpParameters.allowLocalEndpointReuse = true
remoteRtcpConnection = NWConnection(to: remoteRtcpEndpoint, using: rtcpParameters)
remoteRtcpConnection.start(queue: .global())
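As a sanity check, the local endpoint the connection actually picked can also be logged once its state reaches .ready; a sketch (assuming the handler is installed before start(queue:) is called), which should match what Wireshark shows:

let connection = remoteRtcpConnection  // captured for the handler below
connection.stateUpdateHandler = { state in
    if case .ready = state {
        // Log which local endpoint was actually chosen, to cross-check Wireshark.
        Self.logger.info("RTCP local endpoint in use: \(String(describing: connection.currentPath?.localEndpoint))")
    }
}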
And here's the code that sends the command.
remoteRtcpConnection.send(content: RtcpPacket.vdly(delayMs: HUGE_FIXED_VDLY_MS).packet(), completion: .contentProcessed({ error in
    if let error = error {
        Self.logger.warning("Error sending VDLY packet \(error)")
        continuation.resume(throwing: error)
    } else {
        Self.logger.debug("VDLY Sent \(self.HUGE_FIXED_VDLY_MS)")
        continuation.resume(returning: ())
    }
}))
If this is a known limitation of the Network framework, should I revert to BSD sockets or something? How else can I work around it? Additionally, I don't have a visionOS device, so I am testing in the simulator; could this be a problem with the simulator's network bridge?
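For reference, "reverting to BSD sockets" would mean something like the classic bind-before-send pattern below (an untested sketch, shown only to illustrate the fallback I'd rather avoid):

import Darwin

let fd = socket(AF_INET, SOCK_DGRAM, 0)
var local = sockaddr_in()
local.sin_family = sa_family_t(AF_INET)
local.sin_port = in_port_t(6970).bigEndian  // the fixed local RTCP port
local.sin_addr = in_addr(s_addr: 0)         // INADDR_ANY (0.0.0.0)
let result = withUnsafePointer(to: &local) {
    $0.withMemoryRebound(to: sockaddr.self, capacity: 1) {
        bind(fd, $0, socklen_t(MemoryLayout<sockaddr_in>.size))
    }
}
// result == 0 means the port is bound; sendto()/recvfrom() against the device would follow.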
I have a scroll view that scrolls horizontally, and one of my users is asking that it respond to their scroll wheel without them having to hold the Shift key. Is there some way to do this natively? If not, how can I listen for scroll wheel events in SwiftUI and make my scroll view respond to them?
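For reference, the view in question is essentially a plain horizontal ScrollView like the sketch below (items and ThumbnailCell are simplified placeholders for my real model and cell view):

ScrollView(.horizontal) {
    LazyHStack {
        ForEach(items) { item in
            ThumbnailCell(item: item)
        }
    }
}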
I am able to create a UIImage from WebP data using UIImage(data: data) on iOS and iPadOS. When I try to do the same thing on watchOS 10, it fails. Is there a workaround for displaying WebP images on watchOS if this isn't expected to work?
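Concretely, this is the call in question (data is the WebP-encoded bytes I already have; on iOS/iPadOS it returns a valid image, on watchOS 10 it comes back nil):

import UIKit

// data: Data containing WebP-encoded image bytes (fetched elsewhere).
// Works on iOS and iPadOS; on watchOS 10 the initializer fails.
let image = UIImage(data: data)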