I first tried to get RTSP onto iOS by decoding the stream to images and pushing them to iOS/Android over WebSockets. It works, but it's slow and erratic, and it uses much more bandwidth because it throws away frame-to-frame compression.

My current approach uses ffmpeg (or gstreamer) to transcode RTSP to HLS and serves it with Express/Node.js; there's a sketch of the pipeline below. This is working well and reliably with two camera streams, with the server running on a Jetson Nano. If anyone is interested, I'll publish the code on Bitbucket. The Nano should have headroom for some image processing as well; I want to detect fires. I'm seeing roughly 3 seconds of latency.

I don't need the RTSP controls, so I could work with just the underlying RTP. WebRTC also carries media over RTP, so it seems like there should be some way to use Apple's frameworks to do this efficiently: unwrap the RTP and rewrap it for WebRTC. WebSockets would have other advantages in my application, like pushing status changes, but UDP seems to be the key to low latency.

(I'm new to Apple's frameworks, and it's often unclear which of several apparent alternatives to explore. My client uses iOS exclusively, though, and each of the cross-platform frameworks I've tried (Expo, web apps) has limitations that block features the app requires.)

I'll probably try transcoding to WebRTC next, but the Nano setup has removed the pressure to get that working.
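In case it helps anyone, here's a rough sketch of the transcode-and-serve pipeline in Node/TypeScript. It assumes ffmpeg is on the PATH and the camera already emits H.264; CAMERA_URL, the port, and the HLS segment settings are placeholders, not values from my actual setup.

```typescript
// Sketch: spawn ffmpeg to repackage an RTSP stream as HLS segments,
// then serve the segments with express.
import { spawn } from "child_process";
import express from "express";
import fs from "fs";

const CAMERA_URL = "rtsp://192.168.1.10:554/stream1"; // placeholder URL
const HLS_DIR = "./hls";

fs.mkdirSync(HLS_DIR, { recursive: true });

// "-c:v copy" passes the camera's H.264 through untouched, so the Nano
// only repackages; short segments and a small playlist keep latency down.
const ffmpeg = spawn("ffmpeg", [
  "-rtsp_transport", "tcp",        // TCP pull is more reliable than UDP on flaky links
  "-i", CAMERA_URL,
  "-c:v", "copy",                  // no re-encode if the source is already H.264
  "-an",                           // drop audio; adjust if you need it
  "-f", "hls",
  "-hls_time", "1",                // 1-second segments
  "-hls_list_size", "3",           // keep only a few segments in the playlist
  "-hls_flags", "delete_segments", // clean up old segments on disk
  `${HLS_DIR}/stream.m3u8`,
]);

ffmpeg.stderr.on("data", (d) => process.stderr.write(d));

const app = express();
app.use("/live", express.static(HLS_DIR)); // AVPlayer on iOS can open the .m3u8 directly
app.listen(8080, () =>
  console.log("HLS at http://localhost:8080/live/stream.m3u8")
);
```

Most of the latency comes from the player buffering a couple of segments before it starts playing, so segment duration is the main knob; that lines up with the roughly 3 seconds I'm seeing.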