How to consume video from an RTSP service?

Hi,


It seems like it's pretty easy to consume HTTP Live Streaming content in an iOS app. Unfortunately, I need to consume media from an RTSP server. It seems to me that this is a very similar thing, and that all of the underpinnings for doing it ought to be present in iOS, but I'm having a devil of a time figuring out how to make it work without doing a lot of programming.


For starters, I know that there are web-based services that can consume an RTSP stream and rebroadcast it as an HTTP Live Stream that can be easily consumed by the media players in iOS. This won't work for me because my application needs to function in an environment where there is no internet access (it's on a private Wi-Fi network where the only other thing on the network is the device that is serving the RTSP stream).


Having read everything I can get my hands on and explored third-party and open-source solutions, I've compiled the following list of ideas:


1. Using an iOS build of the open-source ffmpeg library, which supports RTSP, I've come up with a test app that can receive the RTSP packets, decode them, create UIImages out of the frames, and display those frames on-screen. This provides a crude player, but performance is poor, most likely because ffmpeg can't take advantage of any hardware acceleration. It also doesn't provide me with any way to integrate the video stream into AVFoundation, so I'm on my own as far as saving the stream to a file, transcoding it, etc.
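For illustration, the frame-display step of that test app looks roughly like this sketch (it assumes ffmpeg's sws_scale has already converted the decoded frame to RGBA; the buffer and dimensions are placeholders for whatever the decoder hands back):

```
import UIKit

// Wrap an RGBA pixel buffer (already converted by ffmpeg's sws_scale)
// in a CGImage, then a UIImage. `pixels`, `width`, and `height` are
// placeholders for whatever the decoder produces.
func image(fromRGBA pixels: UnsafeMutableRawPointer, width: Int, height: Int) -> UIImage? {
    guard let context = CGContext(data: pixels,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: width * 4,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue),
          let cgImage = context.makeImage() else { return nil }
    return UIImage(cgImage: cgImage)
}
```

Creating a fresh CGImage for every frame like this is part of why performance is poor: each frame is copied and composited on the CPU.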


2. I know that the AVURLAsset class doesn't directly support the RTSP scheme. Since I have access to the undecoded RTSP packets via ffmpeg, I've thought it should be possible to implement RTSP support myself via a custom NSURLProtocol, essentially fooling AVFoundation into reading those packets as if they originated in a file. I'm not sure if this would work, since the raw packets coming from the RTSP server might lack the headers that would otherwise be present in data being read from a file. I'm not even sure if AVFoundation would recognize my custom protocol.
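For what it's worth, the skeleton I have in mind is just the stock URLProtocol overrides; the class name and the scheme check are mine, and whether AVFoundation ever routes its requests through registered protocol handlers is exactly the part I'm unsure about:

```
import Foundation

// Sketch of a custom protocol handler for rtsp:// URLs. This is the
// standard URLProtocol shape; whether AVURLAsset actually consults
// registered URLProtocol subclasses is the open question.
final class RTSPURLProtocol: URLProtocol {
    override class func canInit(with request: URLRequest) -> Bool {
        return request.url?.scheme == "rtsp"
    }

    override class func canonicalRequest(for request: URLRequest) -> URLRequest {
        return request
    }

    override func startLoading() {
        // Placeholder: open the RTSP session (e.g., via ffmpeg), then feed
        // depacketized data back with client?.urlProtocol(self, didLoad:).
    }

    override func stopLoading() {
        // Placeholder: tear down the RTSP session.
    }
}

// At startup:
// URLProtocol.registerClass(RTSPURLProtocol.self)
```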


3. If a protocol doesn't work, I've considered that I might be able to implement my own local HTTP Live Streaming server that converts the RTSP packets into an HTTP stream that the media players can read. This sounds like a terribly convoluted solution to the problem, at best, and very difficult at worst.
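Concretely, the local server would have to segment the incoming video into short MPEG-TS files and keep republishing a sliding-window playlist along these lines (the segment names and durations here are made up):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:20
#EXTINF:2.0,
segment20.ts
#EXTINF:2.0,
segment21.ts
#EXTINF:2.0,
segment22.ts
```

For a live stream there is no #EXT-X-ENDLIST tag, and #EXT-X-MEDIA-SEQUENCE advances as old segments are dropped, which is part of what makes this more involved than serving static files.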


4. Going back to solution (1), if I could speed up the decoding by using some iOS CoreVideo function instead of ffmpeg, this solution might be okay. However, I can't find any documentation for CoreVideo on iOS (Apple only documents it for OS X).
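If the hardware decoder is exposed anywhere, it looks like VideoToolbox (rather than CoreVideo) is the relevant framework. A rough sketch of creating a decompression session from the stream's H.264 parameter sets, where the sps/pps buffers are placeholders for data pulled out of the RTSP stream:

```
import CoreMedia
import CoreVideo
import VideoToolbox

// Build a CMVideoFormatDescription from the stream's H.264 SPS/PPS,
// then create a (hardware-backed, where available) decompression session.
func makeSession(sps: [UInt8], pps: [UInt8]) -> VTDecompressionSession? {
    var formatDesc: CMVideoFormatDescription?
    let status: OSStatus = sps.withUnsafeBufferPointer { spsBuf in
        pps.withUnsafeBufferPointer { ppsBuf in
            let pointers: [UnsafePointer<UInt8>] = [spsBuf.baseAddress!, ppsBuf.baseAddress!]
            let sizes = [sps.count, pps.count]
            return CMVideoFormatDescriptionCreateFromH264ParameterSets(
                allocator: kCFAllocatorDefault,
                parameterSetCount: 2,
                parameterSetPointers: pointers,
                parameterSetSizes: sizes,
                nalUnitHeaderLength: 4,
                formatDescriptionOut: &formatDesc)
        }
    }
    guard status == noErr, let desc = formatDesc else { return nil }

    // Ask for BGRA output so decoded frames are trivial to display.
    let attrs = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA] as CFDictionary

    var session: VTDecompressionSession?
    let created = VTDecompressionSessionCreate(
        allocator: kCFAllocatorDefault,
        formatDescription: desc,
        decoderSpecification: nil,
        imageBufferAttributes: attrs,
        outputCallback: nil, // decode via the completion-handler variant of VTDecompressionSessionDecodeFrame
        decompressionSessionOut: &session)
    return created == noErr ? session : nil
}
```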


5. I'm certainly willing to license a third-party solution if it works well and provides good performance. Unfortunately, everything I've found so far is pretty crummy and mostly just leverages ffmpeg and/or VLC. What is most disappointing to me is that nobody seems to be able or willing to provide a solution that neatly integrates with AVFoundation. I really want to make my RTSP stream available as an AVAsset so I can use it with AVFoundation players and other classes -- I don't want to build an app that relies on custom third-party code for everything.


Any ideas, tips, or advice would be greatly appreciated.


Thanks,

Frank

Replies

So why (or, more importantly, how) is your whatever-it-is producing RTSP instead of an HTTP Live Streaming-compatible video stream?

The device has on-board firmware that is providing the RTSP server. It also acts as its own Wi-Fi hotspot. As to why, I have no idea, although I suspect that the device was probably originally intended to work with an RTSP client on the internet, but was adapted for this purpose.

My searching concerning iOS and RTSP turns up at least one GitHub project, "ffmpeg avplayer for iOS tvOS", claiming that the Apple frameworks do the hard work in recent iOS versions. (Link not included to avoid moderation; the quoted text is the title of the GitHub project.)


It seems like your list of options is either out of date, or trying to do things the hard way.

I'll download that and review it. Thanks.

Hello,

If you've managed to connect to an IP camera over the RTSP protocol, can you help me? Unfortunately, everything I find uses FFmpeg and/or VLC. Also, how did you use the example from the GitHub project "ffmpeg avplayer for iOS tvOS"? Thank you.

AVPlayer basically doesn't support RTSP at all.


Implementing a local server that does RTSP->HLS conversion is not a bad idea (but it costs time, and performance suffers compared to consuming HLS directly).

Some big companies have used this approach for other things (e.g., HLS->HLS).


Some open-source projects use FFmpeg to implement a player that supports RTSP, but most stop at the idea stage and are very unstable so far. The performance of FFmpeg on iOS/tvOS is also a big question to me.


The other option is to forget about an RTSP player on iOS/tvOS altogether: your streaming back-end should support both RTSP and HLS.

I've tried to solve RTSP-to-iOS by:

  1. Decoding to images, then sending them to iOS/Android over WebSockets. This works, but it's slow and erratic, and it uses more bandwidth since it eliminates frame-to-frame compression.
  2. Using ffmpeg or gstreamer to transcode RTSP to HLS, and serving it with Express/Node.js (a sketch of the ffmpeg invocation is below). This is working well and reliably with 2 camera streams and the server running on a Jetson Nano. If interested, I'll publish the code on Bitbucket. The Nano should enable some image processing as well; I want to detect fires. I'm seeing 3-ish seconds of latency.
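For reference, the ffmpeg side of option 2 is essentially one command; mine looks roughly like this (the camera URL, segment tuning, and output path are placeholders):

```
ffmpeg -rtsp_transport tcp -i rtsp://camera.local/stream \
       -c:v copy -an \
       -f hls -hls_time 2 -hls_list_size 5 -hls_flags delete_segments \
       /var/www/stream/index.m3u8
```

Note that -c:v copy just remuxes the camera's H.264 without re-encoding; substitute a real encoder if the source codec isn't HLS-friendly. The delete_segments flag keeps the sliding window from filling the disk.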

I don't need the RTSP controls and could use just the underlying RTP. I think RTP is used for WebRTC as well, so it seems like there should be some way to use Apple software to do this efficiently: just unwrap the RTP and rewrap it with something(?) from WebRTC. WebSockets would have other advantages in my application, like pushing status changes, but UDP seems to be the key to low latency.

(I'm new to the Apple code base, and it's often unclear which of several apparent alternatives to explore. My client uses it exclusively, however, and each of the cross-platform frameworks (Expo, web apps) I've tried has limitations that prevent the app features we require.)

I'll probably try transcoding to WebRTC next, but the Nano has eliminated the pressure to get it working.

skip007, can you send a link to the Bitbucket repo?