I've recently been tasked with finding a live video solution for an industrial device. In my case, I want to display video from a camera on a local LCD and simultaneously allow it to be live-streamed over the web. By web, I mean that the most likely location of the client is on the same LAN, but this is not guaranteed. I figured this had to be a completely solved problem by now.
Anyway, I've tried many of the recent protocols. I was really hoping that HLS would work, because it's so simple. For example, I can use the gstreamer "hlssink" to generate the files and basically deliver video with a one-line shell script and any webserver. But the 7-second best-case latency is unacceptable. I really want 1 second or better.
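For reference, the kind of one-liner I mean looks roughly like this (a sketch with videotestsrc standing in for the real camera; the paths, bitrate, and segment settings are placeholders):

    # Sketch: H.264-encode a test source and emit HLS segments
    # into a directory that any webserver can serve
    gst-launch-1.0 -e videotestsrc is-live=true \
        ! x264enc tune=zerolatency bitrate=2000 key-int-max=30 \
        ! h264parse \
        ! mpegtsmux \
        ! hlssink location=/var/www/html/segment%05d.ts \
                  playlist-location=/var/www/html/playlist.m3u8 \
                  target-duration=2 max-files=5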
I looked at MPEG-DASH: it seems equivalent to HLS. Why would I use it when all of the MPEG-DASH examples fall back on HLS?
I looked at WebRTC, but I'm too nervous to build a product around the few sample client/server code bases I can find on GitHub. They are not fully baked, and then I'm really depending on a non-standard solution.
I looked at Flash: but of course it's not desirable to use it these days.
So the solution that works for me happens to be the oldest: Motion JPEG, where I have to give up on good video compression (MPEG). I get below 1 second of latency with no coding (using ffmpeg + ffserver). Luckily Internet Explorer is dead enough that I don't have to worry about its lack of support for it. It works everywhere else, including Microsoft Edge. MJPEG is not great in that the latency can be higher if the client can't keep up; I think WebRTC is likely better there.
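For the curious, the whole setup is roughly this (a sketch; the port, resolution, frame rate, and v4l2 device are assumptions for illustration):

    # ffserver.conf (sketch): serve multipart MJPEG on port 8090
    HTTPPort 8090
    HTTPBindAddress 0.0.0.0
    MaxClients 100

    <Feed camera.ffm>
      File /tmp/camera.ffm
      FileMaxSize 5M
    </Feed>

    <Stream camera.mjpeg>
      Feed camera.ffm
      # mpjpeg = multipart JPEG, which browsers render natively
      Format mpjpeg
      VideoFrameRate 15
      VideoSize 640x480
      NoAudio
    </Stream>

Then start ffserver and feed it from the camera:

    ffserver -f ffserver.conf &
    ffmpeg -f v4l2 -i /dev/video0 http://localhost:8090/camera.ffm

The client side is then just an <img> tag pointing at http://host:8090/camera.mjpeg.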
Conclusion: here we are in 2019, and the best low-latency video delivery protocol is from the mid-90s. It's nuts. I'm open to suggestions in case I've missed anything.
A fairly long time ago (3-4 years) I was tasked with doing something fairly similar (though running on Android as the end client). HLS was one of the better options but came with the same costs you describe here. However, it was fairly easy to reduce the segment size to favor responsiveness over resilience. Essentially you trade buffer size and bitrate-switching quality for more precise scrubbing through the video and faster start times.
I had to hack it quite severely to get fast loads with fair resilience for my use case, as the devices were restricted in performance and could have fairly low bandwidth. Since you're looking at a relatively fast connection, simply reducing the chunk size should get you to the target.
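With ffmpeg's HLS muxer, for example, the knobs would be something like this (a sketch; the input URL and exact values are placeholders):

    # Sketch: 1-second segments with a short rolling playlist
    ffmpeg -i rtsp://camera.local/stream -c:v copy -f hls \
        -hls_time 1 -hls_list_size 3 \
        -hls_flags delete_segments \
        /var/www/html/live.m3u8

Keep in mind that most players still buffer around three segments before starting, so even 1-second chunks tend to bottom out at a few seconds of latency.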
As a follow-up: I've spent a couple of years working on a video product based on WebRTC. It either works for a PoC where you just hack things together, or at a large scale where you have the time and resources to fight odd bugs and jump through a spectrum of logistical hoops in setting it up. So unless you plan to have a large-ish deployment with people taking care of it, I would stick to HLS or other simpler protocols.
> I looked at Flash: but of course it's not desirable to use it these days.
The RTMP protocol has a lot of implementations and is still widely used for the backend part of transmitting video at low latency (i.e. from the recorder to the server).
RTSP, with or without an interleaved stream, is another option.
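For a quick feel of the latency floor on that route, ffplay with its buffering disabled works well (the camera URL is a placeholder):

    # Low-latency RTSP playback test; TCP transport interleaves
    # the RTP stream over the control connection
    ffplay -rtsp_transport tcp -fflags nobuffer -flags low_delay \
        rtsp://camera.local/stream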
DASH/HLS is a solution for worldwide CDN delivery and browser-based rendering. It's poorly suited for low latency.
If you need low latency and browser based rendering you need something custom.
You can also consider tunneling over WebSockets. It's a lot easier than WebRTC, especially since you don't need the handshaking nonsense, which often requires self-hosting STUN and TURN servers if you don't want to rely on third parties. IIRC the performance of WebSockets is good enough for companies like Xoom.
You should probably try Mixer. They rolled their own low-latency protocol. It uses a WebSocket as a bidirectional channel that allows the server to push whatever it wants to the client directly, achieving sub-second delay. (The model here looks more like WebRTC than HLS, though.)
I have no idea what the underlying tech is, but Steam Link can do extremely low latency on the same network and very low latency over the internet. It can also stream non-game applications, though I imagine automating Steam is a nightmare.
My friends and I have our own little streaming website and manage to get 1-2 seconds of delay. It's nothing fancy: NGINX with the RTMP plugin, which receives the streams and only passes them through; once we added encoding we got a noticeable delay. This is Flash tech that can be played back via HTML5 now, but I didn't see it in your list, so perhaps you haven't looked at it.
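The relevant part of the NGINX config is tiny; roughly this (a sketch; the application name and stream key are placeholders):

    # nginx.conf (sketch): accept RTMP publishes and relay them untouched
    rtmp {
        server {
            listen 1935;
            application live {
                live on;
                record off;
            }
        }
    }

Publishing to it without re-encoding is what keeps the delay down:

    ffmpeg -i input.mp4 -c copy -f flv rtmp://localhost/live/mystream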
Interesting. I had tried to get an HTML5 video element to read from a gstreamer-based MPEG source, but it would not work; I'm pretty sure that's because gstreamer did not provide a real HTTP server, so the headers were messed up. It's odd, because oggmux did work over tcpserversink. Anyway, I will try this because I'm interested in the resulting latency.
Keep in mind that NDI is a proprietary technology from NewTek, not an open spec like SMPTE 2110/2022. That being said it does work remarkably well in my experience, provided you have a dedicated network for it.
Similar situation here, ended up with the same solution, after an initial attempt with HLS. jsmpeg (https://github.com/phoboslab/jsmpeg) made it pretty easy.
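The feed side is just ffmpeg pushing MPEG1-in-MPEG-TS at the relay server that jsmpeg clients connect to, along the lines of jsmpeg's documented setup (the device, port, and secret here are placeholders):

    # Sketch: capture, encode to MPEG1 video in an MPEG-TS container,
    # and HTTP-POST it to the websocket relay for jsmpeg clients
    ffmpeg -f v4l2 -i /dev/video0 \
        -f mpegts -codec:v mpeg1video -b:v 1000k -bf 0 \
        http://localhost:8081/supersecret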
Try streaming TS packets over WebSockets and decoding with FFMPEG compiled to WASM in the browser. I wrote https://github.com/colek42/streamingDemo a couple of years back, and despite the hacky code it worked really well. You could probably do much better today.
We recently completed a project with similar requirements. We ended up taking RTSP from the camera and packing it into WebSockets using ffmpeg, and we had sub-second latency. The camera gave us h264, so we could just repack that.
We're giving a talk about the project at the Montevideo Tech meetup, though it will be in Spanish.
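The ffmpeg half of it is essentially a repack with no re-encoding, roughly like this (a sketch; the camera URL is a placeholder, and the WebSocket fan-out server reading stdin is separate):

    # Sketch: pull h264 over RTSP, repack into MPEG-TS without
    # re-encoding, and write to stdout for a websocket server to fan out
    ffmpeg -rtsp_transport tcp -i rtsp://camera.local/stream \
        -c:v copy -f mpegts -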
Well I was hoping to not have to use a commercial product. From the front page, "Ultra Low Latency WebRTC" is supported only in the Enterprise Edition. I may as well use Flash.
"8-12 seconds End-to-End Latency" for community edition.
Actually, a commercial product is not necessarily a problem, but the monthly fees are. If there were a one-time-fee version (perhaps with a limited number of clients or something), then this might work.
I used jsmpeg to live-stream camera feeds from robots. There are a few others that do the same. In my case I wrote a custom Go server to handle the multiplexing. It did fairly well and was able to support something like 60 clients at a time. This was a weekend project and I don't have time to keep the robots online, so I will leave you with some video of an early client I built. There are some other videos showing off the robots on my channel.
I also poked around with making a real-time remote desktop client for Linux that could be accessed via a web browser. It too got very low-latency video, at least on local LANs. The link for that is below too.
Edit: I should mention latencies were measured in ms, not seconds, even with many clients. I am sure that to scale out to thousands of users I would have to add a bit of latency, but not by much.
Oh yeah, I saw that. I'm also hoping to be able to use the h.264 compression hardware built into the SoC we're using, and it was my understanding that jsmpeg is MPEG1-only.
That being said, the ffmpeg solution is not using the hardware accelerator either, even though it does support MJPEG. But I think with some work we can get a gstreamer-based solution; the missing part is an equivalent of ffserver that works with gstreamer. The hardware vendors like to provide gstreamer plug-ins for their accelerators.
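For what it's worth, on SoCs that expose the encoder through V4L2 M2M, the encode half might look like this (a sketch; the element name varies by vendor, e.g. omxh264enc on some parts, and hlssink stands in only because gstreamer lacks an ffserver-like HTTP sink):

    # Sketch: camera -> hardware H.264 encoder -> HLS segments
    gst-launch-1.0 -e v4l2src device=/dev/video0 \
        ! video/x-raw,width=1280,height=720,framerate=30/1 \
        ! v4l2h264enc \
        ! h264parse \
        ! mpegtsmux \
        ! hlssink location=/var/www/html/seg%05d.ts \
                  playlist-location=/var/www/html/live.m3u8 \
                  target-duration=1 max-files=5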
Also, it's weird to me that this needs a giant JavaScript client library. What about the HTML5 built-in video support?
If you are using MPEG1 you can just dump the packets on the line. And if you want to get fancy, you can read in an HQ stream, set up a beefy server to run 3 or 4 conversions to different bandwidth classes, and move clients up and down as required.
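The fancy version is little more than this on the encode side (a sketch; the source URL, bitrates, and relay endpoints are placeholders):

    # Sketch: one HQ input, three MPEG1 renditions at different bitrates,
    # each pushed to its own relay endpoint
    ffmpeg -rtsp_transport tcp -i rtsp://camera.local/stream \
        -f mpegts -codec:v mpeg1video -b:v 2000k -bf 0 http://localhost:8082/high \
        -f mpegts -codec:v mpeg1video -b:v 800k -bf 0 http://localhost:8082/medium \
        -f mpegts -codec:v mpeg1video -b:v 300k -bf 0 http://localhost:8082/low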
My code is geared toward robots and has not been updated recently, but there is at least an example of the simpler multiplexing in Go.
Their own variation of HLS. Note that except for Safari, browsers don't implement HLS directly; rather, websites do, through HLS.js etc. So you can implement whatever low-latency version of HLS you want (assuming it is constructed from HTTP primitives that JS can access).