Show HN: Pxy – A Go server that proxies websocket livestreams to RTMP servers (github.com/chuabingquan)
133 points by chipneverdies on April 19, 2020 | hide | past | favorite | 31 comments



Awesome to see this on the front page! I’ve been chatting with the OP for a few days and this is based on a Node.js PoC I wrote for a blog post on the state of broadcasting live from a browser[1].

This is, IMO, the simplest way to be able to go from a modern browser to an RTMP endpoint. You could try to do server-side WebRTC using a project like Pions[2], or use headless Chrome and be a peer, but both of those come with their own headaches. This just uses the MediaRecorder API to send chunks of video over a WebSocket as they become available, then the server pipes those messages into FFmpeg via stdin. It’s lightweight enough that it even runs incredibly well on Glitch[3].

Also, if you’re curious, the videos that come out of the MediaRecorder are all over the place between browsers, but they’re all varying degrees of “barely qualifying as playable video”. Incredibly fun to play with, but you will absolutely need to encode any output from it.

[1] https://mux.com/blog/the-state-of-going-live-from-a-browser

[2] https://github.com/pion/webrtc

[3] https://glitch.com/~mmcc-next-streamr


Hey Matthew, thanks for the insights you've been providing me thus far! I've added [1] to the references in the project's README.

Meanwhile, here's another well-explained tutorial that complements [1], along with a code walkthrough implementing something similar in Node.js:

https://github.com/fbsamples/Canvas-Streaming-Example/blob/m...


Thank you! Really appreciate the link.

I'm really interested to see where this project goes, please keep me in the loop!


Will do!


In the same genre of Go video-streaming adapter servers, here are a couple of neat projects that stream from RTSP (as used by e.g. security cameras) to either fragmented .mp4 files over WebSocket or WebRTC:

https://github.com/deepch/RTSPtoWSMP4f

https://github.com/deepch/RTSPtoWebRTC

I'm jealous of Go's WebRTC library that makes the latter possible. I'd love to have a similar Rust crate.


Doesn't GStreamer provide a Rust-based plugin to achieve this yet? It is surely just a matter of time if not.



What kind of video sources provide output via WebSockets to be proxied to RTMP? OBS and other streaming video tools usually support RTMP themselves, and browsers only support WebRTC out of the box as far as I know. I'm curious what kind of custom browser video streaming solution you're using on the client side.


This appears to use the MediaStream Recording API (https://developer.mozilla.org/en-US/docs/Web/API/MediaStream...) to produce an H.264 stream in a WebM container, which can be sent over the WebSocket. The server then transmuxes this to an H.264 FLV stream for RTMP output.


You get NAL H.264 packets from MediaRecorder. Note that it only supports H.264 baseline on Chrome (you can request other profiles, but that's all you get). Not sure about Firefox.

https://en.wikipedia.org/wiki/Network_Abstraction_Layer




That's what I always assumed you got back when you create a MediaRecorder like so:

  new MediaRecorder(stream, { mimeType: 'video/webm;codecs=h264' })
Which is what this project does. Is it actually returning something else behind the scenes?


While WebM usually contains VP8 or VP9, it isn't impossible to throw an H.264 stream in there.


WebM is Matroska, and it can contain H.264 as far as I'm aware. Not sure precisely what browsers decide to do here, but apparently it's something.


WebM is a subset of Matroska; if something is returning H.264 inside a WebM container... that's not conforming to the WebM subset. But I guess nothing matters.


I think the inverse of this (RTMP to Websockets) would be more interesting.



There is no provision in the browser for decoding raw encoded video frames outside of WebRTC. Yes, you can use Media Source Extensions, but that requires boxing the video in a container. There is also Broadway.js, but that is limited to H.264 baseline.


I see you are invoking FFmpeg via the shell. I also tried streaming to YouTube from Go and had to resort to the same solution. It's a shame there isn't a usable library for implementing an RTMP client in Go. Using FFmpeg from the shell is very limiting because there is no way to monitor the status of the stream or to provide audio and video separately via stdin.


Couldn't you use named pipes for passing separate streams of input?


I tried doing that, but I ran into performance problems due to streaming raw video. Instead, I streamed the audio via stdin, created an HTTP MJPEG server listening on localhost, and pointed FFmpeg there. I was limited to maybe 10 FPS tops, and the JPEG encoding overhead was also getting noticeable, but I limited the framerate to one frame per second because the video was mostly static (song name and current/remaining time of the track).


Cool project! I've done similar things with Kurento acting as a WebRTC peer to proxy WebRTC --> RTMP before but this is a lot more lightweight.

One thing I'd love to see at some point is a project that compiles FFmpeg's transmuxing code to WASM so that the WebM --> FLV conversion could be done client-side. That way the server-side portion would be as simple as proxying the WebSocket to a TCP socket for the outgoing RTMP traffic.


Hello, I'm the author here, thanks for liking the project! I wasn't aware of Kurento prior to your post; it seems pretty nifty and good to explore for future projects!

Getting the client to do the heavy lifting while the server acts solely as a relay is a really good idea. In my use case, however, I have a Flutter client that's not capable of doing that at the moment; perhaps that's something that could be explored further in the future.


A similar thing is used by Axis security cameras but with RTSP. The frontend bits are open source here: https://github.com/AxisCommunications/media-stream-player-js


When we first started replacing our Erlang router for Heroku Private Spaces with a Go-based one, the name of the PoC was ‘pxy’. Very big fan of the name. Yours seems fun too. <3


Which side of this proxy is producing/consuming only FLV/Flash streams?


Hello, I'm the author here. The frontend would stream the video/audio to Pxy via WebSockets. In turn, Pxy utilises FFmpeg to convert the incoming stream to FLV and send the FLV stream to your desired external RTMP service (in this case, I'm using mux.com).


OP, could you support SRT? Listener mode would be really interesting.


Can this be used as a web front end for say a raspberry pi camera ?


This will allow you to send video to any service that supports RTMP input (Facebook, YouTube, etc.). If you just want to view your RPi camera remotely, there are better solutions.


link is 404? forgot to set the repo to public?


My bad, thanks for pointing that out! The repo is now public.



