Cheap HDMI capture for Linux (benjojo.co.uk)
398 points by benjojo12 on June 27, 2016 | 70 comments



You can also use devices that are USB Video Class (UVC) compliant, which therefore don't require any special drivers.

libuvc can be used to capture from these devices: https://github.com/ktossell/libuvc

You can get cheap off-brand UVC USB grabbers, or you can get higher quality gear, like the AJA U-TAP:

https://www.aja.com/en/products/u-tap

Admittedly though I haven't seen anything that comes close to £45 :-)


I second the UVC driver recommendation, or anything with drivers upstream (that is, in the Linux kernel itself). If this is for your day job spending £250 is well worth it for one of the better quality brands, otherwise you'll be spending far more than that debugging reliability issues.

Depending on your project, it might be worthwhile using GStreamer instead of tying yourself to a specific driver like libuvc. There's a lot you can do with just GStreamer's command-line tools -- for some examples see the first 5 minutes of my talk[1] -- and if you need more you can drop down into C or Python, though admittedly the learning curve is fairly high.

[1]: http://www.youtube.com/watch?v=Fdn2LxxM7wA&list=SPSIUOFhnxEi...
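For a flavour of the command-line tools, here's a minimal sketch (assuming a UVC grabber that shows up as /dev/video0 with a raw output format, and that the x264 and Matroska GStreamer plugins are installed):

    # live preview of the capture device
    gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! autovideosink

    # record to H.264 in an MKV file (-e finalises the file cleanly on Ctrl-C)
    gst-launch-1.0 -e v4l2src device=/dev/video0 ! videoconvert ! \
        x264enc tune=zerolatency ! matroskamux ! filesink location=capture.mkv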


I've personally had a great experience with the Magewell USB3 HDMI capture device, which I believe is also UVC - requires no drivers on Windows/Mac and claims to be Linux compatible. Slightly cheaper than the u-tap.

https://www.amazon.com/Magewell-HDMI-USB-Capture-Dongle/dp/B...

Can do lossless capture with extremely low latency, which is pretty nice - it's good enough that I've been able to play console video games on it directly instead of plugging into a TV.


I use the Magewell too and wrote a blog post about my search: https://natalian.org/2015/03/13/HDMI_in/


It's still about $150 more than just swallowing your pride and using Hauppauge's Windows drivers.


There's a reason the title starts with "cheap".

all solutions so far cost well over what most people paid for their computers or TVs.

There was pro composite capture under 200 for a long time, and consumer ones for 30-ish, and that's dealing with analog. HDMI without DRM should produce even cheaper devices, but yeah, everyone has to embrace DRM to call it HDMI.


The UVC one mentioned looks to be in the ~$400 USD price range, which is a far cry from the $800 stuff the article author looked at, but it's still more expensive than the <$100 solution presented.

Was saving that $300-$400 worth the time needed to reverse engineer the cheaper solution? It's all about tradeoffs really. At least it's open source, so other people can use it now without the time involved. Also, if you need to read in from 5-10 cameras, the cost can grow very quickly.


If you want to save cash you can try your luck on AliExpress.

Here's a device that looks a lot like the Magewell USB 3 UVC HDMI grabber for $118 (with free shipping):

http://www.aliexpress.com/item/Free-Shipping-XI100D-UVC-USB3...


I just bought that device 2 weeks ago... and tried it, but so far haven't gotten any video out of it under Linux or Windows.


They certainly seem a bit confused about what kind of USB 3 connector they are meant to put on their box!


It's not HDMI DRM that's increasing the costs. HDMI is a far higher bandwidth signal than composite/S-Video, as is USB 3, so you have to design a more complex chip and PCB.

These are slightly more niche products too, so you don't get the economy of scale you would with something like an ARM SoC.

Also, these UVC grabbers deal in uncompressed video, so they are more expensive but produce a far higher quality capture.

The £45 Cat5 grabber is performing some pretty heavy compression to fit the HD video signal down a 100 Mbit Ethernet connection.
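Rough numbers, assuming a 1080p60 source:

    1920 x 1080 x 60 fps x 24 bpp ≈ 2.99 Gbit/s of raw video
    2.99 Gbit/s into ~100 Mbit/s  ≈ 30:1 compression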


> HDMI without DRM should produce even cheaper devices, but yeah, everyone has to embrace DRM to call it HDMI

Not really. HDMI capture devices actually can't "embrace" DRM; no (legitimate) HDMI capture device is permitted to decode HDCP at all since that would defeat the point of HDCP.

HDMI capture devices are expensive because it really is difficult -- 1080p is a lot of bytes per second to decode, re-encode, and record to permanent storage.


The output image uses Rec. 709 R’G’B’ components, where the darkest value is 16 and the brightest is 235, which is why the colors look washed out. It would be fair to convert back to the original color space for the comparison.

https://en.wikipedia.org/wiki/Rec._709


Cool hack, but the part about the cost of off-the-shelf solutions is wildly wrong - Blackmagic's HDMI capture solutions can be purchased for less than $145 for PCIe and $200 for USB 3, and they have free drivers & SDK for Linux.


Blackmagic's Linux drivers are gratis, but not libre. And of course they aren't upstream. This means that they are only guaranteed to work if you use exactly the same kernel that Blackmagic supports -- we've had all sorts of problems with them: the cards lock up (requiring a reboot before they will work again); the whole system locks up; the drivers were randomly overwriting other processes' memory (!); etc.

However, our reliability requirements may be different from yours and the OP's, as we (at http://stb-tester.com ) use video capture for automated UI testing, so we need rock-solid reliability.

I concur with the other commenter that devices with an upstream driver -- such as the UVC driver -- are the way to go.


The reason why Blackmagic hardware is cheaper is that it's rubbish. Bad hardware, bad firmware, bad drivers. Everyone I know that's ever owned Blackmagic (myself included) has had a bad time. It's not worth buying from them just to save a hundred dollars.


Yup, but not as bad as RED hardware.

I've personally seen four RED Rockets die. Considering the price at the time, for what was basically a glorified JPEG 2000 decoder, it was bollocks.


Indeed, Blackmagic stuff is junk (although I am biased)


Could you link to examples of this? I am still looking for sane HDMI capture devices for the project that spawned this; I just could not find any decent proof that these drivers existed.


I coded the Blackmagic support for OBS Studio. It was only slightly nightmarish.

https://github.com/jp9000/obs-studio/tree/master/plugins/dec...


Thanks for creating that. I use OBS so this sounds perfect.

Could you recommend a Blackmagic product supported by your plugin that can capture from HDMI (the cheaper the better; I plan to use it for hobbyist-level capture of desktop/games)?


I'm not who you responded to, but the DeckLink Mini Recorder (towards the bottom of https://www.blackmagicdesign.com/products/decklink/models) is a very capable PCI-e HDMI and SDI capture card for around £100/$140.

Apart from possibly needing an update to the kernel package, all of the DeckLink devices are supported by the same SDK, with only some small variations (like giving the user the choice of selecting which input connector to use or similar).


Awesome!

I used the Intensity Shuttle for USB 3.0 on a Macbook (OS X) to develop the plugin. It costs $190 and works pretty well, but can't capture 1080p @ 60fps (only 720p/1080i). I only got it because I couldn't install a PCI-e card on my laptop ;)

If you need 1080p @ 60fps, the Intensity Pro 4K is also $190 and is a PCI-e card. Otherwise, the Mini Recorder should work just fine.


A not-cheap but fully open source (hardware and firmware) solution also exists: https://github.com/timvideos/HDMI2USB


What about HDMI DRM? Does it break it (illegal) or fail when it is used?


Given the vast number of devices on Amazon that strip HDCP, and the fact that the master key was leaked many years ago, I doubt anyone that could act on this "illegal" activity cares anymore.


It's my understanding that the countermeasures being completely broken (or not) doesn't matter.


De jure it does not. De facto, well... at the point where you're selling gizmos on Amazon and nothing's happening to the sellers or the purchasers, it probably doesn't matter.


This device happily decodes HDCP. I figured the manufacturer does not care.


>> This device happily decodes HDCP.

If used as intended, will it allow an HDCP source to show on a non-HDCP display? Because stupid Apple TV won't allow "protected" content to display on my main TV because it doesn't support HDCP, and if you buy something on iTunes you don't find out until you've already purchased it. DRM is driving me away from services like that, and I'd rather just stop using them than buy another TV.


You're probably better off buying an HDMI-in/out device like [1]. It's a "splitter", but you can just plug the HDMI-in in one side and one HDMI-out in the other, and since it doesn't enforce HDCP on the output, it works as a stripper.

[1] https://www.amazon.com/ViewHD-Powered-Splitter-1080P-Model/d...


Hurray! This will therefore be quite handy for MythTV, eventually. It will require some messing around to integrate it into the Myth system.


That is awesome, and would be something I would try and do right now, if I hadn't got rid of my cable subscription in favour of over-the-air Freeview (I'm in the UK).


> (illegal)

[citation needed]


This is effectively a lossy capture though, since the HDMI->IP converter is re-encoding the video to MJPG before transmitting it.

Hard to beat for the price, though


"These units are not perfect for quality, however they are not bad at all, Here is a screenshot taken on my laptop..."

I am confused by this ... HDMI is a digital interconnect and deals in digital I/O ... in my mind, if you succeed in capturing that HDMI, the resulting stream should be identical ... why would there be a quality (or any other) difference ?


Because they compress the video before sending it over IP. Nothing fancy, just M-JPEG which is basically the same thing as taking a screenshot of the video every nth of a second and saving it as a high-quality JPEG (probably around 95% quality rating).

The screenshot in the video suggests 1000fps but that figure is highly dubious. It is much more likely to be 30 or 60fps on a cheap HDMI->IP converter like this.


1000fps, what the hell? HDMI only supports up to 60fps (which makes sense considering the intended use is sending video to a display device like monitor/tv/projector).


You nearly always want the capture device to compress because it's otherwise very hard to handle. 1080p/60/8 HDMI is 4.46Gbps; just about fits in a USB3 or PCIe3 data link, but exceeds most SSDs' write bandwidths.


Most half-decent SSDs can saturate a 6 Gbps SATA link in either direction; that's why they're moving to PCIe.

Also, 1920x1080x60x24 is just under 3Gbps; where are you getting 4.46?

And even if you wanted to record compressed video, you'd almost certainly get better quality feeding it through other HW encoders if you didn't want to spend CPU time on it.


Recompression via a lossy algorithm probably


I wish there were something that worked on Android for HDMI input. It would be great to be able to repurpose tablets as external/field monitors. Some UVC dongles work, but I haven't seen any low-cost 1080p ones.


There are a whole lot of camera monitors for pro video that would die that day.


How would one go about compressing the captured stream?

I read it can be containerized as an MKV, but then would you re-encode as H.264, or is there a better codec?

I assume there is a quality vs time to encode trade-off?

Also, would this be a component in a viable DIY HDMI matrix switch?

I see this 4 input / 8 output 4K HDMI matrix switch for only $169 on Amazon here: http://amzn.to/28Ykyor

So, cost-wise not viable, but I like the idea of having more programmatic control of the switch.



Appreciate the link.

I had not considered using a Raspberry Pi on the receiving end.

Any thoughts on using multiple Pis to receive the same stream?

I would like to take the output from an HDMI source and display it on multiple displays throughout my house.


Use any codec scheme you like. ffmpeg and mencoder are the usual suspects, or Handbrake if you like GUIs to help you along.

Realtime encoding is within the reach of some schemes on some processors.
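For example, a rough ffmpeg sketch (assuming the raw capture was saved as capture.mkv; the preset and CRF are just a starting point for trading encode time against quality):

    ffmpeg -i capture.mkv -c:v libx264 -preset veryfast -crf 18 -c:a copy compressed.mkv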


> Realtime encoding is within the reach of some schemes on some processors.

Can you elaborate on that?

What sort of setup (hw/sw) would one need to make this viable (realtime encoding from the stream)?


Intel GPUs + VAAPI (https://01.org/linuxgraphics/community/vaapi) can easily do H.264 encoding in real time.

P.S. See also https://trac.ffmpeg.org/wiki/HWAccelIntro
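A rough sketch of the ffmpeg incantation (assuming an Intel GPU whose render node is /dev/dri/renderD128 and an ffmpeg built with VAAPI support):

    ffmpeg -vaapi_device /dev/dri/renderD128 -i capture.mkv \
        -vf 'format=nv12,hwupload' -c:v h264_vaapi -qp 20 -c:a copy capture_vaapi.mkv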


Are the gamer HDMI capture devices expensive because they have a low latency? (Besides the verticality of the market.)


I believe they are also doing the encoding into a compressed format.

The method in the link gives you the raw HDMI stream, which you can containerize to play in VLC and the like, but you wind up with some pretty large files.

You could potentially be capturing a raw 4 to 18 Gbps stream.


The method in the link involves capturing whatever the HDMI extenders use as their compression format, which usually is MJPEG or some (I-frame only) variant of H.264.

[Edit: Added MJPEG, as this seems to be the case for the device in the article]


Any capture devices you know that are decent? I've been toying with getting one to do demos.


I use the AVerMedia Game Capture HD 2 to capture HDMI from Xbox One and PS4 and I've been very happy with it.

Unlike a lot of such devices, this one is stand-alone: I don't plug it into a PC. I just use it as an HDMI pass-through and plug a USB 3 external hard drive into it (it also has a bay for an internal 2.5" SATA HDD, but I use an external drive so I can just pull it out of the capture device and plug it into a computer to get the captured MP4 files over quickly and easily for editing).

The stand-alone nature of this device is pretty great because it means you don't need to muck around with drivers on your other computers, though YMMV depending upon specific usage; a lot of people like to stream live directly to places like Twitch, and this device isn't really well suited for that sort of thing.

Also, this device (AFAIK) doesn't do HDCP pass-through. Game systems generally don't enable HDCP during gaming (but do during, say, Blu-ray playback). So that certainly makes it less flexible than solutions which allow HDCP capture.


I have a Hauppauge HD PVR2 "Gaming Edition", which now costs about $135 and which I can recommend. The only downside is there's a long "boot time" after you first tell it to record video (don't expect to just hit a hotkey and capture video 2 seconds later; give it more like 10-15).

For what I'm using it for, latency doesn't matter much. I'd estimate it has about 1.5-2 seconds of latency, but I've never actually measured it.


Off-topic but somewhat related: I was thinking about using my iDevices as generic HDMI displays.

The best app for doing this, AFAIK, is Duet Display [1], if you want to use Mac/Win. But I want to use it with a Raspberry Pi, for example. Duet Display streams the display information to its own application via Ethernet over the Lightning port.

HDMI itself is also a kind of network protocol (dangerously superficial knowledge). Theoretically it should be possible to build a small device which translates HDMI video/audio into the same kind of stream over the Lightning port that Duet uses, and have the iDevice display it?

What do you think? Do you see any pitfalls there, which I don't see?

[1] http://www.duetdisplay.com/


How is the latency with something like this? I would assume it's pretty much real time when dealing with the raw data; encoding could add some latency. Am I correct?


Most of the latency comes from internal buffering inside ffmpeg or VLC. Most of it goes away if you tune the buffers down.
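For example (a sketch; the udp:// URL is just a placeholder for wherever your stream comes from, and 50 ms may be too aggressive on a lossy network):

    # ffplay: disable input buffering
    ffplay -fflags nobuffer -flags low_delay udp://...

    # VLC: shrink the network cache from the default (~1000 ms) to 50 ms
    vlc --network-caching=50 udp://...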


This is quite interesting. I've been toying with building a capture box myself for presentation/gaming streaming and recording. It takes two inputs -- HDMI in from the monitor, and HDMI/other digital for the camera -- and one output: HDMI out back to the monitor. It can be configured to record to its internal SSD or stream (to popular streaming services). I run a meetup and would love an easy way to record talks; there isn't much out on the market currently, which is unfortunate. I'm really happy to see stuff like this surface in the OSS world.


That's a brilliant solution actually. Nicely out of the box.


What is the purpose of something like this?


Capture of HDMI video over the network, driverless, on any OS that can accept a video stream.


Does this work with encrypted video? The sample image came from an unencrypted source, but it seems like a big violation of the HDMI spec if these IP extenders strip the HDCP off before blasting the data over an IP network.

This could be a rather major caveat to this solution if it only works on unencrypted sources.


I love reading these. I always wonder if the original hardware/software/firmware engineer reads this and goes:

  A. Well, Duh!
  B. Well played reverse engineering Sir!
  C. Oh no, I need to do better next time!


    D. I hope management doesn't see this...


This is awesome. I was just thinking about something that could be even more awesome: if someone produced an HDMI to MIPI CSI adapter, that would allow e.g. a Raspberry Pi to have an "HDMI in" port.


What are the minimum hardware specifications for reading the IP data? Would it, for example, work with a Raspberry Pi, or are the ARM CPU and the 100 Mbit network adapter too slow?


Can it capture a 5.1 AC3 audio stream along with the video?


I don't know the details, but this is from the article:

"[...] by default it will output matroska, aka mkv with both video and audio so that it can be easily interfaced with (no compression happens in this stage, just containerisation).


Original source of the research, 25 January 2014: https://blog.danman.eu/reverse-engineering-lenkeng-hdmi-over...

A new version produces ordinary H.264 and can stream directly to YouTube/Twitch:

https://blog.danman.eu/new-version-of-lenkeng-hdmi-over-ip-e...



