
Where are you coming up with these numbers, 20-40 Mbps for a single video stream?

That's insane. Like I mentioned in another comment, high-quality rips of 1080p movies tend to be 5 Mbps, and that's generally with a lot more detail and motion than what your camera is filming at home.

There's really no point in streaming 1080p video above Zoom's 3.8 Mbps. Even that's already overkill, which is why almost nobody uses it. The 20 Mbps you're talking about is past the threshold of perceptual losslessness for this kind of content, and 40 Mbps is far beyond it.

And beyond that? There's virtually no use for 4K video in personal communications; nobody wants to see the pores in your nose. It's not about a technological limitation, it's about use cases. A 20 Mbps uplink handles the real-life situations of most families and roommates just fine, when you use actual real numbers and not the ones you're making up.
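To put rough numbers on that, here's a back-of-the-envelope sketch in Python. The household mix below is a hypothetical assumption; the 3.8 Mbps per call is the Zoom 1080p figure above.

  # Uplink budget sketch. The stream mix is hypothetical; 3.8 Mbps per
  # 1080p call is the Zoom figure mentioned above.
  UPLINK_MBPS = 20.0

  streams = {
      "1080p video calls": (3, 3.8),  # (count, Mbps each) -- assumed
      "screen share":      (1, 2.0),  # assumed
      "cloud backup":      (1, 5.0),  # assumed background upload
  }

  used = sum(count * mbps for count, mbps in streams.values())
  print(f"uplink used: {used:.1f} / {UPLINK_MBPS:.0f} Mbps")  # 18.4 / 20 Mbps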



If you can’t tell the difference between 20 Mbps and 200 Mbps you need glasses. This isn’t even some audiophile-tier argument about an extra 60 kbps in MP3 or whatever; there are obvious compression artifacts in 20 Mbps real-time-encoded CBR video, especially on any kind of remotely high-entropy scene.

> high-quality rips of 1080p movies tend to be 5 Mbps

This is nonsense. A decent 20-30 GB BRRip will be at least 25-30 Mbps. It’s also not a fair comparison, because that isn’t real-time encoding: if you can afford to encode at 5% of real-time speed, you can get much better compression ratios.
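The size-to-bitrate arithmetic is easy to check yourself; a minimal sketch, assuming a 2-hour runtime:

  # Average bitrate implied by a file size (decimal GB) and runtime.
  def avg_mbps(size_gb: float, runtime_min: float) -> float:
      bits = size_gb * 8 * 1000**3
      return bits / (runtime_min * 60) / 1e6

  print(round(avg_mbps(4.5, 120), 1))   # ~5.0 Mbps  -> a ~4.5 GB rip
  print(round(avg_mbps(25.0, 120), 1))  # ~27.8 Mbps -> a 25 GB BRRip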


> If you can’t tell the difference between 20 Mbps and 200 Mbps you need glasses.

I think you're the one imagining things here.

Like I said in an earlier comment, I record h.264 with the "ultrafast" preset at CRF 14. Every guide out there says that's already past the point of being indistinguishable from raw footage, and my own eyes agree after extensive side-by-side comparisons. I go as low as 14 precisely because it's overkill. There's simply no visible difference. And it averages 34 Mbps for the kind of stuff I shoot, which, as I said, is already overkill.
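For reference, the settings I mean look something like this; a sketch via Python's subprocess calling ffmpeg (whether you drive x264 through ffmpeg or something else, the preset and CRF are the same; filenames are placeholders):

  # Sketch of the encode described above: x264 "ultrafast" preset at CRF 14.
  # Filenames are placeholders.
  import subprocess

  subprocess.run([
      "ffmpeg",
      "-i", "capture.mkv",     # placeholder input
      "-c:v", "libx264",
      "-preset", "ultrafast",  # cheapest preset, worst compression efficiency
      "-crf", "14",            # lower CRF = higher quality and bitrate
      "-c:a", "copy",          # pass audio through untouched
      "out.mkv",               # placeholder output
  ], check=True)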

200 Mbps is insanely, unnecessarily high, even for real-time encoding on a crappy CPU.


> But there's simply no visible difference.

What are you filming? It sounds like you’re streaming a video of your face; how are you capturing it? If your video is blasted with shot noise from a 10 mm^2 sensor through a 4 mm-wide lens, where “1080p” is more a recording convention than a faithful description of the resolving power of the system, I can see how 34 Mbps might look as good as whatever you see uncompressed.


>If you can’t tell the difference between 20 Mbps and 200 Mbps you need glasses.

Can you provide examples?

https://www.screenshotcomparison.com is especially suited for this.


> especially on any kind of remotely high-entropy scene

Unfortunately my job is pretty boring. Most of my video chats are low entropy.


>That's insane. Like I mentioned in another comment, high-quality rips of 1080p movies tend to be 5 Mbps, and that's generally with a lot more detail and motion than what your camera is filming at home.

I would double that, at least for movies. Not sure you'd see an increase in quality with smartphone-style cameras, though.



