
I think a lot of people might take issue with the claim “plenty high”. It sounds a lot like ISPs insisting “25 down is plenty! it’s only pirates who need more than that!”, and that was only a few years ago.

Using your numbers (and pretending we haven’t already sailed past 1080p as a standard), if we expect better than garbage Zoom-tier quality at 20-40 Mbps for one stream, most of our ISP connections are currently woefully slow on the upload side.

Many, many people share a household with others, whether roommates, partners, or kids. With 3 people on high-quality video chat, the need jumps from 20 Mbps to 60+. Put 3 kids on video for school or play plus a parent or two working from home, and suddenly you’re at 100+ Mbps of upload.
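A quick back-of-envelope sketch of the household math above, taking this comment’s assumed 20 Mbps per high-quality stream at face value (that per-stream figure is the comment’s assumption, not a measured one):

```python
# Back-of-envelope upload demand for a household running simultaneous
# video calls, using the comment's assumed 20 Mbps per stream.
PER_STREAM_MBPS = 20  # assumption from the comment above, not measured

def household_upload_mbps(streams: int, per_stream: float = PER_STREAM_MBPS) -> float:
    """Total upload bandwidth needed for `streams` concurrent calls."""
    return streams * per_stream

print(household_upload_mbps(3))  # 3 people on calls -> 60 Mbps
print(household_upload_mbps(5))  # 3 kids + 2 working parents -> 100 Mbps
```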

And that’s ignoring that 1080p is becoming old news. I haven’t looked recently, but when buying my last monitor set a year ago, 4k seemed to be by far the most commonly available.

This whole “x amount of bandwidth is plenty” argument almost always ignores a) the real-life situations of most families/roommates and b) where technology is heading.



Where are you getting this 20-40 Mbps figure for a single video stream?

That's insane. Like I mentioned in another comment, high-quality rips of 1080p movies tend to be 5 Mbps, and that's generally with a lot more detail and motion than what your camera is filming at home.

There's really no point in 1080p streaming video above Zoom's 3.8 Mbps; even that's overkill, which is why almost nobody uses it. The "20 mbps" you're talking about is past the threshold of perceptually lossless, and 40 is well past it.

And beyond that? There's virtually no use for 4K video in personal communications; nobody wants to see the pores in your nose. It's not a technological limitation, it's about use cases. A 20 mbps uplink handles the real life situations of most families and roommates just fine, when you use actual real numbers and not ones you're making up.
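To put numbers on the claim above: using Zoom's ~3.8 Mbps figure for a 1080p send (cited in this comment), here's roughly how many concurrent calls a given uplink can carry. This is a simple sketch that ignores protocol overhead and other traffic on the link:

```python
# How many concurrent 1080p calls fit in a given uplink, using the
# ~3.8 Mbps per-call figure cited in the comment above. Ignores
# protocol overhead and any other traffic sharing the link.
ZOOM_1080P_MBPS = 3.8

def max_concurrent_calls(uplink_mbps: float, per_call: float = ZOOM_1080P_MBPS) -> int:
    """Whole number of calls that fit in the uplink."""
    return int(uplink_mbps // per_call)

print(max_concurrent_calls(20))   # 20 Mbps uplink -> 5 simultaneous 1080p calls
print(max_concurrent_calls(100))  # 100 Mbps uplink -> 26
```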


If you can’t tell the difference between 20Mbps and 200Mbps you need glasses. This isn’t even some audiophile-tier argument about an extra 60kbps in MP3 or whatever; there are obvious compression artifacts in 20Mbps real-time-encoded CBR video, especially on any remotely high-entropy scene.

> high-quality rips of 1080p movies tend to be 5 Mbps

This is nonsense. A decent 20-30GB BRRip will be at least 25-30Mbps. Also, it’s not a fair comparison because it’s not real time encoding. If you can afford to encode in 5% real-time speed you can get much better compression ratios.
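The bitrate implied by the file sizes above is easy to check: file size and runtime give the average bitrate directly. A quick sketch, assuming a roughly 2-hour runtime (the runtime is my assumption; the file sizes come from the comments above):

```python
# Average bitrate implied by a file size and runtime. The ~2-hour
# runtime is an assumption; the file sizes come from the thread.
def avg_bitrate_mbps(size_gb: float, runtime_s: float) -> float:
    """Average bitrate in Mbps for a file of size_gb (decimal GB)."""
    bits = size_gb * 1e9 * 8        # GB -> bits
    return bits / runtime_s / 1e6   # bits/s -> megabits/s

print(round(avg_bitrate_mbps(25, 2 * 3600), 1))   # 25 GB rip -> ~27.8 Mbps
print(round(avg_bitrate_mbps(4.5, 2 * 3600), 1))  # 4.5 GB rip -> ~5.0 Mbps
```

So a 25 GB rip of a 2-hour movie does work out to roughly the 25-30 Mbps claimed here, while a ~5 Mbps average implies a file closer to 4-5 GB.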


> If you can’t tell the difference between 20Mbps and 200Mbps you need glasses.

I think you're the one imagining things here.

Like I said in an earlier comment, I record h.264 with the "ultrafast" preset at CRF 14. Every guide out there says this is past the point of being indistinguishable from raw footage, and my own eyes agree after extensive side-by-side comparisons. I go as low as 14 precisely because it's overkill; there's simply no visible difference. And that averages 34 Mbps for the kind of stuff I shoot, which, as I said, is already overkill.

200 Mbps is insanely unnecessarily high, even for real-time encoding on a crappy CPU.


> But there's simply no visible difference.

What are you filming? It sounds like you’re streaming a video of your face; how are you capturing it? If your video is blasted with shot noise from a 10mm^2 sensor through a 4mm-wide lens, where “1080p” is more of a recording convention than a faithful description of the resolving power of the system, I can see how 34Mbps might look as good as whatever you see uncompressed.


>If you can’t tell the difference between 20Mbps and 200Mbps you need glasses.

Can you provide examples?

https://www.screenshotcomparison.com is especially well suited for this.


> especially on any kind of remotely high-entropy scene

Unfortunately my job is pretty boring. Most of my video chats are low entropy.


>That's insane. Like I mentioned in another comment, high-quality rips of 1080p movies tend to be 5 Mbps, and that's generally with a lot more detail and motion than what your camera is filming at home.

I would double that, at least for movies. Not sure you’d see a similar increase in quality with smartphone-style cameras, though.


The family that can equip their kids with production-grade 4k video kit can probably afford 100Gbit business internet service to their house tbh.

A 4k UHD Netflix stream is ~20Mbps. 1080p is usually about 5-6Mbps, and 99% of people say it looks great and is all they want.

4k UHD is not needed for effective video chat in most business and personal use. And a video call wouldn't even need the same bitrate as a movie stream, since it's a relatively static image and thus easy to compress.

Your image is typically a little square on the screen too (not the full display size). It is highly unlikely consumers will ever shell out for the camera equipment to create high quality images that need such bandwidth, even if such bandwidth becomes common.

Moore's law will maybe push this all forward in time, but what you describe is a total exaggeration of the current situation.


None of the video streaming software is set up for that, because nobody's internet can upload at those rates. The best I can do is a 1080p SLR ($350, once) plus clicking the HD button in Zoom, and most of the improvement comes from the better optical system. All the low frame rates, micro-stutters and so on still exist, adding to Zoom fatigue.



