
I'm not convinced.

Take current 100/20 Mbps speeds. An average Zoom call uses 0.6 Mbps upstream, while a super-HD 1080p one uses 3.8 Mbps up. (And virtually nobody videoconferences from home in 1080p anyways, who wants coworkers to see your skin blemishes in maximum detail?!)

So a connection with 20 Mbps upload supports 33 users in theory, or 5 at super-HD. Even allowing for half that in practice... seems fine to me.

The 100 Mbps download is necessary when you've got someone watching an 8K VR video on their headset, and a few 1080p movie streams going as well, which is more reasonable in a family setting.

But most people just don't have any use for upload speeds anywhere near as fast as download. Or at least certainly not until we start doing 8K 3D video calls in front of massive 3D displays, which isn't anytime soon...

[1] https://support.zoom.us/hc/en-us/articles/201362023-System-r...
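
(A quick back-of-the-envelope check of those numbers, as a Python sketch; the per-call bitrates are the Zoom figures quoted above.)

    upload_mbps = 20.0
    zoom_avg_up = 0.6   # average Zoom call, upstream (Mbps)
    zoom_hd_up = 3.8    # "super-HD" 1080p, upstream (Mbps)

    print(int(upload_mbps // zoom_avg_up))  # 33 simultaneous average calls
    print(int(upload_mbps // zoom_hd_up))   # 5 simultaneous 1080p calls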



>But most people just don't have any use for upload speeds anywhere near as fast as download. Or at least certainly not until we start doing 8K 3D video calls in front of massive 3D displays, which isn't anytime soon...

Most people don't have any use for upload speeds because very few have upload speeds that can be used in an interesting way.

If lots of people had lots of upload, you might start to see widespread commercial products for home-hosting what currently goes in cloud providers and social media sites.


Hard to imagine what in the home would need 1 Gbps of bandwidth. What could possibly generate that much data?


I have symmetric 1 Gbps. It's not about what generates that much data. It's about things that become feasible.

I remember when a decade ago I occasionally needed to share large VM images with a customer. I'd set up my notebook to upload it over the weekend and hope the upload wouldn't abort. Occasionally remoting in from home to check it was still uploading.

Now a 50 GB VM image is absolutely nothing to me. I can let people pull that from me during a coffee break, instead of scheduling it around a weekend.

It's not that sharing a 50 GB image was impossible before. It's that I can do it so quickly now that I can do things like these much more liberally and don't have to plan around them as much.
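
(For a rough sense of the difference, a sketch; the 50 GB size is from above, and the 10 Mbps uplink for the old connection is an assumption.)

    def transfer_minutes(size_gb, link_mbps):
        # gigabytes -> megabits, divided by link rate, converted to minutes
        return size_gb * 8000 / link_mbps / 60

    print(transfer_minutes(50, 1000))  # ~6.7 minutes on symmetric gigabit
    print(transfer_minutes(50, 10))    # ~667 minutes (11+ hours) on a 10 Mbps uplink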


This is a relatively niche use.

It’s unrealistic to expect mass-market household products to be engineered to support a small number of users with bespoke requirements.


The mass market is engineered this way with fiber, and it's great.

All households benefit if you think about video uploads, backups, etc.


1 Gbps is a lot of video though. Something like 10 streams of the highest quality and like 40-100 of really good quality.
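
(Roughly the arithmetic behind that, as a sketch; the per-stream bitrates are assumptions for "highest" vs "really good" quality.)

    link_mbps = 1000
    highest_quality = 100  # assumed ~UHD Blu-ray-class bitrate (Mbps)
    really_good_hi = 25    # assumed top-end streaming bitrate (Mbps)
    really_good_lo = 10    # assumed solid 1080p streaming bitrate (Mbps)

    print(link_mbps // highest_quality)  # 10 streams
    print(link_mbps // really_good_hi)   # 40 streams
    print(link_mbps // really_good_lo)   # 100 streams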


It means really fast docker push, VM uploads, backups, etc.

It also means I can keep an Ubuntu mirror, a private Docker repo, possibly an IPFS node in the future, and a backup server, all at home.

I also never have to worry about my work/uploads saturating connection and interfering with wife/kids connections.

Growing up with 20 KB/s, I'd take the fastest connection I can get; if I could upgrade to 10 Gb/s or more I would :)


My father routinely had to transfer multiple gigabytes of CAD model data to colleagues and clients back when he was working in aerospace as a private contractor. He lived in a rural area and had to get by with satellite internet at the time. The real killer for him was the data caps and throttling that the satellite provider would impose after transmitting some ridiculously small amount like 25GB.

Not every worker in the world goes to work in a shared office space with enterprise class Comcast or fiber accounts. Modern businesses of many types can benefit or even must have fast uploads.


A self-hosted website with high traffic? Maybe you're hosting high-quality video, or a high-tick multiplayer game server? Perhaps those specific examples are niche use cases, but the idea is that they'll stop being so niche if symmetrical upload/download becomes more commonplace.


People have 1 Gbps connections today and hardly any of them do those things.


I have a gigabit home connection. No I don’t need it, but having to wait less to move data around is nice.

A week ago I downloaded a 4.5 gigabyte photo from the ESA to print and put on my wall, yes all of those pixels mattered. It took closer to 30 seconds than the ten minutes it would have otherwise on a more reasonable “fast” home internet connection. So there’s several minutes of my life I probably would have just sat around waiting for otherwise.

It’s not about big constant needs but making the delay shorter on many smaller things. (i.e. latency over throughput, but latency not in ms for computer networking but seconds and minutes for tasks on a human level)
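
(The arithmetic, as a sketch; the 60 Mbps figure for a "fast" connection is an assumption chosen to reproduce the ten-minute estimate.)

    size_gb = 4.5
    print(size_gb * 8000 / 1000)  # 36 seconds at 1 Gbps
    print(size_gb * 8000 / 60)    # 600 seconds (10 minutes) at 60 Mbps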


The world isn't constrained by your imagination :)


Full-resolution VR immersion telepresence with haptic data.


I remember when even most IT people were utterly baffled by the idea that individual households could want a 1 megabit line.


I really doubt that. Video phones were a sci fi thing for a century.


> while a super-HD 1080p one uses 3.8 Mbps up. (And virtually nobody videoconferences from home in 1080p anyways, who wants coworkers to see your skin blemishes in maximum detail?!)

I posit the real reason is because no-one makes a decent webcam for desktops and laptops, not even Apple.


Huh? The Logitech 920 has been around for years, and produces a crystal-clear 1080p image in regular office lighting conditions.

Apple uses a lower-quality webcam in laptops because of the thinness of the lid.


I have a 920. It’s awful when compared to my 12 year-old DSLR’s video-out or my 6 (7?) year-old iPhone 6.

Logitech needs to stop using cheap plastic lenses and excessively compressing video. USB3 has sufficient bandwidth.


That's not my experience at all. Crystal-clear and virtually zero sensor noise in regular light conditions. Maybe you got a bad one...?

Also it's just compressing to MJPEG, which is far higher quality than any subsequent h.264 compression your CPU would ever do. The Logitech Brio transmits raw data over USB3 like you're asking, but the resulting image quality is identical -- I've tested them side-by-side.

(Ancient 920's performed on-camera h.264 compression which was later removed -- if you have one of those, be sure to select the MJPEG stream instead.)


I used to work on a commercial live streaming software app and we routinely had C920s fail from the actual USB wire shorting out. The Brio is overall a much better camera, particularly with the detachable USB-C cable.


https://reincubate.com/support/how-to/why-are-webcams-bad/ is a good article analysing webcams and phone cameras in various ways. The C920 looks pretty bad there.

(HN discussion at the time: https://news.ycombinator.com/item?id=25869460.)


That is a great article, but literally everything he complains about on the C920 comes down to the "auto" settings.

You absolutely do need to manually set and lock exposure, focus, white balance, and ensure you have proper lighting. Once you do that, it works like a charm. And this advice is what pretty much every streaming tutorial on YouTube advises.

The softness issue is particular to the Logitech Brio, which he notes, and which matches my own testing. The C920 doesn't suffer from that.

Yes I wish the software for the camera settings was more intelligent. But ultimately that's a software issue that's fixable with manual settings. The hardware is solid, provided normal lighting conditions.


"Once you work around the problems they aren't problems at all!"


The hardware is there. Google Meet maxes out at 720p.


Nah. Mostly that comes down to things like optics, large sensors, etc.

The bandwidth needed for a high quality 1080p stream isn’t gonna be much more than one from a shitty 1080p camera. The better image would benefit from less compression / more bits, but it’s not an orders of magnitude difference.


I'd posit it's because most people don't have the upload bandwidth.


3.8Mbps 1080p is hardly 1080p. Decent quality starts at several 10s of Mbps, and high quality 1080p is in the 200-400Mbps range.

Ideally in the future you would A) be broadcasting directly to other participants, not going through zoom’s servers, which might multiply upload bandwidth needs B) be broadcasting at several 10s of Mbps per stream.

I definitely prefer higher bandwidth vconf. I’m not worried about blemishes, but I want to see people’s minute facial expressions better.


> Decent quality starts at several 10s of Mbps, and high quality 1080p is in the 200-400Mbps range.

That's so incorrect I don't even know where to begin.

I regularly record my own "talking head" in 1080p from OBS in h.264 at CRF 14 in "ultrafast", which is extreme quality/space overkill (an image literally indistinguishable from the original, extremely low CPU usage), and that's an average of 34 Mbps.

I then immediately recompress the resulting enormous file to a still extremely high-quality CRF 21 in "veryslow" to be suitable for editing and subsequent recompression later, and that results in an average of 2.4 Mbps.

For comparison, high-quality 1080p h.264 movie and television releases on torrent sites are usually around 5 Mbps. The difference is that half my frame is a plain/stationary background while movies and TV have complex backgrounds, hence double the bitrate.

I have to ask -- where did you get the idea that "high quality 1080p is in the 200-400Mbps range"? That's off by two entire orders of magnitude. It's almost all the way to raw uncompressed 1080p, which is 1,500 Mbps.
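
(For reference, the second step of that workflow looks roughly like this; a sketch with placeholder file names, using the CRF/preset values mentioned above.)

    import subprocess

    # Recompress an OBS "ultrafast" CRF 14 recording down to CRF 21 "veryslow",
    # keeping the audio as-is. File names are placeholders.
    subprocess.run([
        "ffmpeg", "-i", "obs_recording.mkv",
        "-c:v", "libx264", "-preset", "veryslow", "-crf", "21",
        "-c:a", "copy",
        "recompressed.mp4",
    ], check=True)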


> high-quality 1080p h.264 movie and television releases on torrent sites are usually around 5 Mbps.

BRRips are frequently 30+GB for a 1-2 hour movie. Do the math. You’re off by like 5-8x.

> I have to ask -- where did you get the idea that "high quality 1080p is in the 200-400Mbps range"?

From actually filming and editing video.

Keep in mind that real-time encoders (such as the one in a video recorder or that zoom has to use for reasonable latency) are pretty constrained and will generally achieve worse ratios. If you need to get 3Mbps in real-time on a laptop your only option is basically to quantize the shit out of the video. Releases that can encode slower-than-real-time can use longer GOP, B-frames, etc.

> It's almost all the way to raw uncompressed 1080p, which is 1,500 Mbps.

10bit 444 1080p60 is 3.7Gbps.


> BRRips are frequently

That's something entirely different, not what I was talking about. Also, if a Blu-Ray gives you the space, there's no reason not to use it. That doesn't mean you need it. Which is precisely why the rips that people commonly share don't.

> From actually filming and editing video.

That's for recording frames independently for editing on a professional camera. Not for anything you'd ever transmit live to a consumer in a million years.

> will generally achieve worse ratios

Worse than 200-400Mbps? What are you even talking about? Even an iPhone encodes to just 16 Mbps. Which is definitely not greater than 400.

> basically to quantize the shit out of the video.

Looks fine to me. It doesn't need to be lossless. It just needs to be good. I've never heard anyone complain about an iPhone "quantizing the shit" out of their video. To the contrary, people love iPhone-quality video.

> 10bit 444 1080p60 is 3.7Gbps.

Obviously I'm talking about 8-bit 30fps (444).
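
(Both raw-bitrate figures check out under their stated assumptions; a sketch of the arithmetic.)

    def raw_gbps(width, height, components, bit_depth, fps):
        return width * height * components * bit_depth * fps / 1e9

    print(raw_gbps(1920, 1080, 3, 8, 30))   # ~1.49 Gbps: 8-bit 4:4:4 at 30 fps
    print(raw_gbps(1920, 1080, 3, 10, 60))  # ~3.73 Gbps: 10-bit 4:4:4 at 60 fps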


Where you go wrong here is that a webcam image is pretty stable. Same background, person moving a little.

Sure it can't be as optimal as a non real-time compressor can, and should therefore need more bandwidth, but high compression ratios on such an image aren't difficult. It's not valid to compare professional film work or even output BluRay encodes to what's required for your typical home video call.


That’s a fair point.


>BRRips are frequently 30+GB for a 1-2 hour movie. Do the math. You’re off by like 5-8x.

No. First, "1-2 hours" is itself a factor-of-two spread. 15GB for a 2-hour movie is realistic at very high quality (CRF 18 or 19), which works out to ~15Mbps for the video, considering that there is also audio. Going further does not increase quality, but does increase file size.

It seems to me that you want to transfer intermediate-codec (or even raw) footage over the Internet.
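
(The file-size-to-bitrate conversion being argued over, as a sketch.)

    def avg_mbps(size_gb, hours):
        return size_gb * 8000 / (hours * 3600)

    print(avg_mbps(15, 2))  # ~16.7 Mbps total (video + audio) for a 15 GB, 2-hour rip
    print(avg_mbps(30, 2))  # ~33.3 Mbps for a 30 GB, 2-hour rip
    print(avg_mbps(30, 1))  # ~66.7 Mbps if the runtime is only 1 hour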


What kind of camera is recording 1080p at 200-400Mbps?

For example here https://www.red.com/recording-time if I put in 8K, 60fps and choose the lowest compression ratio, it's still saying it will only use 270Mbps. At 2K the highest bitrate seems to be 41Mbps.


> Decent quality starts at several 10s of Mbps, and high quality 1080p is in the 200-400Mbps range.

What? No. 10 Mbps is already plenty for high quality 1080p with the latest codecs. Even 1080p Blurays only run 20-40 Mbps and they're typically very conservatively compressed.


I think a lot of people might take issue with the statement "plenty high". This type of statement comes across like when we constantly heard ISPs insisting "25 down is plenty! it's only pirates who need more than this!", and that was only a few years ago when ISPs were spouting this insanity.

Using your numbers (and pretending we haven't already sailed past 1080p as a standard), if we expect better than garbage Zoom-tier quality at 20-40 Mbps for one stream, most of our ISP connections are currently woefully slow for uploads.

Many many many people have others who live in their households, whether they're roommates, partners, or kids. So if you have 3 people on video chat and we want high-quality streams, that need shoots up from 20 to 60+. If you have 3 kids on a video chat or streaming for school or play and a parent or two working from home, suddenly you're up to 100+ upload.

And that’s ignoring that 1080p is becoming old world. I haven’t looked recently, but when buying my last monitor set a year ago, 4k seemed to be by far the most commonly available.

This whole "X amount of bandwidth is plenty" argument almost always seems to ignore a) the real-life situations of most families/roommates and b) forward-moving technologies.
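
(Spelling out that household math as a sketch, using the 20 Mbps-per-stream lower bound quoted from the parent.)

    per_stream_up = 20  # Mbps, the parent comment's lower "high quality" figure
    print(3 * per_stream_up)  # 60 Mbps up for three simultaneous calls
    print(5 * per_stream_up)  # 100 Mbps up for three kids plus two working parents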


Where are you coming up with these numbers 20-40 Mbps for a single video stream?

That's insane. Like I mentioned in another comment, high-quality rips of 1080p movies tend to be 5 Mbps, and that's generally with a lot more detail and motion than what your camera is filming at home.

There's really no point in 1080p streaming video above Zoom's 3.8 Mbps. Even that's already overkill, which is why almost nobody uses it. The 20 Mbps you're talking about is past the threshold of perceptually lossless, and 40 is far beyond it.

And beyond it? There's virtually no use for 4K video in personal communications; nobody wants to see the pores in your nose. It's not about technological limitation, it's about use cases. And a 20 Mbps uplink handles the real-life situations of most families and roommates just fine, when you use actual real numbers and not the ones you're just making up.


If you can't tell the difference between 20Mbps and 200Mbps you need glasses. This isn't even some audiophile-tier argument about an extra 60kbps in MP3 or whatever; there are obvious compression artifacts in 20Mbps real-time-encoded CBR video, especially on any kind of remotely high-entropy scene.

> high-quality rips of 1080p movies tend to be 5 Mbps

This is nonsense. A decent 20-30GB BRRip will be at least 25-30Mbps. Also, it’s not a fair comparison because it’s not real time encoding. If you can afford to encode in 5% real-time speed you can get much better compression ratios.


> If you’re can’t tell the difference between 20Mbps and 200Mbps you need glasses.

I think you're the one imagining things here.

Like I said in an earlier comment, I record h.264 in "ultrafast" CRF 14. Every guide that exists says this is already indistinguishable from raw footage, and my own eyes agree after extensive side-by-side comparisons. I go as low as 14 because it's overkill. But there's simply no visible difference. And it's an average of 34 Mbps for the kind of stuff I shoot, which as I said is overkill already.

200 Mbps is insanely unnecessarily high, even for real-time encoding on a crappy CPU.


> But there's simply no visible difference.

What are you filming? It sounds like you're streaming a video of your face; how are you capturing it? If your video is blasted with shot noise from a 10mm^2 sensor through a 4mm-wide lens, where "1080p" is more of a recording convention than a faithful description of the resolving power of the system, I can see how 34Mbps might look as good as whatever you see uncompressed.


>If you’re can’t tell the difference between 20Mbps and 200Mbps you need glasses.

Can you provide examples?

https://www.screenshotcomparison.com is suited for this especially.


> especially on any kind of remotely high-entropy scene

Unfortunately my job is pretty boring. Most of my video chats are low entropy.


>That's insane. Like I mentioned in another comment, high-quality rips of 1080p movies tend to be 5 Mbps, and that's generally with a lot more detail and motion than what your camera is filming at home.

I would double that, for at least movies. Not sure if you are going to see an increase in quality in smartphone style cameras.


The family that can equip their kids with production-grade 4k video kit can probably afford 100Gbit business internet service to their house tbh.

A 4K UHD Netflix stream is ~20 Mbps. 1080p is usually about 5-6 Mbps, and 99% of people say it looks great and is all they want.

4K UHD is not needed for effective video chats for most business and personal use. And a video chat wouldn't even need the same bitrate as a movie stream, since it's a relatively static image and thus easy to compress.

Your image is typically a little square on the screen too (not the full display size). It is highly unlikely consumers will ever shell out for the camera equipment to create high quality images that need such bandwidth, even if such bandwidth becomes common.

Moore's law will maybe push this all forward in time, but what you describe is a total exaggeration of the current situation.


None of the video streaming software is set up for that, because nobody's internet can upload at that rate. The best I can do is a 1080p SLR ($350, once) + clicking the HD button in Zoom, and most of that is being carried by the better optical system. All the low frame rates, micro stutters and so on still exist, adding to Zoom fatigue.


I don't understand why everyone is supposing that an entire household should be fine with the ability to send at most a single video stream out. What if both my wife and I have separate calls we need to be on? Or my kids want to play Fortnite while I'm uploading data for work? 10 Mbps up is 1990s-era tech.


I don't think anyone suggests that. But if we compare "modern" asymmetrical connections such as 250/50, 500/100, 600/300, or 1000/100, then "one HD stream is sub-5 Mbps" still means an asymmetric connection fits lots of these streams!

Obviously a 100/10 or 25/5 connection isn't going to cut it. I think really the gist of the article is "You need enough bandwidth in both directions". That's it. If you have 100, 200, 500, or 1000 Mbps down with 100+ Mbps up, the exact split is less important. "Symmetry" doesn't matter; it's enough bandwidth in both directions that matters.

"Enough in both directions" and "symmetric" have been conflated because, for a part of history, only symmetric connections had enough upload bandwidth. With gigabit and multi-gigabit download speeds, there is less need for symmetry so long as you have a fast upload.
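
(Concretely, a sketch using the sub-5 Mbps-per-HD-stream figure above.)

    per_hd_stream_up = 5  # Mbps, rough upstream cost of one HD call
    for down, up in [(250, 50), (500, 100), (600, 300), (1000, 100)]:
        print(f"{down}/{up}: {up // per_hd_stream_up} simultaneous HD uploads")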


On a technical level most of the asymmetric techniques just carve up the spectrum/bandwidth (in MHz) to give more of it to downstream. Or timeslots or whatever.

I fully agree with the EFF that you need a decent upload, to support the applications of today and next few years. But to go fully symmetric actually means to lower the download in favour of upload, allocating exactly half of the spectrum to each.

So absolutely, once you reach a certain threshold I think most users are going to opt to carve up the spectrum in a way that favours download, just based on typical needs.


A dynamically allocated bandwidth allotment from a fixed budget would probably be a great improvement. Five separate Zoom calls? More upload! Netflix time? More download, less upload! 3am backup upload time? Mostly upload!
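
(Something like this, conceptually; a toy sketch only, not how real cable/fiber schedulers are actually implemented.)

    def split_budget(total_mbps, down_demand, up_demand):
        # Split a fixed capacity budget in proportion to current demand.
        demand = down_demand + up_demand
        if demand == 0:
            return total_mbps / 2, total_mbps / 2
        return (total_mbps * down_demand / demand,
                total_mbps * up_demand / demand)

    print(split_budget(1000, 900, 100))  # Netflix time: mostly download
    print(split_budget(1000, 50, 300))   # five Zoom calls: mostly upload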


BR is different because the encoder doesn’t have to be real-time. It’s also just medium quality. 1080p60 264 or 265 encoded at 200+Mbps is what you get out of decent video cameras.


I believe such high bitrate for 1080p is I-frame only. It's useful for video editing, but not suitable for this context.


Many mid range video cameras support “long GOP” which means P-frames. They still spit out however many hundreds of Mbps you ask for.


You can be conservative and assume that a live-encoding will require double the bitrate for the same quality. So for very high quality 15 Mbps you need 30.


60fps looks disgusting though.


I agree it looks weird for cinema, but I prefer it for video chats. It feels more authentic.


No. Even Blu-rays don't have that. A high-quality rip is somewhere around 20 Mbps, and that's on the high end with H.264.


> which isn't anytime soon...

I mean, it depends what you mean by ‘soon’. Infra subsidised today will largely still be there in 20 years.

To be honest, I’m shocked that they’re considering subsidising anything other than fibre at this point.


It's a chicken and egg issue. Webcams are crap partly because nobody has a good upload bandwidth to make it matter. Video software uploads at low bandwidth because everyone's connection is crap, etc, etc.

Also, more than one person typically lives in a household. Imagine 3 to 5 Zoom calls going at 4 Mbps simultaneously. That's up to 20 Mbps right off the bat, which oversaturates most people's upload, so Zoom goes conservative by default.

Or imagine remote learning where you could see the pupil's laptop screen, their desk, and their face simultaneously, so you can help them with their issue quickly, at 30-60fps HD+ per stream. Then do that for 3 kids + 2 adults and now you need 60 Mbps. For the adults, simultaneous face, desk & slides would be a form of seamless whiteboarding during a presentation.

Many applications are limited by shitty internet.
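
(The numbers in those scenarios add up quickly; a sketch using the 4 Mbps-per-stream figure above.)

    per_stream_up = 4  # Mbps per upstream video feed, as above
    print(5 * per_stream_up)      # 20 Mbps: five single-camera calls at once
    print(5 * 3 * per_stream_up)  # 60 Mbps: five people, three feeds each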


In practice it's not that great. I hosted meetings on Zoom with 4-5 Mbps up (based on my tests and my recollection, since it's been a year-ish). It was fine until my wife connected separately. So 0.6 Mbps is optimistic... or for people with very high pain thresholds.

I always mentally equate low upload limits with "the ISP wants to make sure you don't run a server". Or "this internet connection is just a way to watch TV". Which I find very sad. You can do a lot more than just consume other people's creative work on the internet. Some creative things don't need much upstream bandwidth (cough cough github) but there are also a lot of things (e.g. sharing videos of your craft projects) that are much more pleasant with more upload speed.


It is worth looking at the obstacles typical end users face.

In my experience, the #1 issue is poor WiFi deployment. Then, users of cable broadband use Ookla for speed tests instead of Fast - Ookla is scammed by Comcast and Frontier.

What if you want fiber on your block? In California, Comcast and AT&T will not touch the poles, to protect their incumbency, and PG&E, who owns the poles, lies and says it's the ISP's problem that the poles aren't ready for fiber. Based on my canvassing of fiber users in the Bay Area, pole improvements came from the cities' commissions forcing PG&E to do the maintenance work it is supposed to do anyway.

The EFF does not identify the real antagonist here: public utility companies.


We won't start doing 8K 3D video until it's actually possible to do so.



