
How well does the quality of a videoconference scale with the number of call participants?


That's not a comparable question. Peer-to-peer bandwidth requirements for a single user increase linearly with the number of participants, while a client-server model is essentially flat regardless of the number of participants.
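A quick back-of-the-envelope sketch of that difference (the 1 Mbps per-stream figure is an assumption for illustration, not a measured number):

    # Upload bandwidth for one client, full-mesh P2P vs. client-server.
    STREAM_MBPS = 1.0  # assumed per-stream bitrate

    def p2p_upload_mbps(participants):
        # Full mesh: you send a separate copy of your stream to everyone else.
        return STREAM_MBPS * (participants - 1)

    def client_server_upload_mbps(participants):
        # Client-server: one stream to the server, whatever the call size.
        return STREAM_MBPS

    for n in (2, 5, 15):
        print(n, p2p_upload_mbps(n), client_server_upload_mbps(n))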


In Jami, for video conferences the clients are not all directly connected to each other. That can't work: you can't ask everybody to stream to, and receive a video stream from, every other participant. Nor can you ask each client to mix all the streams together itself.

You can see video conferences in Jami more like a mesh between all participants of the call. Some nodes mix the streams together (for now it's the device that receives the calls that does the merging, so it needs both a good CPU and good bandwidth), and the other nodes only receive the mixed stream (and send their own stream).

So, for the host you can count 1 Mbps per participant (to get a good H.264 video stream), while for a receiver the CPU and bandwidth are the same whether the conference has 15 participants or 2.
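If I'm reading that topology right, the math works out as below (a sketch; the 1 Mbps figure is from the comment above, the rest is my interpretation):

    # Bandwidth sketch for the described topology: the host mixes,
    # every other participant exchanges a single stream with the host.
    STREAM_MBPS = 1.0  # ~1 Mbps per good H.264 stream, per the comment

    def host_mbps(participants):
        others = participants - 1
        download = STREAM_MBPS * others  # one stream in from each participant
        upload = STREAM_MBPS * others    # the mixed stream out to each participant
        return download, upload

    def participant_mbps():
        # Constant regardless of call size: one stream up, one mixed stream down.
        return STREAM_MBPS, STREAM_MBPS

    print(host_mbps(15))       # (14.0, 14.0) - host needs CPU and bandwidth
    print(participant_mbps())  # (1.0, 1.0) - same for 2 or 15 participants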

We've hosted a lot of conferences with 15 people on an X220 (and a lot on a P51, but that's a bit more powerful).


> Peer-to-peer bandwidth requirements for a single user increases linearly with the number of participants while a client-server model is essentially flat regardless of the number of participants.

True if the server integrates all incoming streams into one. However, in that case, CPU requirements on the server also grow linearly.

An interesting solution would be one where each client downscales the resolution of its stream to the actual space it will occupy on the receiving computers' screens. That way the per-user bandwidth would remain roughly constant, as the smaller uploads would offset the overhead of receiving multiple incoming streams (each incoming stream already downscaled by its sender and thus requiring only a fraction of the bandwidth).
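A sketch of that idea, assuming a simple near-square tile grid and bitrate roughly proportional to pixel count (both assumptions mine):

    import math

    # Each sender scales its stream to the tile it will occupy in an
    # N-participant grid; assume bitrate scales with pixel count.
    SCREEN_W, SCREEN_H = 1920, 1080
    FULLSCREEN_MBPS = 1.0  # assumed bitrate for a full-screen stream

    def tile_size(participants):
        cols = math.ceil(math.sqrt(participants))
        rows = math.ceil(participants / cols)
        return SCREEN_W // cols, SCREEN_H // rows

    def total_download_mbps(participants):
        w, h = tile_size(participants)
        per_stream = FULLSCREEN_MBPS * (w * h) / (SCREEN_W * SCREEN_H)
        return per_stream * (participants - 1)

    for n in (2, 4, 9, 16):
        print(n, tile_size(n), round(total_download_mbps(n), 2))
    # Total download stays bounded near 1 Mbps as the call grows.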


That can't possibly be true. Even if the server integrates all inbound streams into one, doing so takes a non-negligible amount of additional time per user, surely.


The person you replied to was talking about bandwidth, not CPU load.

Imagine a user who only has enough bandwidth for a single stream. If there were a server integrating all the other users' streams into one, that user would be able to watch without much problem.

Unless the clients are built in such a way that they downscale the resolution to the part of the screen it would occupy on target computers (based on the number of participants) before sending video data. Is any software doing this currently?

> Even if the server integrates all inbound streams into one, doing so takes a non-negligible amount of additional time per user, surely.

Integration has to happen somewhere, whether on a server or on each user's computer.
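For illustration, "integration" here just means compositing the decoded frames into one image; a minimal sketch (tile layout and frame sizes are made up):

    import numpy as np

    # Composite N decoded frames into one grid image - the work that has to
    # happen somewhere, server-side or on each viewer's machine.
    def mix(frames, cols):
        rows = -(-len(frames) // cols)  # ceiling division
        h, w, c = frames[0].shape
        canvas = np.zeros((rows * h, cols * w, c), dtype=frames[0].dtype)
        for i, frame in enumerate(frames):
            r, col = divmod(i, cols)
            canvas[r * h:(r + 1) * h, col * w:(col + 1) * w] = frame
        return canvas

    frames = [np.full((120, 160, 3), i * 60, dtype=np.uint8) for i in range(4)]
    print(mix(frames, cols=2).shape)  # (240, 320, 3)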


> Unless the clients are built in such a way that they downscale the resolution to the part of the screen it would occupy on target computers (based on the number of participants) before sending video data. Is any software doing this currently?

It's the obvious thing to do - I can't imagine why they wouldn't.


Well, both questions concern the quality of videoconferences in relation to the number of participants, so I would say the questions are very comparable.



