
Watching a rodeo on CBS right now, I see the video bitrate ranging from around 500 kbps to over 20 Mbps. The bitrate does not seem to correlate with anything visible to me, such as the amount of motion in the picture or the video quality.


Apparently Puffer is actually a study testing streaming algorithms:

https://puffer.stanford.edu/terms/
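For context, Puffer is comparing adaptive-bitrate (ABR) algorithms, which pick the bitrate of each upcoming chunk from throughput and buffer estimates. A minimal buffer-based sketch of the idea (the ladder rates and thresholds here are made up for illustration, not Puffer's actual algorithm):

    # Toy buffer-based ABR: not Puffer's algorithm; ladder and thresholds invented.
    LADDER_KBPS = [500, 1200, 2500, 5000, 20000]   # available encodings

    def pick_bitrate(buffer_s: float, throughput_kbps: float) -> int:
        """Choose the next chunk's bitrate from buffer level and measured throughput."""
        if buffer_s < 5:
            return LADDER_KBPS[0]      # buffer nearly empty: play it safe
        if buffer_s > 20:
            return LADDER_KBPS[-1]     # big cushion: maximize quality
        # Otherwise take the highest rate the network sustains, with 20% margin.
        safe = 0.8 * throughput_kbps
        candidates = [r for r in LADDER_KBPS if r <= safe]
        return max(candidates) if candidates else LADDER_KBPS[0]

    print(pick_bitrate(buffer_s=12.0, throughput_kbps=6000))   # -> 2500

An algorithm like this would explain bitrate swings that track the buffer and the network rather than anything visible on screen.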


20 Mbps is raw DTV over the air; that's how much bandwidth my tuner reports while watching TV from my HDHomeRun. When I transcode with Plex or something similar, my bandwidth can get as low as 500 kbps but usually averages 1 Mbps. Below 500 kbps, artifacts become noticeable. When I throw more CPU at the x264 encode, I can get 384 kbps comfortably (at the expense of superheating that box).
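For anyone who wants to reproduce that tradeoff, here's roughly what it looks like as an ffmpeg call driven from Python (a minimal sketch; the file names are placeholders, and the flags are standard ffmpeg/libx264 options). A slower preset spends more CPU per frame to reach the same quality at a lower bitrate:

    # Transcode an HDHomeRun MPEG-TS capture down to a small x264 stream.
    # "input.ts" / "output.mp4" are placeholder names.
    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "input.ts",
        "-c:v", "libx264",
        "-preset", "veryslow",   # more CPU per frame -> better compression
        "-b:v", "384k",          # target video bitrate; try 1M for a cooler box
        "-c:a", "aac", "-b:a", "128k",
        "output.mp4",
    ], check=True)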


Bursty buffering, perhaps? The player may fetch each segment in a quick burst and then idle, which would make the measured bitrate swing. I haven't tried it myself.
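To illustrate with made-up numbers: if a 4-second segment encoded at 2.5 Mbps is fetched over a 20 Mbps link, the download takes 0.5 s and the player then idles for 3.5 s, so a short-window bitrate meter swings between ~20 Mbps and ~0 even though the stream's average rate is steady:

    # Made-up numbers showing why bursty segment fetches make an
    # instantaneous bitrate meter swing while the average stays steady.
    SEGMENT_S = 4.0       # seconds of video per segment
    ENCODE_MBPS = 2.5     # encoded bitrate of the stream
    LINK_MBPS = 20.0      # link speed during the burst

    seg_megabits = SEGMENT_S * ENCODE_MBPS    # 10 Mb per segment
    burst_s = seg_megabits / LINK_MBPS        # 0.5 s downloading...
    idle_s = SEGMENT_S - burst_s              # ...then 3.5 s idle
    print(f"{LINK_MBPS} Mbps for {burst_s}s, ~0 for {idle_s}s, "
          f"average {seg_megabits / SEGMENT_S} Mbps")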



