> If you can’t tell the difference between 20 Mbps and 200 Mbps you need glasses.
I think you're the one imagining things here.
Like I said in an earlier comment, I record h.264 with the "ultrafast" preset at CRF 14. Every guide I've seen says that's well past the point where the output is indistinguishable from the raw footage, and my own eyes agree after extensive side-by-side comparisons. I only go as low as 14 because it's overkill; there's simply no visible difference. And it averages 34 Mbps for the kind of stuff I shoot, which, as I said, is already overkill.
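For concreteness, this is roughly the command I mean (file names are placeholders; I'm assuming ffmpeg with libx264, since that's where the "ultrafast" preset and CRF settings live):

```python
import subprocess

# A minimal sketch of the encode described above. The file names are made up;
# the part that matters is libx264 with -preset ultrafast and -crf 14,
# with the audio copied through untouched. Requires ffmpeg on PATH.
subprocess.run([
    "ffmpeg", "-i", "capture.mkv",
    "-c:v", "libx264", "-preset", "ultrafast", "-crf", "14",
    "-c:a", "copy",
    "encoded.mp4",
], check=True)
```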
200 Mbps is absurdly high, even for real-time encoding on a crappy CPU.
What are you filming? It sounds like you’re streaming a video of your face; how are you capturing it? If your video is blasted with shot noise from a 10 mm^2 sensor through a 4 mm-wide lens, where “1080p” is more of a recording convention than a faithful description of the system's resolving power, I can see how 34 Mbps might look as good as whatever you see uncompressed.
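For a sense of scale, here's a rough bits-per-pixel comparison of the two bitrates; I'm assuming 1080p at 60 fps, since you didn't say the frame rate:

```python
# Back-of-envelope bits per pixel at the two bitrates in question.
# Assumes 1920x1080 at 60 fps -- the thread never states the actual
# frame rate or resolution, so treat this as a rough sketch only.
width, height, fps = 1920, 1080, 60
pixels_per_second = width * height * fps  # ~124 million

for mbps in (34, 200):
    bits_per_pixel = mbps * 1_000_000 / pixels_per_second
    print(f"{mbps} Mbps ≈ {bits_per_pixel:.2f} bits per pixel")
```

Whether ~0.27 bits per pixel is enough depends entirely on how much real detail the source actually has, which is exactly the point about the sensor and lens.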