
Thanks for breaking it down!

It seems like this was a necessity to allow for a larger range of representable colors in a single byte. Is this system largely a carryover from more constrained hardware days, or is it still appropriate for modern hardware? Like, we could represent colors with a few more bytes and toss out the gamma encoding to get back to the same level of control. Or am I overlooking more fundamental reasons why this is a good abstraction?




Still appropriate for modern hardware. Despite hardware getting faster, display resolutions and refresh rates have also increased, eating up the memory bandwidth gains. Especially so at VR resolutions.

Modern rendering engines will use a mixture of 8-bit, 10-bit, and 16-bit per-channel buffers. The final 'lit' color buffer is where you will generally write fp16 linear HDR color values (8 bytes per pixel for RGBA), if you can spare the performance hit, as this is where you need the highest range.
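
A rough back-of-envelope sketch (Python, my own numbers rather than anything from the comment) of why fp16 tends to be reserved for the final lit buffer: per-frame memory for a single color target at a few illustrative resolutions and formats. The resolution and format choices here are assumptions for illustration only.

    # Hypothetical bytes-per-frame comparison for one color render target.
    # Formats and resolutions are illustrative assumptions.

    FORMATS = {
        "RGBA8 (sRGB, 8-bit/channel)": 4,   # bytes per pixel
        "RGB10A2 (10-bit/channel)":    4,
        "RGBA16F (fp16 linear HDR)":   8,
    }

    RESOLUTIONS = {
        "1080p":               1920 * 1080,
        "4K":                  3840 * 2160,
        "VR (2 eyes, example)": 2 * 2064 * 2208,  # illustrative per-eye resolution
    }

    for res_name, pixels in RESOLUTIONS.items():
        for fmt_name, bpp in FORMATS.items():
            mib = pixels * bpp / (1024 ** 2)
            print(f"{res_name:>22} {fmt_name:<30} {mib:8.1f} MiB/frame")

Multiply that by several intermediate targets and the refresh rate and the bandwidth cost of "just use more bytes everywhere" becomes clear.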


It's a good abstraction because most of the time you care about perceived brightness. With gamma encoding, 255 looks roughly twice as bright as 127. You are going to need to deal with that perceptual power law at some point no matter what representation you use, so you may as well pick the efficient one. It's probably not a good tradeoff to make everyone deal with a color scale that doesn't look right and is inefficient, just to make intensity operations linear so that unwary programmers don't trip.
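
To make the "255 vs 127" point concrete, here is a small Python sketch of the standard sRGB transfer functions (the published piecewise curve); the example values and the averaging illustration are mine, not the commenter's.

    # Standard sRGB <-> linear transfer functions (inputs/outputs in [0, 1]).

    def srgb_to_linear(c):
        """Map an encoded sRGB value to linear light."""
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(c):
        """Map a linear-light value back to encoded sRGB."""
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    # Encoded 127/255 sits near the perceptual midpoint of encoded 255/255,
    # but its underlying linear intensity is only ~21% of full white:
    print(srgb_to_linear(127 / 255))   # ~0.212
    print(srgb_to_linear(255 / 255))   # 1.0

    # The classic trip-up is doing intensity math in the wrong space:
    mid_encoded = (0 + 255) / 2 / 255   # midpoint of black and white, encoded
    print(srgb_to_linear(mid_encoded))  # ~0.212 linear, not 0.5
    print(linear_to_srgb(0.5) * 255)    # ~188 is the encoded value of 50% grey

This is the kind of operation (blending, averaging, lighting) where you convert to linear, do the math, and convert back, while still storing and transmitting the gamma-encoded values for efficiency.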



