
I'm torn; I really want the article to be true, and for everyone to realise what a huge mistake we've been making all along... but, as the legions of us who don't adjust for gamma demonstrate, ignoring it doesn't make the world end?!



Most people working in graphics-related industries probably have no idea about this, and things mostly work. If you get a dark band around something, you can tweak it until it looks right, or at least good enough.

I'm troubleshooting a problem right now where two separate applications blend images together and produce different results. Both results have been shipped to clients for years, so fixing it is more a matter of "tidying things up."

But doing it right the first time, or at least being aware of the issue, helps things look right more often and lets you do more complex things without constantly having to tweak the results.
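
To make the kind of discrepancy concrete (a toy example with hypothetical pixel values, using the standard sRGB encode formula): averaging black and white directly on sRGB code values gives mid-grey, while averaging in linear light and re-encoding gives something much brighter.

    # Average an sRGB black (0.0) and white (1.0) pixel two ways. Black and
    # white are fixed points of the sRGB curve, so decoding them is trivial.
    naive   = (0.0 + 1.0) / 2                       # average of code values: 0.5 -> 128
    lin_avg = (0.0 + 1.0) / 2                       # average in linear light: 0.5
    correct = 1.055 * lin_avg ** (1 / 2.4) - 0.055  # re-encode to sRGB: ~0.735 -> ~188
    print(round(naive * 255), round(correct * 255)) # 128 vs 188

Two applications disagreeing on which of those two averages to compute would produce exactly this kind of mismatch.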


The world wouldn't end without computers either, or in fact, without humans, so this is a moot point... :)

As I mentioned in the article, you can get away with working directly in sRGB in some cases (general image processing, non-photorealistic rendering), but in other cases it's a must (physically accurate photorealistic rendering). But ignoring it will always produce wrong results. You might actually like those results, but the important point I was trying to get across was to be aware of these potential issues; then you may ignore them at your own peril.


Ignoring the color space is hardly a world-ender for most applications, but it has an effect on quality that ranges from subtle to drastic depending on what you're doing. For example, arcade games emulated in MAME often have a washed-out look by default because the monitor gamma isn't included (though it can be added through user customization).

One thing the article misses is a process or checklist to discover your requirements for color spaces. Characterizing it as a gamma-only problem isn't entirely correct since we also have completely different models of color (e.g. LAB, HSV, YUV) that make trade-offs for different applications. So, something like:

1. Input data: if it contains gamma information or specifies a color space, use that; else assume sRGB.

2. Output device: is gamma correction done for you, and can you access information about the device's capabilities and lighting situation? This can inform viewing conditions. For example, if your smartphone has a light sensor, it can be used to adjust gamma as well as backlighting to achieve a near-linear perceptual response. Most apps wouldn't consider doing this, of course. If the output is a file or stream, determine the most likely use cases and convert as necessary.

3. Internal processing: for each image or signal you process, determine its input color space and the ideal color space to run the processing in. Then decide which approximation is an appropriate trade-off for your application (since many color space conversions are compute-intensive), and implement conversions as necessary. For many situations, gamma-corrected sRGB is "good enough", hence its emphasis in the article; a sketch of such a pipeline follows below.
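
As a rough sketch of steps 1-3 for the common case, using numpy and Pillow (everything here is illustrative: the file names and the 50/50 blend are made up, untagged input is assumed to be sRGB-encoded, and a real pipeline would convert through any embedded ICC profile rather than just noting it):

    import numpy as np
    from PIL import Image

    def srgb_to_linear(c):  # c: float array in [0, 1], sRGB-encoded
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    def linear_to_srgb(c):  # inverse transfer curve, back to display encoding
        return np.where(c <= 0.0031308, 12.92 * c, 1.055 * c ** (1 / 2.4) - 0.055)

    def load_linear(path):
        im = Image.open(path).convert("RGB")
        # Step 1: im.info.get("icc_profile") tells you whether the file is
        # tagged; untagged files are assumed sRGB here.
        return srgb_to_linear(np.asarray(im, np.float32) / 255.0)

    # Step 3: do the processing (here, a 50/50 blend) in linear light,
    # then re-encode once on the way out (step 2's concern).
    blended = 0.5 * load_linear("a.png") + 0.5 * load_linear("b.png")
    out = np.round(255.0 * linear_to_srgb(blended)).astype(np.uint8)
    Image.fromarray(out).save("blended.png")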


> Ignoring the color space is hardly a world-ender for most applications, but it has an effect on quality that ranges from subtle to drastic depending on what you're doing. For example, arcade games emulated in MAME often have a washed-out look by default because the monitor gamma isn't included (though it can be added through user customization).

Actually, no emulator does gamma-correct scaling, and when I've tried adding it in the past, a lot of games looked quite different and much darker. So different that I don't think most players, who grew up on emus instead of the real thing, would actually accept it.

It was hard enough getting them to accept correct aspect ratios, since NES pixels weren't square either.


The problem with gamma-correctness in emulators is that you ALSO have to emulate limited nominal ranges (digitally, 16-235 for Y, 16-240 for UV) on top of doing NTSC or PAL coloring correctly.

Since doing limited ranges on sRGB looks so weird (mainly, 16 is surprisingly bright for being black), some people gamma-correct for 1.8 instead of 2.2... which is a hilarious value when you realize that it's the gamma of classic Macs.
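
(Schematically, expanding those nominal ranges back to full range is just a linear remap. A naive sketch with hypothetical helper names, ignoring clamping and the fact that real video math scales chroma about its 128 neutral point:)

    # Naive expansion of BT.601 limited-range code values to full range.
    def expand_luma(y):
        return (y - 16) * 255.0 / 219.0    # Y: 16-235 -> 0-255

    def expand_chroma(c):
        return (c - 16) * 255.0 / 224.0    # Cb/Cr: 16-240 -> 0-255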


I feel pretty good about the "unfiltered" display on an emulator being the original RGB. Seems like adding that 16-235 NTSC YUV conversion on top of 5-bit-per-channel SNES RGB would just make banding even worse, and not really look different?

I mean, I don't like running with the accurately rainbowy and scanliney composite cable mode filters myself, and I thought I cared.


It won't make the banding worse.

To do this naively: mathematically, 5 bits is 32 values (0-31). Multiply each value by 8 and you get 0, 8, ..., 248. Multiply each value by 7 instead and you get 0, 7, ..., 217; offset that by 16 and you get 16, 23, ..., 233, comfortably inside the nominal range. Either way you still have 32 distinct levels, so no extra banding.
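
(As a toy check of that mapping, assuming a straight multiply-and-offset:)

    # Map 5-bit levels (0-31) into the 16-235 luma range without losing any.
    levels  = range(32)
    full    = [8 * v for v in levels]       # 0, 8, ..., 248  (full range)
    limited = [16 + 7 * v for v in levels]  # 16, 23, ..., 233 (limited range)
    assert len(set(limited)) == 32          # still 32 distinct values: no extra banding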

Also, some emulators do offer chroma subsampling effects just to make certain things look more accurate.

That said, yes, all my emulation days involved "incorrect" SNES rendering: 5-bit-per-channel SNES RGB, with each step being +8, and no NTSC adjustment.


I didn't really miss it, because the article wasn't about colour spaces, just gamma :) It was meant as an introductory article for coders who have never heard about gamma correction. For those people, knowing to convert from sRGB to linear before processing and then back is sufficient in 90+% of cases. Those are some good points you listed in your comment, but I'd say they're more specialised areas that don't really fall into the everyday image processing category.


Nope, the world doesn't end, but scaled images look wrong, some more than others. For more examples see http://www.4p8.com/eric.brasseur/gamma.html



