
With all of the processing wizardry that goes into phone cameras these days, it gets me thinking: will we reach a point where cameras are digital paintbrushes that construct a human-pleasing image out of hints from the real world, rather than tools that capture the real world?

There are probably plenty of people who will pay for a phone camera that makes them look less like themselves, as long as it makes them look more attractive in subtle ways.




Cameras have never "captured the real world" in the sense of approximating what we see.

We never see static images of the world; we have a rather narrow field of vision and move our eyes constantly. So whatever we perceive of the world is basically a function of our attention. Remember those videos where volunteers perform some complicated collective task and fail to see a gorilla walking right between them.

Photos, in contrast, are static (the photographer has to hint at the attention-point structure he thinks reflects what attracted his eye in the first place, through framing and lighting) and two-dimensional (so the photographer has to select a focus structure that, again, hints at the attention-point structure).

This is why photos of crowds (think of the masses on the streets after the earthquake in Mexico) are either art-level pieces that basically employ the pictorial language of classical paintings, or don't seem to represent the story at all.


Photos have never been about capturing the "real world". Photos have always been about what the photographer wants you to see.

And this isn't new; in fact, it's a phenomenon that's been happening ever since the Neanderthals and their cave paintings.


That sounds like a deep comment, but that's about it.

I'm sure photography has served many uses, and capturing the real world is certainly one of them.


It’s just as deep as the parent.

When have photos ever been about accuracy? When they were black and white? Or when Polaroid had problems rendering furniture and darker skin tones?


I think the parent's comment is insightful. Modern consumer cameras and monitors purposefully skew the color profile toward more saturated colors and higher contrast than what you see with the naked eye. Not due to a technical limitation, but because it makes the image more pleasing to the eye.

I don't take a lot of photographs, so I don't care too much about cameras, but not being able to display a decent image on a modern TV without it messing the picture up on purpose really annoys me. Even disabling all the post-processing doesn't really work anymore. It's like those CD masters with all the sound pushed near saturation to make it sound louder: a cheap marketing gimmick.
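As a minimal sketch of the kind of "vivid" boost described above (assuming Python with OpenCV; the filename and the 1.3x factor are invented for illustration):

    import cv2
    import numpy as np

    img = cv2.imread("photo.jpg")  # hypothetical input frame
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV).astype(np.float32)
    hsv[..., 1] = np.clip(hsv[..., 1] * 1.3, 0, 255)  # boost saturation ~30%
    cv2.imwrite("vivid.jpg", cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR))

The output looks "punchier" than what the sensor recorded, which is exactly the kind of deliberate deviation from reality being complained about here.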


Every time a friend of mine bought a phone in a country like China or Korea, the camera app had built-in filters to smooth the face and prettify the eyes. Oftentimes these were also the default setting. So I think we are already at that stage in some parts of the world.


Samsung did this with the Note 3, and I suspect other flagships around that time.

The rear-facing camera would default to their 'Auto' mode; switch to the front-facing one, and you had to opt out of something like a 'Beauty' mode with tons of post-processing for selfies.


Selfies are inherently at a disadvantage because they're shot at very close range with a wide-angle lens, which exaggerates depth. Portrait photography is usually done with longer lenses from further away, which you can't fake computationally so easily. So I think some of the selfie post-processing is trying to make up for that.
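To put rough numbers on that (the distances here are made up for illustration): apparent size scales as 1/distance, so the few centimetres of depth within a face matter a lot at arm's length and hardly at all at portrait distance.

    # selfie: camera ~30 cm from the nose, ~34 cm from the ears
    print(0.34 / 0.30)  # ~1.13 -> the nose renders ~13% oversized
    # portrait: the same face shot from ~2 m away
    print(2.04 / 2.00)  # ~1.02 -> the distortion all but disappears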


Also most people are "ugly" up close. As in, we see micro features that distract from the macro.


Here's a conspiracy theory: what if some (or most) phones do this at a level noticeable only subconsciously, without the user knowing it's happening? Maybe as some sort of marketing ploy to increase positive word of mouth.

"Hey $FRIEND, this new phone is great! I can't believe how good it makes me look. You can really see my eyes. Check it out!"


Like size inflation in clothes? "I can fit into a size x in these pants, but x+2 in those. The size x pants look way better."


In some ways, it’s been like that for a few years now. Modern mobile phone camera systems will take multiple individual pictures, align them for any movement, average out temporal noise if there was no motion, create an HDR image from multiple frames captured at different exposure settings, detect different sub-scenes (for example an indoor room and an outdoor scene viewed through a window) and apply individual exposure corrections, identify and isolate faces and apply pleasing lighting corrections, and much more. A big portion of the silicon area of the iPhone SoC is dedicated purely to this image processing pipeline.
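For a flavor of what that merge step does, here is a minimal sketch using OpenCV's stock HDR tooling (filenames and exposure times are hypothetical); a real phone pipeline is vastly more elaborate, but the align-merge-tonemap skeleton is the same:

    import cv2
    import numpy as np

    frames = [cv2.imread(f) for f in ("short.jpg", "mid.jpg", "long.jpg")]
    times = np.array([1/1000, 1/250, 1/60], dtype=np.float32)  # exposure times in seconds

    cv2.createAlignMTB().process(frames, frames)           # compensate hand shake between frames
    hdr = cv2.createMergeDebevec().process(frames, times)  # fuse into one radiance map
    ldr = cv2.createTonemap(gamma=2.2).process(hdr)        # compress back to a displayable range
    cv2.imwrite("merged.jpg", np.clip(ldr * 255, 0, 255).astype(np.uint8))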


Exactly. Image compensation and the concoction of an HDR image are the visual equivalent of a loudness enhanced audio file.

Loudness enhancement reduces the dynamic range of the source, and HDR tone mapping likewise throws away information so that other information can be more easily perceived by a human viewer.
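To make the analogy concrete, the simplest global tone-mapping curve, Reinhard's L/(1+L), squashes an unbounded luminance range into [0, 1), much the way a limiter squashes audio peaks toward full scale (the values below are arbitrary):

    import numpy as np

    L = np.array([0.01, 0.5, 2.0, 50.0])  # scene luminances spanning ~5000:1
    print(L / (1.0 + L))                  # -> [0.0099 0.3333 0.6667 0.9804], only ~99:1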


> will we reach a point where cameras are digital paintbrushes that construct a human-pleasing image out of hints from the real world, rather than tools that capture the real world?

We're all stuck in Plato's cave. Aperture is just one thing you can use to change how a photo looks, and this has always been under the control of photographers:

https://qph.ec.quoracdn.net/main-qimg-e17ac8adb8b21eaa43d409...


I think we're already well past that point to be honest. One of the things people laud about the Google Pixel cameras is their HDR mode, and that's really not just capturing the real world as it is.


I hate to break this to you, but most photos on social media have some kind of enhancement, whether that's makeup, digital contorting, or simply taking many, many photos until the one that gets posted looks nothing like the subject. The new Apple Portrait mode will be another example of this.


I had a point-and-shoot years ago with a sunset mode that took really cool photos of sunsets. A little too cool. So I tried it well before sunset, and it produced all the vibrant colors with the sun halfway up the sky.


I think we are almost there already. The difference between my burner phone and a Pixel XL (the previous 'best' camera-phone) is such that I find the ground beneath my feet, as viewed through the lens of the Pixel XL, much more beautiful than in real life.

My camera points outwards rather than in, so sadly selfies are not helped in my personal situation.





