
I can definitely see the difference too, but it doesn't feel more "real" to me -- in fact, it falls into a kind of uncanny valley between video-realistic and near-realistic. I don't much care for it.



A lot of HDTVs have motion interpolation, colour enhancement, and noise reduction enabled, all of which absolutely ruin movies.

I suggest turning these off (in my case, all of them were on by default, and some of them had to be manually toggled off at every startup for a month before the TV magically decided to default correctly).


Perhaps you watched it on a system with the horrible high-framerate interpolation switched on? Not everyone is sensitive to the difference between 24fps (traditional film) and 30fps (looks like Handycam output), or to the effects of TV/processor motion enhancement.
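
For anyone curious what that smoothing actually does, here's a toy sketch (assuming numpy; this is naive frame blending, not the motion-compensated interpolation real TVs actually use) of how synthetic in-between frames get manufactured:

  # Minimal sketch of naive frame interpolation by linear blending.
  # Real "motion smoothing" is motion-compensated, but even this crude
  # version shows how invented in-between frames change the cadence
  # of 24fps footage.
  import numpy as np

  def blend_interpolate(frames: np.ndarray) -> np.ndarray:
      """Double the frame rate by inserting the average of each
      adjacent pair of frames (frames: [N, H, W, C] uint8)."""
      frames = frames.astype(np.float32)
      mids = (frames[:-1] + frames[1:]) / 2.0   # synthetic in-between frames
      out = np.empty((frames.shape[0] * 2 - 1, *frames.shape[1:]), dtype=np.float32)
      out[0::2] = frames                        # originals at even indices
      out[1::2] = mids                          # blended frames in between
      return out.astype(np.uint8)

  # Example: 24 tiny "frames" of 4x4 RGB video become 47 frames (~48fps look).
  clip = np.random.randint(0, 256, size=(24, 4, 4, 3), dtype=np.uint8)
  print(blend_interpolate(clip).shape)  # (47, 4, 4, 3)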


This is because traditional film cameras captured far less detail than your eye can see, which led to compensating techniques such as heavy movie makeup and bright lighting.

Now that cameras are much better, those same compensation techniques make the image look fake: the lighting is too bright, the makeup too heavy.


I'm not convinced that's the whole story. Even old movies converted to digital fall into the uncanny valley. Try watching The Godfather on Blu-ray, for example. It looks staged.


You might want to check the settings on your Blu-ray playback system. The Godfather Blu-ray is one of the more "filmic" transfers I've seen: it preserves the grain of the film print rather than attempting to "clean it up" with digital noise reduction, and it should play back at the original 24fps frame rate on any system that supports it.



