
Did NYT have a look at the bokeh samples? They are not that good and in everyday use the flaws will show. Isn't 56mm portrait rather than telephoto?



> Did NYT have a look at the bokeh samples?

I believe the OP is "The New Yorker", not NYT.


That's roughly "normal" or "short tele". Definitely not the best focal length for portraits; that range starts around 70mm imho.


56mm is neither portrait nor telephoto.

50mm is the standard street shooter's focal length.


Yes, 50mm is a normal lens, but telephoto is a lens design. It doesn't have anything to do with focal length. Even some wide angle lenses have a telephoto design!

It is true that telephoto designs are usually found on long lenses, though; normal lenses for 35mm don't usually employ them.


> Isn't 56mm portrait rather than telephoto?

Telephoto is a lens design, not a characteristic of the focal length. Hell, some wide angle lenses have telephoto designs!


In addition, the bokeh feature only works on portraits.

BTW, since when is there such a thing as a "12 megapixel lens?"


> since when is there such a thing as a "12 megapixel lens?"

Lenses are not perfect and have limits on their resolution. You can often find charts showing the performance of a lens as part of a review.

http://www.whatdigitalcamera.com/x-archive/measuring-lens-re...
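For a rough sense of what "12 megapixel lens" could mean, here's a back-of-the-envelope calculation of the resolution a lens would have to deliver before a 12 MP sensor stops being the bottleneck. The sensor dimensions are an assumption (roughly a 1/3" phone sensor), and real MTF behaviour is more nuanced than a single Nyquist number:

    # Assumed sensor: ~1/3" active area, 4.8 mm wide, 4000 x 3000 px (12 MP).
    sensor_w_mm = 4.8
    px_w = 4000
    pixel_pitch_mm = sensor_w_mm / px_w        # 0.0012 mm = 1.2 um
    nyquist_lp_mm = 1 / (2 * pixel_pitch_mm)   # line pairs per mm
    print(f"lens must resolve ~{nyquist_lp_mm:.0f} lp/mm to match 12 MP")
    # ~417 lp/mm - a very demanding figure, which is why the lens rather
    # than the pixel count is often the limit on small sensors.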


> In addition, the bokeh feature only works on portraits.

That's the first I've heard of that. Can you back that up? Seems like it'd be easy enough to get it working on any image with depth to it, so it seems odd that they'd limit it.


While presenting the feature, Phil Schiller explained that it does face detection to find the foreground. In addition, the camera mode is called "Portrait" in the Camera app. While it is still speculation until next week, I think it's reasonable to be skeptical about its ability to produce bokeh on images with no detected faces. Presentation: https://youtu.be/NS0txu_Kzl8?t=1h12m25s
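To illustrate the kind of face-gating that description implies, here's a minimal sketch using OpenCV's stock face detector. Purely illustrative - nobody outside Apple knows the actual pipeline, and the blur step is a hypothetical placeholder:

    import cv2

    img = cv2.imread("photo.jpg")
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]  # treat the first face as the foreground anchor
        # ... estimate depth around (x, y, w, h) and blur everything else ...
    else:
        print("no face found; a face-gated Portrait mode would bail out here")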


> While it is still speculation until next week, I think it's reasonable to be skeptical about its ability to produce bokeh on images with no detected faces.

That shouldn't be a limitation in the technology though, just an implementation detail. Emulating DoF would work whether you have a face in the photo or not, it's just easier to lock on to a face as your primary focus.
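To make that concrete, a toy version of depth-driven blur needs nothing face-specific. A minimal sketch, assuming you already have a depth map normalised to [0, 1] (which Apple presumably derives from the dual cameras):

    import cv2
    import numpy as np

    def fake_bokeh(img, depth, focal_depth, max_blur=21):
        # Blend a sharp and a blurred copy of the image; the blur weight
        # grows with each pixel's distance from the chosen focal plane.
        blurred = cv2.GaussianBlur(img, (max_blur, max_blur), 0)
        weight = np.clip(np.abs(depth - focal_depth) * 4.0, 0.0, 1.0)
        weight = weight[..., None]  # broadcast over the colour channels
        return (img * (1 - weight) + blurred * weight).astype(img.dtype)

A face just hands you a reliable focal_depth for free; nothing above requires one.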


I'm thinking that if they still let you tap to select what to base the brightness on in this mode, they might be able to extend that to the focus so that it can work with non-face photos. Keep the part you tap on in focus (and anything else at that distance), blur the rest. Though that might not allow you to do independent brightness... Hmmm... Anyway, it's pure hope/speculation at this point.
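Following that speculation, extending the fake_bokeh sketch above to a tap is one line: use the depth under the tapped pixel as the focal plane (again hypothetical, not any known Apple API):

    def focus_on_tap(img, depth, tap_x, tap_y):
        # Keep whatever sits at the tapped pixel's depth sharp, blur the rest.
        return fake_bokeh(img, depth, focal_depth=depth[tap_y, tap_x])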


I'm not so sure.

I was thinking that this would be interesting for product photography but not sure whether it would work. For example, if I'm shooting a row of coffee cups from the side (diagonally), with the front cup in focus and the rest of the row gradually going out of focus.

[Edit]

The point being that Apple uses a face-detection algorithm to 'compute' the depth of field - would the same work for any random object? I'm not sure about this.


Stereo depth still has a lot of issues - e.g. if you're shooting a white wall or other featureless object. Not sure if they have better algos, but if you use an off-the-shelf stereo package from OpenCV or ROS there are lots of holes in the depth map even in ideal cases.

They probably focus on the portrait application because it has the fewest corner cases.
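For anyone who wants to see those holes first-hand, here's roughly what the off-the-shelf OpenCV route looks like (SGBM here; the image file names are placeholders). Textureless regions come back as invalid disparities:

    import cv2

    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
    stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64,
                                   blockSize=9)
    # compute() returns fixed-point disparities scaled by 16;
    # unmatched pixels come back as (minDisparity - 1) * 16.
    disparity = stereo.compute(left, right).astype("float32") / 16.0
    holes = (disparity < 0).mean()
    print(f"{holes:.0%} of pixels have no valid depth")

Shoot a white wall and that percentage gets ugly fast.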


Thanks. I don't watch the presentations anymore, so I hadn't heard about that. I think you're right that it's reasonable to be skeptical, but it may also just prove to be an implementation detail. But with the info we have now, yup, sounds like it's just for portraits.


Thanks for clarifying that. I was slightly irritated that the article name-dropped bokeh but then went on raving about the processor, never coming back to pick up the dangling reference. So the iPhone just blurs what it considers the background in post-processing? This sounds like it just might work if done right. I think I'd still wish for manual focus though :)


Given a perfect depth map there is no reason lens blur can't be replicated exactly. They probably just chose some bad photos for the demo, or possibly there's some placebo effect happening.

Any flaws ought to be in the depth-map generation, showing up as things being inappropriately blurred.



