
I don't think there's a different level of 'good enough' required here - it's just that we're so early in the development of AR UI that we don't yet know what good enough will look like. The iPhone was the first touch interface with latency low enough to make it feel like you were touching and sliding the images around behind the screen. Making those scroll and pinch-to-zoom interactions feel right was critical to its success. Certainly right now these HoloLens interactions are not as slick and gratifying as flick-scrolling was in the original mobile Safari. And Apple and other manufacturers have since spent years optimizing screens to make the image seem closer to the surface of the glass, and experimenting, with varying success, with haptics, pressure sensing, and so on.



But people had phones before the iPhone, and they were sluggish as well. Jumping from keypad to touchscreen was a jump large enough to be botchable (as a few attempts before the iPhone had clearly shown), but tiny nonetheless compared to the jump to a large HMD (from what exactly, btw?). There is no way to casually use the HoloLens the way you might occasionally use a laptop sitting at the side of a paper-centric desk without going fully electronic; it's pretty much an all-or-nothing commitment. Nobody would strap a contraption like that to their head to use it for just a fraction of the tasks at hand.



