Waiting for the load to drop so I can try to get a preorder!
Question: How can you render the "other side" of the hand at the 51+ second mark? If this is indeed possible, that's quite a remarkable technology you have.
I suspect it's an illusion and that it does not represent their true 3D point cloud. It's probably a skinned 3D mesh of a hand moving in sync with the detected hand.
Their webpage suggests they can handle multiple devices connected to the one PC, so I was wondering whether they had another sensor out of shot above the scene pointing downwards.
Actually - scrap that. A better hunch is that it's not structured light at all, but actually an electric field sensor. See this Quora answer (disclaimer, by someone who "knows shit all about this"): http://www.quora.com/Leap-Motion/What-is-the-technology-behi...
What is the API of the dev kit like? Does it give the programmer events translated into hand kinematics (like, 'right index finger pointing forward') or is it just a cloud of points?
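For context, the two API styles this question contrasts could be sketched roughly like this (entirely hypothetical types and names for illustration, not the actual Leap Motion SDK):

```python
# Hypothetical sketch contrasting the two API styles in question.
# None of these names come from the real SDK; they are invented.
from dataclasses import dataclass
from typing import List, Tuple

Point3D = Tuple[float, float, float]

# Style 1: raw point cloud -- the app does its own hand tracking.
@dataclass
class PointCloudFrame:
    points: List[Point3D]  # unlabeled 3D samples per frame

# Style 2: kinematic events -- the device reports interpreted poses.
@dataclass
class FingerEvent:
    hand: str           # "left" or "right"
    finger: str         # e.g. "index"
    direction: Point3D  # unit vector, e.g. pointing forward

def describe(event: FingerEvent) -> str:
    """Turn a kinematic event into a readable description."""
    return f"{event.hand} {event.finger} finger pointing {event.direction}"

event = FingerEvent(hand="right", finger="index", direction=(0.0, 0.0, 1.0))
print(describe(event))
```

The point-cloud style gives more flexibility but pushes the hard tracking problem onto the developer; the event style is far easier to build against.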
How will you license the technology for others to reproduce? Will you license aggressively to maximize profit, or will you be permissive and partner with other manufacturers to make this truly ubiquitous?
Please give us some technical details to satisfy our curiosity. Perhaps I missed it on the website, but I didn't notice any mention of how it functions (IR? Sound waves?), what sort of range of distances it works in, etc.
Pre-orders ($70) only ship domestically (for now), around winter.
20,000 dev kits are being made. We want to ensure this tech becomes ubiquitous.
We're getting slammed with launch response. But if you guys have questions, we'll try to answer them here shortly.
- Chris, Community Builder