
Text is still a large part of the interfaces we use.

We are all highly trained to read text; it seems basic, but in practice it is quite abstract.

Text is still best read on a flat surface.

The great innovation I can see with this new Apple device is eye tracking. They have not invented it, but they might have perfected it enough to be usable.

Eyes could be better than a mouse.



Eye tracking is almost certainly more accurate and faster, to the point where we have made an entire genre out of seeing who can move the mouse to where their eyes are already looking the fastest: shooter games.


Interesting point — if you no longer need to use a joystick to aim your weapon, how will controllers evolve? Will the second joystick be used for some other function, or will it be replaced by a different type of input method?

It would be funny if controllers evolved to be more like the single-joystick models that we had decades ago, with the joystick on the left and rows of buttons on the right. History doesn't repeat itself, but perhaps it'll rhyme?


One joystick still has to exist, since there are many video games where the character is barely where you're looking. Think of hack-and-slash games with a top-down camera (the recent Diablo 4): most of the focus is on where the attacks are going.

In top-down bullet hell shooters the player has to aim and shoot in one direction while walking in another (potentially the opposite), so one stick is still needed.


Agreed — I'm thinking we'd have just one joystick, to control movement. The second joystick could evolve into some new interface, or devolve into a row of buttons.


PSVR2 already has eye tracking. And unlike the Apple Vision it also has controllers with a stick. Perhaps there are already such games which let you aim with your eyes (assuming you are shooting at something), while you move around with one stick, and the second stick isn't used at all.


Yeah, if their eye tracking plus foveated rendering works as advertised, it could be a huge step forward. I'm really curious how responsive the gesture controls will be too; it was really cool seeing the finger pinches(?) being used as an input method. I wonder if it's designed specifically for that one gesture, or if it's built out to track any arbitrary hand gesture accurately. And I wonder what the language/API for describing hand gestures would even look like.
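To make that last question concrete, here's a toy sketch in Swift of what a declarative gesture API could look like. Every type and name below is made up for illustration (it's not Apple's actual framework); the idea is just "a gesture is a named set of constraints over tracked joint positions":

    // Purely hypothetical types, not a real visionOS API; just sketching
    // what a declarative gesture description could look like.
    import Foundation

    // Hypothetical joint identifiers on one tracked hand.
    enum Joint { case thumbTip, indexTip, middleTip, wrist }

    // A gesture is a named set of constraints over joint positions (metres).
    struct GestureDescriptor {
        let name: String
        let constraints: [([Joint: SIMD3<Float>]) -> Bool]

        func matches(_ joints: [Joint: SIMD3<Float>]) -> Bool {
            constraints.allSatisfy { $0(joints) }
        }
    }

    // Example: a "pinch" is thumb tip and index tip within ~1.5 cm.
    let pinch = GestureDescriptor(
        name: "pinch",
        constraints: [
            { joints in
                guard let thumb = joints[.thumbTip],
                      let index = joints[.indexTip] else { return false }
                let d = thumb - index
                return (d.x * d.x + d.y * d.y + d.z * d.z).squareRoot() < 0.015
            }
        ]
    )

    // One fabricated frame of tracking data to show how matching is evaluated.
    let frame: [Joint: SIMD3<Float>] = [
        .thumbTip: SIMD3<Float>(0.10, 0.00, -0.30),
        .indexTip: SIMD3<Float>(0.11, 0.00, -0.30),
        .wrist:    SIMD3<Float>(0.05, -0.10, -0.28),
    ]
    print(pinch.matches(frame))  // true: the tips are about 1 cm apart

A real system would presumably also have to deal with gesture phases (began/changed/ended), confidence values, and per-user calibration, but I'd guess the matching core looks roughly like this.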


I don’t doubt them. The PSVR2 has both, and they supposedly work very well.

It lacks other features, of course, and must be tethered to a PlayStation 5.

But eye tracking + foveated rendering seems like it’s going to become table stakes in the next few years.
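For anyone who hasn't looked into it: foveated rendering just means spending full shading resolution only where the eye tracker says you're looking, and progressively less in the periphery where your eye can't resolve detail anyway. A rough sketch of the idea (the falloff radii and rates below are made-up numbers, not any vendor's actual implementation):

    // Toy illustration of gaze-based foveation; all numbers are invented.
    import Foundation

    // Hypothetical gaze point in normalized screen coordinates (0...1).
    struct GazePoint { let x: Double; let y: Double }

    // Shading-rate multiplier for a screen tile: 1.0 = full resolution
    // at the fovea, dropping to 0.25 in the far periphery.
    func shadingRate(tileX: Double, tileY: Double, gaze: GazePoint) -> Double {
        let dx = tileX - gaze.x
        let dy = tileY - gaze.y
        let distance = (dx * dx + dy * dy).squareRoot()
        if distance < 0.1 { return 1.0 }   // foveal region: full detail
        if distance < 0.3 { return 0.5 }   // near periphery: half resolution
        return 0.25                        // far periphery: quarter resolution
    }

    // The user is looking slightly left of centre.
    let gaze = GazePoint(x: 0.4, y: 0.5)
    print(shadingRate(tileX: 0.42, tileY: 0.50, gaze: gaze))  // 1.0
    print(shadingRate(tileX: 0.90, tileY: 0.90, gaze: gaze))  // 0.25

The payoff is that most of the frame can be shaded at a fraction of the cost, which is exactly why it pairs so naturally with good eye tracking.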



