
Given that this is a realtime 3D scene squeezed into audio bandwidth, could a variation be done in reverse: a worn Kinect feeding an audio-waveform rendering of the current surroundings into headphones, which the brain learns to translate back into 3D? That sounds like a possible hack for giving some form of vision to the blind.
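
For the curious, here is a minimal sketch of what that pipeline could look like, assuming depth frames arrive as a 2D numpy array in metres (the Kinect capture itself is omitted); the mapping, column position -> time, row -> pitch, nearness -> loudness, is roughly what Peter Meijer's "The vOICe" does for camera images:

    import numpy as np

    SAMPLE_RATE = 44100
    SWEEP_SECONDS = 1.0               # one left-to-right scan per second
    FREQ_LO, FREQ_HI = 200.0, 4000.0  # pitch range assigned to image rows

    def depth_frame_to_audio(depth):
        """Map a (rows x cols) depth frame in metres to a mono audio sweep.

        Columns are scanned left to right over SWEEP_SECONDS; each row
        gets a fixed pitch (top = high, bottom = low), and nearer
        surfaces play louder so obstacles stand out.
        """
        rows, cols = depth.shape
        samples_per_col = int(SAMPLE_RATE * SWEEP_SECONDS / cols)
        freqs = np.geomspace(FREQ_HI, FREQ_LO, rows)  # top row = highest pitch
        t = np.arange(samples_per_col) / SAMPLE_RATE
        out = np.zeros(cols * samples_per_col)
        for c in range(cols):
            # Loudness: near objects (small depth) loud, far ones quiet.
            amp = np.clip(1.0 - depth[:, c] / 5.0, 0.0, 1.0)
            tones = amp[:, None] * np.sin(2 * np.pi * freqs[:, None] * t)
            out[c * samples_per_col:(c + 1) * samples_per_col] = tones.sum(axis=0)
        peak = np.abs(out).max()
        return out / peak if peak > 0 else out

Stereo panning (splitting left/right by column) would make the scan direction easier to follow; it's omitted here for brevity.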



I'm not so sure. The brain's ability to process and interpret visual information and form an intricate understanding of 3D space seems to me to be a "hard-wired" process.

Sure, you could convert information encoding the 3D space around a user and convey it to them in any number of ways, but that wouldn't leverage the brain's very specialized mechanism for processing and interpreting visual information (which I, rightly or wrongly, conceptualize as a mix of hardware, some analogue of a video coprocessor for the brain, and software, the algorithms the brain uses to interpret the data).



The first link is interesting: it does say they saw signs of activity in the same brain region in a blind echolocating person as in a sighted person who is actively looking at something, but it is hardly a smoking gun.

Perhaps, if you define "3D" information as something very primitive, to the point of "is there something directly in front of me", but nothing close to as rich as even the quite crude 3D information being conveyed on the hacked oscilloscope outlined by the OP.
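
To make that contrast concrete, the primitive end of the spectrum is something like this toy sketch (the single distance reading is assumed to come from the centre pixel of a depth camera; everything else about the scene is discarded):

    import numpy as np

    def proximity_tone(distance_m, sample_rate=44100, duration=0.25):
        """Crude 'is something directly in front of me' cue.

        Nearer = higher pitch; the entire scene collapses to one number,
        which is exactly the poverty of information described above.
        """
        freq = np.interp(distance_m, [0.3, 4.0], [2000.0, 200.0])
        t = np.arange(int(sample_rate * duration)) / sample_rate
        return np.sin(2 * np.pi * freq * t)

Compare that single number per frame with the sweep sketch upthread, which at least preserves a full depth image per scan.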

I do see technology eventually giving blind people some form of "sight" back, but not for quite some time, perhaps only after the kind of workable neural interface that currently lives solely in sci-fi.



