
From the article:

"He concedes that there are challenges: improving the optical performance of the elements; suppressing spillover effects between different signals in the device; and honing the algorithms that calibrate the camera’s performance."

The "honing the algorithms" line is the complexity orbital-decay is referring to. These sensors need computationally expensive reconstruction to produce the kind of images we're used to getting from the traditional optics and sensors found in most consumer devices. The filtering and focusing work the lenses do still has to happen somewhere; these sensors essentially rely on complex math to replace the finely ground glass.
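To make that concrete: reconstruction here is usually posed as a linear inverse problem, y = A x, where A is the calibrated transfer matrix of the optical element. Below is a minimal NumPy sketch of one common approach (Tikhonov-regularised least squares); the sizes, matrix, and noise level are all made up for illustration, and real systems use iterative solvers rather than a direct solve.

    # Toy lensless-camera reconstruction, assuming a linear
    # forward model y = A @ x. All values here are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)

    n_scene, n_sensor = 64, 128                   # scene pixels, sensor readings
    A = rng.standard_normal((n_sensor, n_scene))  # calibrated transfer matrix
    x_true = rng.random(n_scene)                  # unknown scene
    y = A @ x_true + 0.01 * rng.standard_normal(n_sensor)  # noisy measurement

    # Tikhonov-regularised least squares:
    #   x_hat = argmin_x ||A x - y||^2 + lam * ||x||^2
    lam = 1e-2
    x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_scene), A.T @ y)

    print("relative error:",
          np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

The direct solve is O(n^3) in the number of scene pixels, which is where the "computationally expensive" part comes from at megapixel scale.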



Those algorithms are already very well understood from radar and sonar, though. It's not like he's starting from scratch.

And I'd take issue with the characterisation that they need "complex math to replace the finely ground glass" - what replaces the glass is the analogue photon-detection, delay, and amplification channel on the front end. My suspicion is that the only "complex math" happens in calculating the delays before the image is captured, not in the reconstruction afterwards (again, unless I've missed something unique about moving from GHz to THz).
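For what it's worth, the delay calculation in classic delay-and-sum beamforming (the radar/sonar technique this presumably inherits from) is cheap. A sketch, with a made-up uniform linear array and steering angle:

    # Steering delays for delay-and-sum beamforming on a uniform
    # linear array. Geometry and carrier are illustrative only.
    import numpy as np

    c = 3e8                    # propagation speed (m/s); ~1500 for sonar
    n_elems = 16               # number of array elements
    d = 0.5 * (c / 10e9)       # half-wavelength spacing at a 10 GHz carrier
    theta = np.deg2rad(30.0)   # desired steering angle

    # Per-element delays so wavefronts arriving from `theta` add
    # coherently. This is the precomputation done before capture;
    # capture itself is then just a delayed sum across elements.
    delays = np.arange(n_elems) * d * np.sin(theta) / c

    print(delays)  # seconds; at THz carriers these shrink to femtoseconds

The geometry is the same at optical frequencies; the hard part is realising femtosecond-scale delays in the analogue front end, not computing them.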



